JP3749227B2 - Stereoscopic image processing method and apparatus


Info

Publication number: JP3749227B2 (application JP2003003761A)
Authority: JP (Japan)
Prior art keywords: parallax, stereoscopic image, image, stereoscopic, object
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Application number: JP2003003761A
Other languages: Japanese (ja)
Other versions: JP2004007395A
Inventor: 増谷 健 (Ken Mashitani)
Original assignee: 三洋電機株式会社 (Sanyo Electric Co., Ltd.)
Priority: JP2002087493; priority also claimed from PCT/JP2003/003791 (WO2003081921A1) and US10/949,528 (US8369607B2)
Application JP2003003761A filed by Sanyo Electric Co., Ltd.; published as JP2004007395A and, upon grant, as JP3749227B2

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a stereoscopic image processing technique, and more particularly to a method and apparatus for generating or displaying a stereoscopic image based on a parallax image.
[0002]
[Prior art]
In recent years, the inadequacy of network infrastructure has been regarded as a problem. With the transition to broadband now under way, however, it is rather the shortage of kinds and numbers of content that make effective use of the wide bandwidth that is becoming conspicuous. Video has always been the most important means of expression, but most efforts to date have concerned improvements in display quality and data compression ratio; efforts to broaden the range of expression itself seem to have lagged behind.
[0003]
Under such circumstances, stereoscopic video display (hereinafter simply “stereoscopic display”) has been studied from various angles and put into practical use in limited markets, such as theaters and applications using special display devices. R&D in this direction is expected to accelerate with the aim of providing more realistic content, bringing the day when individual users enjoy stereoscopic display at home.
[0004]
Since stereoscopic display is expected to spread in the future, display forms unimaginable with current display devices have also been proposed. For example, a technique for displaying a selected partial image of a two-dimensional image as a three-dimensional image has been disclosed (see Patent Document 1).
[0005]
[Patent Document 1]
Japanese Patent Laid-Open No. 11-39507
[0006]
[Problems to be solved by the invention]
Under such circumstances, several problems with stereoscopic display have been pointed out. For example, it is difficult to optimize the parallax that produces the stereoscopic effect. Stereoscopic display does not actually project a three-dimensional object; it merely presents images shifted by some number of pixels to the left and right eyes, and giving this artificial stereoscopic effect a natural feel is by no means easy.
[0007]
Excessive parallax can also be a problem, and some viewers of stereoscopic images (hereinafter also simply called “users”) complain of mild discomfort. Many factors are of course involved, including not only the stereoscopic display itself but also a mismatch between the displayed scene and the viewer's sense of the surrounding situation. Empirically, however, such problems are readily observed when the parallax is too large, in other words, when the stereoscopic effect is too strong.
[0008]
The above concerns human physiology; apart from it, there are technical factors that hinder the spread of stereoscopic video content and applications. Stereoscopic vision is realized by parallax, and when parallax is expressed as the pixel shift between the left and right images, the same stereoscopic video may or may not be properly viewable in stereo depending on the hardware of the display device. If the parallax expressing distance exceeds the interocular distance, stereoscopic viewing is theoretically impossible. Today, as the resolution and screen size of display devices diversify across PCs (personal computers), television receivers, portable devices, and so on, creating content optimally suited to stereoscopic display on all this varied hardware is difficult; more accurately, no methodology for doing so has been provided.
[0009]
Moreover, even if such a methodology were provided, it would be difficult to expect general programmers to understand it and use it to create content and applications.
[0010]
The technique disclosed in the above document was proposed to solve such problems, but to spread stereoscopic display in the future it is necessary to propose further techniques, to accumulate and link new technologies, and to apply them to products.
[0011]
The present invention has been made against this background, and one object is to propose a new expression method for stereoscopic display. Another object is to generate or display a stereoscopic image suited to the user even when the image to be displayed or the display device changes. Yet another object is to let the user adjust the stereoscopic effect of an ongoing stereoscopic display with a simple operation. Yet another object is to reduce the burden on programmers creating content or applications capable of appropriate stereoscopic display. Still another object is to provide technology that realizes appropriate stereoscopic display as a business model.
[0012]
[Means for Solving the Problems]
The insight that forms the basis of the present invention is to first separate the appropriate parallax from factors such as the display device hardware and the distance between the user and the display device (hereinafter collectively referred to as “hardware”). That is, the appropriate parallax is described once in a general, hardware-independent form by expressing it in terms of the camera interval and the optical axis crossing position described later. “Hardware-independent” means that, in principle, no reading of hardware information specific to the display device is required; once this general description exists, generating parallax images based on the appropriate parallax realizes the desired stereoscopic display.
[0013]
By providing the acquisition of appropriate parallax, and the control that realizes it, in the form of a library, general programmers can call this library and achieve stereoscopic display without being aware of the principles of complex stereoscopic vision or the programming they entail.
[0014]
Of the various aspects of the present invention, the first group is based on a technique for obtaining an appropriate parallax based on a user response. This technique can be used for “initial setting” of parallax by the user, and once the appropriate parallax is acquired in the apparatus, the appropriate parallax is realized even when another image is displayed. However, this technique is not limited to the initial setting, but is also used for “manual adjustment” in which the user appropriately adjusts the parallax of the image being displayed. The following relates to the first group.
[0015]
One aspect of the present invention relates to a stereoscopic image processing apparatus that includes an instruction acquisition unit which acquires a user response indicating whether a stereoscopic image, displayed based on a plurality of viewpoint images corresponding to different parallaxes, is acceptable, and a parallax specifying unit which specifies the appropriate parallax for that user based on the acquired response.
[0016]
The instruction acquisition unit is provided, for example, as a GUI (graphical user interface; the same applies hereinafter) and first displays images while varying the parallax between viewpoint images. When the stereoscopic effect reaches the user's liking, the user signals this by operating a button or the like.
[0017]
A “stereoscopic image” is an image displayed with a stereoscopic effect, and the substance of the data is a “parallax image” in which a plurality of images are given parallax. A parallax image is generally a set of a plurality of two-dimensional images. Each image constituting the parallax image is a “viewpoint image” having a corresponding viewpoint. That is, a parallax image is composed of a plurality of viewpoint images, and when it is displayed, it is displayed as a stereoscopic image. The display of a stereoscopic image is also simply referred to as “stereoscopic display”.
[0018]
“Parallax” is a parameter for producing a stereoscopic effect. Various definitions are possible, but as an example it can be expressed as the shift amount, in pixels, of the points representing the same scene point between viewpoint images. Unless otherwise specified, this definition is followed hereinafter in this specification.
[0019]
The range of the appropriate parallax may be specified; in that case, the two ends of the range are called the “limit parallax”. The appropriate parallax may also be specified as the maximum value allowable as the parallax of a near object, described later.
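As a concrete illustration (not taken from the patent), the limit parallax acquired from the user could be held as a small record like the following Python sketch; the class name, field names, and pixel units are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class AppropriateParallax:
    """Hypothetical record of a user's appropriate parallax range.

    Both limits are non-negative pixel shifts, since the text treats
    near and far parallax as separate non-negative quantities.
    """
    max_near_px: float  # limit parallax on the near side
    max_far_px: float   # limit parallax on the far side

    def accepts(self, near_px: float, far_px: float) -> bool:
        # True if a stereo pair's measured extremes stay within the range.
        return near_px <= self.max_near_px and far_px <= self.max_far_px


limits = AppropriateParallax(max_near_px=12.0, max_far_px=20.0)
print(limits.accepts(near_px=8.0, far_px=25.0))  # False: far side exceeds the limit
```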
[0020]
The stereoscopic image processing apparatus of the present invention may further include a parallax control unit that performs processing so that the specified appropriate parallax is realized even when another image is displayed. When that other image is a stereoscopic image generated starting from three-dimensional data, the parallax control unit may determine, according to the appropriate parallax, the plurality of viewpoints from which the stereoscopic image is generated; more specifically, it may determine the distance between the viewpoints and the crossing position of the optical axes along which the object is viewed from those viewpoints. An example of these processes is performed by the camera arrangement determination unit described later. If they are performed in real time, optimal stereoscopic display is realized at all times.
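The following sketch shows, under a simple pinhole model with the optical axes crossing at distance D, one way a camera interval E and crossing distance D might be chosen so that the nearest and farthest scene points respect the limit parallax. It is a geometric illustration under assumed formulas, not the patent's camera arrangement determination procedure, and all names are hypothetical.

```python
def parallax_px(E, D, z, screen_w_world, screen_w_px):
    """Pixel parallax of a point at depth z for cameras separated by E whose
    optical axes cross at distance D (pinhole model; negative = near side)."""
    return E * (z - D) / z * screen_w_px / screen_w_world


def choose_camera_setup(z_near, z_far, max_near_px, max_far_px,
                        screen_w_world, screen_w_px):
    """Pick a camera interval E and crossing distance D so that points at
    z_near and z_far respect the limit parallax. Assumes z_near < z_far."""
    # Place the optical axis crossing plane between nearest and farthest point.
    D = 2.0 * z_near * z_far / (z_near + z_far)
    # Parallax grows linearly with E, so scale from a unit camera interval.
    near_unit = abs(parallax_px(1.0, D, z_near, screen_w_world, screen_w_px))
    far_unit = abs(parallax_px(1.0, D, z_far, screen_w_world, screen_w_px))
    E = min(max_near_px / near_unit, max_far_px / far_unit)
    return E, D


E, D = choose_camera_setup(z_near=0.5, z_far=4.0, max_near_px=12.0,
                           max_far_px=20.0, screen_w_world=0.4, screen_w_px=1024)
```

Note that in this model the far parallax is bounded by E as the depth goes to infinity, which matches the text's observation that far parallax beyond the interocular distance makes stereoscopic viewing impossible.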
[0021]
The parallax control unit may perform control so that proper parallax is achieved for a predetermined basic three-dimensional space to be displayed. An example of this processing is performed by a projection processing unit described later.
[0022]
The parallax control unit may perform control so that the appropriate parallax is realized with respect to the coordinates of the closest object and the coordinates of the farthest object in the three-dimensional space. An example of this processing is performed by a projection processing unit described later. The object may be static.
[0023]
“Near placement” refers to a state in which an object is given parallax such that it is perceived in front of the plane containing the point where the lines of sight, that is, the optical axes, of cameras placed at the respective viewpoints cross (this point is hereinafter also called the “optical axis crossing position”, and the plane the “optical axis crossing plane”). Conversely, “far placement” refers to a state in which an object is given parallax such that it is perceived behind the optical axis crossing plane. The larger the parallax of a near object, the closer it is perceived to the user; the larger the parallax of a far object, the farther from the user it appears. Unless otherwise specified, parallax is defined as a non-negative value whose sign does not flip between near and far placement, and both near and far parallax are zero at the optical axis crossing plane.
[0024]
For those portions of the displayed object or space that have no parallax, the optical axis crossing plane coincides with the screen surface of the display device. This is because, for pixels to which no parallax is attached, the lines of sight from the left and right eyes reach the same position on the screen plane, that is, they intersect there.
[0025]
When the other image is a set of two-dimensional images to which parallax has already been given, the parallax control unit may determine the horizontal shift amount of those images according to the appropriate parallax. In this aspect, the input for stereoscopic display is not generated with full freedom starting from three-dimensional data; it consists of already generated parallax images whose parallax is fixed. In this case it is impossible to go back to the original three-dimensional space, or to the real space where the images were shot, and redraw or re-shoot with changed camera positions. The parallax is therefore adjusted by horizontally shifting the viewpoint images constituting the parallax image, or the pixels within them.
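A minimal sketch of this horizontal-shift adjustment for a fixed stereo pair, using NumPy arrays as images; the black padding at the vacated edge corresponds to the missing-pixel problem handled later by the image edge adjustment unit. The function names are illustrative.

```python
import numpy as np

def shift_horizontal(img: np.ndarray, shift_px: int) -> np.ndarray:
    """Shift a viewpoint image horizontally (positive = right), filling
    the vacated columns with black."""
    out = np.zeros_like(img)
    if shift_px > 0:
        out[:, shift_px:] = img[:, :-shift_px]
    elif shift_px < 0:
        out[:, :shift_px] = img[:, -shift_px:]
    else:
        out[:] = img
    return out

def adjust_pair(left: np.ndarray, right: np.ndarray, delta_px: int):
    """Slide the two views apart (delta_px > 0) or together (delta_px < 0),
    which uniformly offsets the parallax of every scene point."""
    half = delta_px // 2
    return shift_horizontal(left, -half), shift_horizontal(right, delta_px - half)
```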
[0026]
When the other image is a planar image to which depth information is attached (hereinafter also called an “image with depth information”), the parallax control unit may adjust the depth according to the appropriate parallax. An example of this processing is performed by the two-dimensional image generation unit of the third stereoscopic image processing apparatus described later.
[0027]
The stereoscopic image processing apparatus may further include a parallax holding unit that records the appropriate parallax, and the parallax control unit may read the appropriate parallax at a predetermined timing, for example when the apparatus is started or when a stereoscopic image processing function it incorporates is started, and perform its processing with that value as an initial value. That is, “startup” may be meant in a hardware sense or a software sense. According to this aspect, once the user has determined the appropriate parallax, automatic adjustment of the stereoscopic effect is realized thereafter. This is the function called “initial setting of appropriate parallax”.
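The parallax holding unit could be as simple as a settings file read at startup, as in this sketch; the file path and JSON layout are assumptions, not anything specified by the patent.

```python
import json
import os

SETTINGS_PATH = os.path.expanduser("~/.stereo_settings.json")  # hypothetical location

def save_appropriate_parallax(max_near_px: float, max_far_px: float) -> None:
    """Record the user's limit parallax once the initial setting is done."""
    with open(SETTINGS_PATH, "w") as f:
        json.dump({"max_near_px": max_near_px, "max_far_px": max_far_px}, f)

def load_appropriate_parallax():
    """Called at (hardware or software) startup; None means no initial
    setting exists yet, so the setting GUI should be run instead."""
    try:
        with open(SETTINGS_PATH) as f:
            d = json.load(f)
        return d["max_near_px"], d["max_far_px"]
    except FileNotFoundError:
        return None
```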
[0028]
Another aspect of the present invention relates to a stereoscopic image processing method comprising the steps of displaying to a user a plurality of stereoscopic images with different parallaxes, and identifying the appropriate parallax for that user based on the user's responses as to whether each displayed stereoscopic image is acceptable.
[0029]
Still another embodiment of the present invention also relates to a stereoscopic image processing method comprising the steps of acquiring, as the appropriate parallax, the parallax the user accepts for a stereoscopic image, and processing an image before display so that the acquired appropriate parallax is realized. Here, “acquisition” may be an active process of specification or a process of reading from the parallax holding unit or the like.
[0030]
If each of these steps is implemented as a function of a stereoscopic display library, and the functions of this library can be called from multiple programs, programmers are spared from having to write their programs with the stereoscopic display hardware in mind, which is effective.
[0031]
The second group of the present invention is based on a technique for adjusting parallax according to user instructions. This technique can be used for “manual adjustment” of parallax, letting the user change the stereoscopic effect of the displayed image at will. It is not limited to manual adjustment, however; it can also read the appropriate parallax described above and automatically adjust an image's parallax when that image is displayed stereoscopically. The difference from the automatic adjustment of the first group is that the automatic adjustment of the second group acts on two-dimensional parallax images or images with depth information, using the group of techniques described here. The following relates to the second group.
[0032]
One aspect of the present invention relates to a stereoscopic image processing apparatus that includes an instruction acquisition unit which acquires a user instruction concerning a stereoscopic image displayed from a plurality of viewpoint images, and a parallax control unit which changes the amount of parallax between those viewpoint images according to the acquired instruction. An example of this processing, shown in FIG. 45 described later, is a typical case of “manual adjustment”. Convenience is high if the user's instruction is given through a simple GUI such as a button operation.
[0033]
Another aspect of the present invention also relates to a stereoscopic image processing apparatus that includes a parallax amount detection unit which detects the first parallax amount produced when a stereoscopic image is displayed from a plurality of viewpoint images, and a parallax control unit which changes the parallax amount between those viewpoint images so that the first parallax amount falls within the range of a second parallax amount, namely the parallax amount the user accepts. This is a typical case of “automatic adjustment”, and the appropriate parallax described above can serve as the second parallax amount. An example of this processing is shown in FIG. 46, described later.
[0034]
The parallax amount detection unit may detect the maximum value of the first parallax amount, and the parallax control unit may change the parallax amount between the viewpoint images so that this maximum does not exceed the maximum of the second parallax amount. The intent is to protect the maximum parallax, that is, the limit parallax, so as to avoid an excessive stereoscopic effect caused by excessive parallax. The maximum here may be taken as the maximum on the near side.
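For fixed parallax images, one plausible way to protect the near-side limit is to compute how far the views must slide so the detected maximum no longer exceeds it; since a uniform shift trades near parallax for far parallax, the far-side limit should be checked afterwards. A sketch with assumed names:

```python
def shift_to_protect_near_limit(max_near_detected_px: float,
                                max_near_limit_px: float) -> int:
    """Pixels to slide the views apart so the detected near-side maximum
    falls back inside the limit parallax (0 if already within range)."""
    excess = max_near_detected_px - max_near_limit_px
    return max(0, int(round(excess)))

print(shift_to_protect_near_limit(18.0, 12.0))  # 6: slide the views 6 px apart
```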
[0035]
The parallax amount detection unit may detect the first parallax amount by computing corresponding-point matching between the viewpoint images, or by reading a first parallax amount recorded in advance in the header of any of the viewpoint images. An example of these processes is shown in FIG. 47, described later.
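Block matching itself is a well-known operation; the sketch below finds the horizontal disparity of one block by exhaustive SAD search along the same row, which is one way the matching could be realized (grayscale NumPy images assumed).

```python
import numpy as np

def block_disparity(left: np.ndarray, right: np.ndarray,
                    y: int, x: int, block: int = 8, search: int = 32) -> int:
    """Horizontal disparity of the block at (y, x) in `left`, found by
    minimizing the sum of absolute differences along the same row of `right`."""
    ref = left[y:y + block, x:x + block].astype(np.int32)
    best_sad, best_d = None, 0
    for d in range(-search, search + 1):
        xs = x + d
        if xs < 0 or xs + block > right.shape[1]:
            continue  # candidate block would fall outside the image
        cand = right[y:y + block, xs:xs + block].astype(np.int32)
        sad = int(np.abs(ref - cand).sum())
        if best_sad is None or sad < best_sad:
            best_sad, best_d = sad, d
    return best_d
```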
[0036]
The parallax control unit may change the amount of parallax between the viewpoint images by shifting the positions at which the viewpoint images are combined. This is common to the processes shown in the figures cited above. The shift of the combining position is horizontal or vertical and is applied in units of pixels or to the entire image. When the input is an image with depth information, the parallax control unit may change the amount of parallax by adjusting the depth information.
[0037]
Another aspect of the present invention relates to a stereoscopic image processing method comprising the steps of acquiring a user instruction concerning a stereoscopic image displayed based on a plurality of viewpoint images, and changing the amount of parallax between those viewpoint images according to the instruction.
[0038]
Another aspect of the present invention also relates to a stereoscopic image processing method comprising the steps of detecting the first parallax amount produced when a stereoscopic image is displayed from a plurality of viewpoint images, and changing the parallax amount between those viewpoint images so that it falls within the range of the second parallax amount, namely the parallax amount the user accepts.
[0039]
These steps may be implemented as functions of a stereoscopic display library, and the functions of this library may be called as functions from a plurality of programs.
[0040]
The third group of the present invention is based on a technique for correcting parallax according to position within the image. This “automatic correction” reduces the user's sense of discomfort with, or rejection of, stereoscopic display, and can be combined with the techniques of the first and second groups. In general, technical and physiological problems are pointed out in stereoscopic display: for example, the closer to the edge of the image, the more easily the multiple viewpoint images are seen as mutually shifted, producing a sense of incongruity. The third group reduces this problem by weakening the parallax near the image edges or by adjusting the parallax so that objects there move from the near side toward the far side. The following relates to the third group.
[0041]
One aspect of the present invention relates to a stereoscopic image processing apparatus that includes a parallax control unit which corrects the parallax between a plurality of viewpoint images used to display a stereoscopic image, and a map holding unit which holds a correction map the parallax control unit refers to during this processing; the correction map describes how parallax should be corrected according to position within the viewpoint image. Examples of correction maps include a parallax correction map and a distance-sense correction map.
[0042]
For example, the parallax control unit reduces the parallax at the periphery of the viewpoint images, or changes the parallax so that objects there are perceived farther from the user. The parallax control unit may change the parallax by selectively processing only some of the viewpoint images.
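One simple realization of such a correction map, assumed here rather than taken from the patent, is an attenuation factor that is 1 in the image interior and falls to 0 toward every border; multiplying a disparity map by it weakens the parallax near the edges.

```python
import numpy as np

def edge_attenuation_map(h: int, w: int, margin: float = 0.15) -> np.ndarray:
    """Map that is 1.0 in the interior and ramps linearly down to 0.0
    within `margin` (as a fraction of each dimension) of the borders."""
    ys = np.minimum(np.arange(h), np.arange(h)[::-1]) / (h * margin)
    xs = np.minimum(np.arange(w), np.arange(w)[::-1]) / (w * margin)
    return np.clip(np.minimum.outer(ys, xs), 0.0, 1.0)

# Usage: corrected = disparity_map * edge_attenuation_map(*disparity_map.shape)
```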
[0043]
When the viewpoint images are generated from three-dimensional data, that is, when they can be regenerated by going back to the three-dimensional space, the parallax control unit may change the parallax by controlling the camera parameters used to generate them. Camera parameters include the distance between the left and right cameras, the angle at which the object is viewed from the cameras, and the optical axis crossing position.
[0044]
Likewise, when the viewpoint images are generated from three-dimensional data, the parallax control unit may change the parallax by distorting the three-dimensional space itself, for example in the world coordinate system, when generating them. When the viewpoint images are generated from an image with depth information, on the other hand, the parallax control unit may change the parallax by manipulating the depth information.
[0045]
Another aspect of the present invention relates to a stereoscopic image processing method comprising the steps of acquiring a plurality of viewpoint images for displaying a stereoscopic image, and changing the parallax between the acquired viewpoint images according to position within them. These steps may be implemented as functions of a stereoscopic display library whose functions can be called from multiple programs.
[0046]
The fourth group of the present invention relates to a technology that provides the first to third groups and related functions as a software library, reduces the burden on programmers and users, and promotes the spread of stereoscopic image display applications. The following relates to the fourth group.
[0047]
One aspect of the present invention relates to a stereoscopic image processing method in which information related to stereoscopic image display is held in memory and shared among a plurality of different programs; when any of these programs displays a stereoscopic image, it determines the state of the image to be output by referring to the held information. An example of such a state is how much parallax the parallax image is to be given.
[0048]
The “held information” may include any of the format of the image input to the stereoscopic image display device, the display order of the viewpoint images, and the amount of parallax between the viewpoint images. Besides sharing the held information, processing specific to stereoscopic image display may also be shared among the programs. One example of “processing specific to stereoscopic image display” is the processing that determines the held information. Other examples include processing for a graphical user interface used to determine the appropriate parallax, display processing for a parallax adjustment screen that helps realize the appropriate parallax state, processing for detecting and tracking the user's head position, and processing for displaying an image used to adjust the stereoscopic display device.
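In spirit, the shared information amounts to a single state store consulted by every client, as in this sketch. A module-level store like this shares state only within one process; sharing across separate programs, as the text describes, would need shared memory or a service, and all names here are assumptions.

```python
# stereo_lib.py: illustrative store for display information shared by clients.
_shared_state = {
    "image_format": None,       # e.g. "parallax_image", "depth_image", "3d_data"
    "view_order": None,         # display order of the viewpoint images
    "limit_parallax_px": None,  # the user's appropriate parallax
}

def set_display_info(**kwargs) -> None:
    """Record determined information so every caller sees the same values."""
    _shared_state.update(kwargs)

def get_display_info(key: str):
    """Programs consult this before deciding the state of the output image."""
    return _shared_state[key]
```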
[0049]
Another aspect of the present invention relates to a stereoscopic image processing apparatus that includes a stereoscopic effect adjusting unit which provides the user with a graphical user interface for adjusting the stereoscopic effect of a stereoscopically displayed image, and a parallax control unit which, following the user's adjustment of the stereoscopic effect, generates parallax images in a form that protects the limit parallax.
[0050]
The apparatus may further include an information detection unit that acquires the information to be referred to in order to optimize the stereoscopic image display, and a conversion unit that converts the format of the parallax images generated by the parallax control unit according to the acquired information.
[0051]
The parallax control unit may control the camera parameters based on the three-dimensional data to generate a parallax image while protecting the limit parallax, or may control the depth of the image with depth information to generate the parallax image. Alternatively, the parallax image may be generated after determining the horizontal shift amounts of the plurality of two-dimensional images having parallax.
[0052]
The fifth group of the present invention relates to applications and business models that use the stereoscopic image processing techniques described above or related techniques. The software library of the fourth group can be used for this purpose. The following relates to the fifth group.
[0053]
One aspect of the present invention relates to a stereoscopic image processing method in which the appropriate parallax for the stereoscopic display of parallax images is converted into an expression format that does not depend on display device hardware, and the appropriate parallax in that format is circulated among different display devices.
[0054]
Another aspect of the present invention also relates to a stereoscopic image processing method comprising the steps of reading the appropriate parallax of a user, acquired by a first display device, into a second display device; adjusting the parallax between the parallax images on the second display device according to that appropriate parallax; and outputting the adjusted parallax images from the second display device. For example, the first display device is the device the user normally uses, and the second display device is a device installed elsewhere. The method may further comprise the steps of reading information about the hardware of the first display device into the second display device, and correcting, on the second display device, the parallax of the parallax images already adjusted according to the appropriate parallax, based on the hardware information of the first display device thus read and the hardware information of the second display device.
[0055]
Further, the hardware-related information may include at least one of the size of the display screen, the optimum observation distance of the display device, and the image separation performance of the display device.
[0056]
Another aspect of the present invention relates to a stereoscopic image processing system that includes a first display device, a second display device, and a server connected via a network. The first display device transmits the user's appropriate parallax information, acquired on that device, to the server; the server receives the appropriate parallax information and records it in association with the user; and when the user requests output of image data on the second display device, that device reads the user's appropriate parallax information from the server, adjusts the parallax accordingly, and outputs the parallax images.
[0057]
The sixth group of the present invention is based on a technique for proposing a new expression method using a stereoscopic image.
[0058]
One embodiment of the present invention relates to a stereoscopic image processing apparatus that displays a stereoscopic image based on a plurality of viewpoint images corresponding to different parallaxes. It includes a recommended parallax acquisition unit that acquires the parallax range recommended when a stereoscopic image is displayed on the stereoscopic image display device, and a parallax control unit that sets the parallax so that the stereoscopic display image falls within the acquired recommended parallax range.
[0059]
The apparatus may further include an object designating unit that accepts from the user the designation of a predetermined object included in the stereoscopic image, and an optical axis crossing position setting unit that makes the optical axis crossing position associated with the viewpoint images correspond to the position of the designated object, setting the crossing position of the optical axes so that the designated object is represented near the position of the display screen on which the stereoscopic image is shown.
[0060]
The apparatus may further include a designation information adding unit that associates with the designated object optical-axis correspondence information stating that the object is associated with the optical axis crossing position and is to be represented near the position of the display screen.
[0061]
The optical axis crossing position setting unit may acquire the optical-axis correspondence information, associate the optical axis crossing position with the object described in that information, and represent the associated object near the position of the display screen on which the stereoscopic image is shown.
[0062]
The apparatus may further include an identification information acquisition unit that acquires identification information associated with the image data used to generate the stereoscopic image, indicating whether an object included in the stereoscopic image is to be expressed within the basic expression space, the space containing the objects to be stereoscopically displayed; the parallax control unit may reflect the acquired identification information when the object is expressed in the stereoscopic image.
[0063]
The identification information may also include information about the timing at which the object is to be expressed in the basic expression space, and the acquired timing may be reflected when the object is expressed in the stereoscopic image.
[0064]
Another aspect of the present invention relates to a stereoscopic image processing method. In this method, a predetermined object included in a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes is made selectable; when an object is selected, the optical axis crossing position associated with the viewpoint images is made to correspond to the position of the selected object, and the optical axis crossing position is made to substantially coincide with the position of the display screen on which the stereoscopic image is shown. According to this method, the display screen can be set at the boundary between far space and near space, and expressions such as an object moving toward the observer through the display screen become possible.
[0065]
The designated object may have a predetermined interface, and the optical axis crossing position setting unit may associate the optical axis crossing position with that interface. A stereoscopic image may also be generated starting from three-dimensional data, in which case it is easy to add various effects to it. For example, when an object is expressed as passing through the interface, that is, the display screen, an effect of deforming the display screen can be added.
[0066]
Still another embodiment of the present invention also relates to a stereoscopic image processing method. In this method, an interface that separates spaces is set, as part of the stereoscopic image, near the display screen on which a stereoscopic image generated based on a plurality of viewpoint images corresponding to different parallaxes is shown, and the stereoscopic image is expressed with the interface as the boundary between near space and far space. The interface may be a boundary surface between substances or a thin plate; thin plates include glass plates and sheets of paper.
[0067]
Still another embodiment of the present invention relates to a stereoscopic image processing method. In this method, the speed at which an object to be expressed in the basic expression space, the space containing the objects to be stereoscopically displayed in a stereoscopic image generated based on a plurality of viewpoint images corresponding to different parallaxes, moves in the near or far direction is changed.
[0068]
Still another embodiment of the present invention also relates to a stereoscopic image processing method. In this method, when a stereoscopic image is generated based on a plurality of viewpoint images corresponding to different parallaxes, the objects to be expressed in the basic expression space, which contains the objects to be stereoscopically displayed, are expressed so as to fit within a predetermined parallax range, while at least one of the frontmost surface and the rearmost surface of the basic expression space is set at a position where no object exists.
[0069]
Still another embodiment of the present invention also relates to a stereoscopic image processing method. In this method, when a stereoscopic image is generated based on a plurality of viewpoint images corresponding to different parallaxes, the parallax of an object to be expressed in the basic expression space is calculated not from the actual size of the object but from a size that includes an extension area in front of the object. Further, if the object, having been positioned at the frontmost surface of the basic expression space by virtue of the front extension area, then moves still farther forward, it may be expressed as moving within the front extension area.
[0070]
Still another embodiment of the present invention also relates to a stereoscopic image processing method. In this method, when a stereoscopic image is generated based on a plurality of viewpoint images corresponding to different parallaxes, the parallax of an object to be expressed in the basic expression space is calculated not from the actual size of the object but from a size that includes an extension area behind the object. Further, if the object, having been positioned at the rearmost surface of the basic expression space by virtue of the rear extension area, then moves still farther back, it may be expressed as moving within the rear extension area.
[0071]
The seventh group of the present invention is based on a technique for adjusting the parallax to be set according to the state of the image.
[0072]
One embodiment of the present invention relates to a stereoscopic image processing apparatus. This apparatus has a parallax control unit that, when a stereoscopic image is generated from three-dimensional data, controls the parallax so that it does not exceed the range within which the ratio of the width to the depth of an object expressed in the stereoscopic image is correctly perceived by human eyes.
[0073]
Another aspect of the present invention also relates to a stereoscopic image processing apparatus. This apparatus has a parallax control unit that, when a stereoscopic image is generated from a two-dimensional image to which depth information has been given, controls the parallax so that it does not exceed the range within which the ratio of the width to the depth of an object expressed in the stereoscopic image is correctly perceived by human eyes.
[0074]
Still another embodiment of the present invention also relates to a stereoscopic image processing apparatus. This apparatus includes an image determination unit that performs frequency analysis on a stereoscopic image to be displayed based on a plurality of viewpoint images corresponding to different parallaxes, and a parallax control unit that adjusts the amount of parallax according to the amount of high-frequency components found by the frequency analysis. The parallax control unit may increase the amount of parallax when the amount of high-frequency components is large.
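The “amount of high-frequency components” could, for example, be measured as the fraction of spectral energy beyond some cutoff radius in the 2-D Fourier transform, as sketched below; the cutoff value and the linear scaling rule are assumptions for illustration.

```python
import numpy as np

def high_freq_ratio(gray: np.ndarray, cutoff: float = 0.25) -> float:
    """Fraction of spectral energy at normalized frequency radius > cutoff."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
    h, w = gray.shape
    yy, xx = np.mgrid[:h, :w]
    r = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    return float(spec[r > cutoff].sum() / spec.sum())

def adjusted_parallax(base_px: float, gray: np.ndarray) -> float:
    # Finer detail permits somewhat larger parallax, as the text suggests.
    return base_px * (1.0 + high_freq_ratio(gray))
```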
[0075]
Still another embodiment of the present invention also relates to a stereoscopic image processing apparatus. This apparatus includes an image determination unit that detects the movement of a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes, and a parallax control unit that adjusts the amount of parallax according to the amount of movement. The parallax control unit may reduce the amount of parallax when the movement of the stereoscopic image is small.
[0076]
Still another embodiment of the present invention also relates to a stereoscopic image processing apparatus. When a parameter of the camera arrangement set for generating parallax images changes during the generation of a stereoscopic image from three-dimensional data, this apparatus controls the camera parameters so that the variation stays within a predetermined threshold. According to this apparatus, abrupt changes in parallax that would make the viewer of the stereoscopic image feel uncomfortable can be reduced.
[0077]
Still another embodiment of the present invention also relates to a stereoscopic image processing apparatus. When generating a stereoscopic moving image from a two-dimensional moving image to which depth information has been given, this apparatus controls the maximum and minimum depth values contained in the depth information, which change as the two-dimensional moving image progresses, so that their fluctuation stays within a predetermined threshold. According to this apparatus, abrupt changes in parallax that would make the viewer of the stereoscopic image feel uncomfortable can be reduced.
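Keeping the fluctuation within a threshold can be pictured as clamping the per-frame change of the depth extremes, as in this sketch; the threshold value and linear clamp are assumptions, not the patent's procedure.

```python
def smooth_depth_range(prev_range, new_range, max_step=0.05):
    """Limit the per-frame change of (depth_min, depth_max) to max_step so
    the resulting parallax cannot jump abruptly between frames."""
    def step(prev, new):
        return prev + max(-max_step, min(max_step, new - prev))
    return (step(prev_range[0], new_range[0]),
            step(prev_range[1], new_range[1]))

r = (0.0, 1.0)
for frame_range in [(0.0, 1.5), (0.2, 1.6)]:  # raw per-frame depth extremes
    r = smooth_depth_range(r, frame_range)     # tracked gradually instead
```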
[0078]
Still another embodiment of the present invention relates to a stereoscopic image processing method. In this stereoscopic image processing method, an appropriate parallax of a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes is set in units of scenes.
[0079]
Still another embodiment of the present invention relates to a stereoscopic image processing method. In this stereoscopic image processing method, an appropriate parallax of a stereoscopic image displayed based on a plurality of viewpoint images corresponding to different parallaxes is set at a predetermined time interval.
[0080]
Another aspect of the present invention relates to a stereoscopic image processing apparatus. This apparatus includes a camera arrangement setting unit that, when original data serving as the starting point of a stereoscopic image is input, sets the arrangement of a plurality of virtual cameras for generating a plurality of viewpoint images; an object area determination unit that determines whether a region containing no information about the objects to be displayed has arisen in a viewpoint image generated for one of the virtual cameras; and a camera parameter adjustment unit that, when such a region has arisen, adjusts at least one of the virtual cameras' angle of view, the camera interval, and the crossing position of the optical axes so that no region without object information remains.
[0081]
It should be noted that any combination of the above constituent elements, and conversions of the expression of the present invention among methods, apparatuses, systems, recording media, computer programs, and the like, are also effective as aspects of the present invention.
[0082]
DETAILED DESCRIPTION OF THE INVENTION
FIG. 1 shows the positional relationship between the user 10, the screen 12, and a reproduction object 14 displayed stereoscopically. The interocular distance of the user 10 is E, the distance between the user 10 and the screen 12 is D, and the displayed width of the reproduction object 14 is W. Since the reproduction object 14 is stereoscopically displayed, it contains pixels perceived nearer than the screen 12 and pixels perceived farther than it. Pixels without parallax are seen from both eyes at the same position on the screen and are therefore perceived on the screen 12 itself.
[0083]
FIG. 2 shows an imaging system that produces the ideal display of FIG. 1. The two cameras 22 and 24 are separated by E, their optical axes cross at distance D when viewing the real object 20 (this distance is called the optical axis crossing distance), and the angle of view is set so as to capture the same width as the screen 12. Photographing an object 20 of width W under these conditions yields parallax images from the two cameras; displaying them on the screen 12 of FIG. 1 realizes the ideal state of FIG. 1.
[0084]
FIG. 3 and FIG. 4 show the positional relationship of FIG. 2 scaled by A (A < 1) and by B (B > 1), respectively. Parallax images obtained with these positional relationships also realize the ideal state of FIG. 1. In other words, ideal stereoscopic display begins with keeping W:D:E constant. This relationship also serves as the basis for adding parallax.
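A small numerical check of this scale invariance, under a simple pinhole model (an illustration, not the patent's derivation): parallax measured as a fraction of the photographed width W is unchanged when E, D, the object distance, and W are all scaled by the same factor A.

```python
import math

def parallax_fraction(E, D, z, W):
    """Parallax of a point at depth z, as a fraction of photographed
    width W, for cameras separated by E with axes crossing at D."""
    return E * (z - D) / (z * W)

base = parallax_fraction(E=0.065, D=0.6, z=0.9, W=0.4)
A = 0.5
scaled = parallax_fraction(E=A * 0.065, D=A * 0.6, z=A * 0.9, W=A * 0.4)
print(math.isclose(base, scaled))  # True: constant W:D:E preserves the display
```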
[0085]
FIGS. 5 to 10 show the outline of processing until stereoscopic display is performed based on the three-dimensional data of the object 20 in the embodiment.
FIG. 5 shows a model coordinate system, that is, a coordinate space that each three-dimensional object 20 has. In this space, coordinates when the object 20 is modeled are given. Usually, the origin is at the center of the object 20.
[0086]
FIG. 6 shows the world coordinate system. The world space is a wide space in which a scene is formed by arranging objects 20, floors, and walls. The steps from the modeling of FIG. 5 through the arrangement in the world coordinate system of FIG. 6 can be regarded as the “construction of three-dimensional data”.
[0087]
FIG. 7 shows the camera coordinate system. Conversion to the camera coordinate system is performed by setting the camera 22 at an arbitrary angle of view from an arbitrary position in the world coordinate system. Camera position, direction, and angle of view are camera parameters. In the case of stereoscopic display, since the parameters are determined for two cameras, the camera interval and the optical axis crossing position are also determined. Also, the origin is moved so that the midpoint between the two cameras is the origin.
[0088]
FIGS. 8 and 9 show the perspective coordinate system. First, as shown in FIG. 8, the space to be displayed is clipped by the front projection plane 30 and the rear projection plane 32. As described later, one feature of the embodiment is that the plane containing the point of maximum near parallax is taken as the front projection plane 30 and the plane containing the point of maximum far parallax as the rear projection plane 32. After clipping, this view volume is converted into a rectangular parallelepiped as shown in FIG. 9. The processing in FIGS. 8 and 9 is also called projection processing.
[0089]
FIG. 10 shows the screen coordinate system. In the case of stereoscopic display, images from a plurality of cameras are converted into coordinate systems having screens, and a plurality of two-dimensional images, that is, parallax images are generated.
[0090]
FIGS. 11, 12, and 13 show configurations of the stereoscopic image processing apparatus 100 that differ in part. For convenience, they are also called the first, second, and third stereoscopic image processing apparatuses 100, respectively. They could be integrated into a single apparatus, but are divided into three here to keep the figures simple. The first stereoscopic image processing apparatus 100 is effective when the objects and space to be drawn are available from the three-dimensional data stage, so its main input is three-dimensional data. The second stereoscopic image processing apparatus 100 is effective for adjusting the parallax of a plurality of two-dimensional images to which parallax has already been given, that is, of existing parallax images, so it takes two-dimensional parallax images as input. The third stereoscopic image processing apparatus 100 realizes appropriate parallax by manipulating the depth information of an image with depth information, so its main input is images with depth information. These three inputs are collectively called “original data”.
[0091]
When the first to third stereoscopic image processing apparatuses 100 are mounted in integrated form, an “image format determination unit” may be provided as a preprocessing stage that determines whether the input is three-dimensional data, a parallax image, or an image with depth information, and then starts whichever of the first to third stereoscopic image processing apparatuses 100 is best suited to it.
[0092]
The first stereoscopic image processing apparatus 100 has “initial setting” and “automatic adjustment” functions for setting the stereoscopic effect of stereoscopic display. When the user designates the range of his or her appropriate parallax for a stereoscopically displayed image, the system acquires it; thereafter, whenever another stereoscopic image is to be displayed, a conversion process realizing this appropriate parallax is performed before display. Thus, with the first stereoscopic image processing apparatus 100, the user can enjoy stereoscopic display suited to himself or herself after performing the setting procedure, in principle, only once.
[0093]
The first stereoscopic image processing apparatus 100 further has a subsidiary “parallax correction” function that deliberately relaxes the parallax at the periphery of the image. As described above, the closer to the image edge, the more easily shifts between the viewpoint images are perceived as a “double image”, chiefly owing to mechanical errors such as warping of the display device's parallax barrier or screen. At the image periphery, therefore, various methods are implemented, such as 1) reducing both near and far parallax, 2) reducing near parallax while leaving far parallax as it is, and 3) shifting toward far parallax regardless of whether the parallax is near or far. This “parallax correction” function also exists in the third stereoscopic image processing apparatus 100, but its processing differs owing to the difference in input data.
[0094]
The first stereoscopic image processing apparatus 100 includes a stereoscopic effect adjusting unit 112 that adjusts the stereoscopic effect based on the user's response to a stereoscopically displayed image; a parallax information holding unit 120 that stores the appropriate parallax specified by the stereoscopic effect adjusting unit 112; a parallax control unit 114 that reads the appropriate parallax from the parallax information holding unit 120 and generates parallax images having the appropriate parallax from the original data; an information acquisition unit 118 that acquires the hardware information of the display device and also the stereoscopic display method used; and a format conversion unit 116 that changes the format of the parallax images generated by the parallax control unit 114 based on the information acquired by the information acquisition unit 118. The original data is simply called three-dimensional data, but strictly speaking it is object and space data described in the world coordinate system.
[0095]
Examples of information acquired by the information acquisition unit 118 include the number of viewpoints for stereoscopic display, the method of the stereoscopic display device (such as space division or time division), whether shutter glasses are used, the arrangement of viewpoint images in a multi-view type, whether the parallax image contains viewpoint-image arrangements with reversed parallax, and the result of head tracking. Exceptionally, the head-tracking result alone is input to the camera arrangement determination unit 132 via a route not shown and is processed there.
[0096]
In hardware, the above configuration can be realized by the CPU, memory, and other LSIs of any computer; in software, by a program having GUI, parallax control, and other functions. The figure depicts functional blocks realized by their cooperation. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms: by hardware alone, by software alone, or by a combination of the two. The same applies to the configurations that follow.
[0097]
The stereoscopic effect adjusting unit 112 includes an instruction acquisition unit 122 and a parallax specifying unit 124. The instruction acquisition unit 122 acquires the range of appropriate parallax that the user specifies for the stereoscopically displayed image; based on that range, the parallax specifying unit 124 specifies the appropriate parallax for the user's use of this display device. The appropriate parallax is expressed in a format that does not depend on the hardware of the display device, and realizing it enables stereoscopic vision suited to the user's physiology.
[0098]
The parallax control unit 114 includes a camera temporary arrangement unit 130 that provisionally sets camera parameters; a camera arrangement determination unit 132 that corrects the provisional camera parameters according to the appropriate parallax; an origin moving unit 134 that, once the camera parameters are determined, moves the origin to the midpoint between the cameras; a projection processing unit 138 that performs the projection processing described above; and a two-dimensional image generation unit 142 that performs the conversion to the screen coordinate system after projection processing and generates the parallax images. In addition, a distortion processing unit 136 that performs spatial distortion conversion (hereinafter also simply called distortion conversion) is placed between the camera temporary arrangement unit 130 and the camera arrangement determination unit 132 to reduce parallax at the image periphery as necessary; it reads a correction map, described later, from the correction map holding unit 140 and uses it.
[0099]
If it is necessary to adjust the display device for stereoscopic display, a GUI (not shown) for that purpose may be added. With this GUI, processing such as determining the optimum display position by slightly shifting the entire displayed parallax image vertically and horizontally may be performed.
[0100]
The second stereoscopic image processing apparatus 100 in FIG. 12 takes a plurality of parallax images as input, here also simply called the input image. The second stereoscopic image processing apparatus 100 reads the appropriate parallax previously acquired by the first stereoscopic image processing apparatus 100, adjusts the parallax of the input image to fall within that range, and outputs the result. In this sense the second stereoscopic image processing apparatus 100 has an “automatic adjustment” function for parallax. In addition, it provides a GUI for when the user wishes to change the stereoscopic effect during actual stereoscopic display, that is, a “manual adjustment” function that changes the parallax according to the user's instruction.
[0101]
The parallax of an already generated parallax image normally cannot be changed, but the second stereoscopic image processing apparatus 100 can change the stereoscopic effect to a practically sufficient degree by shifting the positions at which the viewpoint images constituting the parallax image are combined. The second stereoscopic image processing apparatus 100 thus provides a good stereoscopic effect adjustment capability even when the input data cannot be traced back to three-dimensional data. The following mainly describes the differences from the first stereoscopic image processing apparatus 100.
[0102]
The stereoscopic effect adjusting unit 112 is used for manual adjustment. The instruction acquisition unit 122 realizes, for example, on-screen numerical input such as “+n” and “-n”, and the parallax specifying unit 124 treats the entered value as the amount of parallax change. Several relations between the number and the resulting stereoscopic effect are possible. For example, “+n” may instruct a stronger and “-n” a weaker stereoscopic effect, with larger n meaning a larger change; alternatively, “+n” may instruct moving objects in the near direction and “-n” in the far direction. As yet another method, the value of n need not be specified: only “+” and “-” buttons are displayed, and the parallax changes with each click.
[0103]
The second stereoscopic image processing apparatus 100 includes a parallax amount detection unit 150 and a parallax control unit 152. When the input is a set of parallax images, the parallax amount detection unit 150 inspects their header areas and, if a parallax amount expressed in pixels is described there (in particular, the maximum near parallax and maximum far parallax in pixels), acquires it. If no parallax amount is described, the matching unit 158 identifies it by detecting corresponding points between the parallax images with a known technique such as block matching. The matching unit 158 may process only important regions such as the image center, or may focus on detecting the most important quantity, the maximum near parallax in pixels. The detected parallax amount is sent to the parallax control unit 152 as a number of pixels.
[0104]
In general, when stereoscopic images are displayed on the display screen of a mobile phone, individual differences in stereoscopic effect are small, and the user may well find it troublesome to input an appropriate parallax. The same inconvenience can arise with a stereoscopic display device used by an unspecified number of users. In such cases, the appropriate parallax range may be determined by the manufacturer of the stereoscopic image display device or by the creator of the content to be displayed on it, or may be set by some other method such as following general guidelines, for example guidelines and standards established by industry or academic groups concerned with stereoscopic images. If there is a guideline such as “the maximum parallax is about 20 mm on a 15-inch display screen”, the processing may follow that guideline or apply a correction based on it. In this case the stereoscopic effect adjusting unit 112 is unnecessary.
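As a simple illustration, a guideline stated in millimetres can be converted into a pixel count from the physical width and resolution of the screen at hand. The function and the example numbers below are hypothetical.

    def guideline_limit_parallax(parallax_mm, screen_width_mm, horizontal_pixels):
        # Convert a guideline value such as "about 20 mm" into a pixel count.
        return parallax_mm * horizontal_pixels / screen_width_mm

    # A 15-inch 4:3 screen is roughly 305 mm wide; at 1024 horizontal pixels,
    # a 20 mm guideline corresponds to about 67 pixels.
    print(guideline_limit_parallax(20, 305, 1024))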
[0105]
The position shift unit 160 of the parallax control unit 152 shifts the synthesis positions of the viewpoint images constituting the parallax image in the horizontal direction so that the parallax amount between the viewpoint images becomes the appropriate parallax. The shift may be applied to any one of the viewpoint images. The position shift unit 160 also has another operation mode: when the user instructs an increase or decrease of parallax via the stereoscopic effect adjusting unit 112, the image shift position is simply changed according to that instruction. That is, the position shift unit 160 has two functions, automatic adjustment to the appropriate parallax and manual adjustment by the user.
[0106]
The parallax writing unit 164 writes the parallax amount, as a number of pixels, into the header area of one of the plurality of viewpoint images constituting the parallax image, for use by the above-described parallax amount detection unit 150 or for other purposes. The image edge adjustment unit 168 fills in the missing pixels generated at the image edges by the shift of the position shift unit 160.
[0107]
The third stereoscopic image processing apparatus 100 in FIG. 13 receives an image with depth information as input. The third stereoscopic image processing apparatus 100 adjusts the depth so that an appropriate parallax is realized. It also provides the “parallax correction” function described above. The distortion processing unit 174 of the parallax control unit 170 performs distortion conversion according to the correction map stored in the correction map holding unit 176, in the manner described later. The depth information and the image after distortion conversion are input to the two-dimensional image generation unit 178, where a parallax image is generated. Unlike the two-dimensional image generation unit 142 of the first stereoscopic image processing apparatus 100, the appropriate parallax is taken into account here. Since an image with depth information is itself two-dimensional, the two-dimensional image generation unit 178 internally has a function similar to that of the position shift unit 160 of the second stereoscopic image processing apparatus 100, although not shown in the figure, and shifts the pixels in the image horizontally to generate a stereoscopic effect. At this time, the appropriate parallax is realized by processing described later.
[0108]
The processing operation and the principle of each part of each stereoscopic image processing apparatus 100 in the above configuration are as follows.
FIGS. 14A and 14B show the left eye image 200 and the right eye image 202, respectively, displayed in the appropriate-parallax specifying process by the stereoscopic effect adjusting unit 112 of the first stereoscopic image processing apparatus 100. In each image, five black circles are displayed with different parallaxes: the nearer the top, the larger the parallax in the near direction, and the nearer the bottom, the larger the parallax in the far direction.
[0109]
FIG. 15 schematically shows the senses of distance the user 10 perceives when these five black circles are displayed. The user 10 responds whether the range of these five senses of distance is “appropriate”, and the instruction acquisition unit 122 acquires this response. In this figure, five black circles with different parallaxes are displayed simultaneously or in turn, and the user 10 inputs whether each parallax is acceptable. In FIG. 16, on the other hand, only a single black circle is displayed, but its parallax is changed continuously, and the user 10 responds when the allowed limit is reached in each of the far and near directions. The response may be made using known techniques such as ordinary key operation, mouse operation, or voice input.
[0110]
Further, the parallax may be determined by a simpler method, and likewise the setting range of the basic expression space may be determined simply. FIG. 89 shows a table used for such simple determination of the parallax and the basic expression space. The setting range of the basic expression space is divided into four ranks, A to D, from a setting that enlarges the near space side to a setting that uses only the far space side, and the parallax is further divided into five ranks, 1 to 5. If, for example, the user prefers the strongest stereoscopic effect with the most pronounced pop-out, the rank is set to 5A. It is not always necessary to determine the rank while confirming the stereoscopic display; only buttons for determining the rank may be displayed, with a button beside them for confirming the stereoscopic effect, so that pressing it displays a confirmation image.
[0111]
In either case of FIGS. 15 and 16, the instruction acquisition unit 122 can acquire the appropriate parallax as a range, whose ends are the near and far limit parallaxes. The near maximum parallax is the parallax corresponding to the greatest allowed proximity of the point that appears nearest to the user, and the far maximum parallax is the parallax corresponding to the greatest allowed distance of the point that appears farthest from the user. In general, however, it is the near maximum parallax that requires care, because of the user's physiology, and hereinafter “limit parallax” may refer to the near maximum parallax alone.
[0112]
FIG. 17 shows the principle of adjusting the parallax between two viewpoints when an image to be displayed stereoscopically is derived from three-dimensional data. First, the limit parallaxes determined by the user are converted into subtended angles of the temporarily placed cameras. As shown in the figure, the near and far limit parallaxes can be expressed as the pixel counts M and N, and since the angle of view θ of the camera corresponds to the number of horizontal pixels L of the display screen, the near maximum subtended angle φ and the far maximum subtended angle ψ are expressed by θ, M, N, and L as follows.
tan(φ/2) = M tan(θ/2)/L
tan(ψ/2) = N tan(θ/2)/L
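In code, these two relations amount to a direct conversion of the pixel counts M and N into the angles φ and ψ. A minimal sketch follows; the function name is illustrative and angles are taken in radians.

    import math

    def limit_parallax_angles(M, N, theta, L):
        # tan(phi/2) = M tan(theta/2) / L ; tan(psi/2) = N tan(theta/2) / L
        phi = 2.0 * math.atan(M * math.tan(theta / 2.0) / L)
        psi = 2.0 * math.atan(N * math.tan(theta / 2.0) / L)
        return phi, psi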
Next, this information is applied to the extraction of the two viewpoint images in the three-dimensional space. As shown in FIG. 18, a basic expression space T (whose depth is also denoted T) is determined first. Here it is assumed that the basic expression space T is determined from restrictions on the arrangement of objects. Let S be the distance from the front projection plane 30, which is the front surface of the basic expression space T, to the camera placement plane, that is, the viewpoint plane 208. T and S can be specified by the user. There are two viewpoints; let D be the distance from the viewpoint plane 208 to the optical axis intersecting plane 210, and let A be the distance between the optical axis intersecting plane 210 and the front projection plane 30.
[0113]
Next, when the near limit parallax and the far limit parallax in the basic expression space T are P and Q, respectively, the relations
E : S = P : A
E : (S + T) = Q : (T − A)
hold, where E is the distance between the viewpoints. The point G, a pixel with no parallax, lies at the position where the optical axes K2 of the two cameras intersect on the optical axis intersecting plane 210, and the optical axis intersecting plane 210 corresponds to the position of the screen surface. The rays K1 that produce the near maximum parallax P intersect on the front projection plane 30, and the rays K3 that produce the far maximum parallax Q intersect on the rear projection plane 32.
[0114]
P and Q are expressed using φ and ψ, as shown in FIG. 19, as
P = 2(S + A) tan(φ/2)
Q = 2(S + A) tan(ψ/2)
As a result,
E = 2(S + A) tan(θ/2) · (SM + SN + TN)/(LT)
A = STM/(SM + SN + TN)
are obtained. Since S and T are known, A and E are determined automatically from these expressions; the optical axis crossing distance D and the inter-camera distance E are thus fixed, and the camera parameters are determined. If the camera arrangement determining unit 132 arranges the cameras according to these parameters, the projection processing unit 138 and the two-dimensional image generation unit 142 can then process the images from the respective cameras independently, and a parallax image having an appropriate parallax can be generated and output. As described above, E and A contain no hardware information, so an expression format independent of hardware is realized.
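Because the derivation reduces to closed-form expressions, the camera parameters can be computed directly, for instance as in the following sketch; the function name is illustrative, and the angle is taken in radians.

    import math

    def camera_parameters(S, T, M, N, theta, L):
        # A: distance from the optical axis crossing plane to the front
        # projection plane; E: inter-camera distance; D: optical axis
        # crossing distance. All follow the expressions in the text.
        k = S * M + S * N + T * N
        A = S * T * M / k
        E = 2.0 * (S + A) * math.tan(theta / 2.0) * k / (L * T)
        D = S + A
        return A, E, D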
[0115]
Thereafter, even when another image is displayed stereoscopically, the appropriate parallax is realized automatically if the cameras are arranged so as to preserve A (or D) and E. The entire process from specifying the appropriate parallax to ideal stereoscopic display can thus be automated, so if this function is provided as a software library, programmers who create content or applications need not be aware of stereoscopic display programming. When L, M, and N are expressed in pixels, L indicates the display range, so whether the display occupies the full screen or only part of it can be indicated by L; L, too, is a parameter independent of hardware.
[0116]
FIG. 20 shows a four-lens arrangement with four cameras 22, 24, 26, and 28. Strictly, the above A and E should be determined so that an appropriate parallax is obtained between each pair of adjacent cameras, such as the first camera 22 and the second camera 24. In practice, substantially the same effect is obtained if the A and E determined between the mutually adjacent second camera 24 and third camera 26 are used between the other cameras as well.
[0117]
Note that T is a restriction on the arrangement of objects, but it may also be determined by a program as the size of a basic three-dimensional space. In that case, objects may be arranged only within the basic expression space T throughout the program, or, for effective display, an object may sometimes be given a parallax that makes it jump intentionally out of this space.
[0118]
As another example, T may be determined from the coordinates of the nearest and farthest objects in the three-dimensional space; if this is done in real time, objects can always be placed within the basic expression space T. As an exception to always keeping objects in the basic expression space T, a brief excursion can be allowed by operating under the relaxed condition that only the average position over a certain period need remain in the basic expression space T. Furthermore, the objects that define the basic expression space T may be limited to static objects, in which case a dynamic object may exceptionally protrude from the basic expression space T. As yet another example, a conversion that reduces a space in which objects are already arranged to the width T of the basic expression space may be performed, or combined with the operations above. A method for intentionally displaying an object so that it jumps out of the basic expression space is described later.
[0119]
In addition, if the stereoscopic effect adjusting unit 112 of the first stereoscopic image processing apparatus 100 displays an image that readily produces double images, the limit parallax is determined conservatively small, and the frequency of double images in other images can thereby be reduced. Images with strong contrast in color or brightness between object and background are known to produce double images readily, and such images are suitable at the stage of identifying the limit parallax, that is, at initial setting.
[0120]
FIGS. 21 to 36 show the processing by the distortion processing unit 136 of the first stereoscopic image processing apparatus 100 and its principle.
FIG. 21 conceptually shows an example of the correction map stored in the correction map holding unit 140 of the first stereoscopic image processing apparatus 100. This map corrects the parallax directly; the whole map corresponds to the parallax image, and the parallax becomes smaller toward the peripheral part. FIG. 22 shows the change in parallax that results when the camera arrangement determining unit 132 manipulates the camera parameters on an instruction from the distortion processing unit 136 following this correction map. “Normal parallax” is given when the front direction is viewed from the left and right viewpoint positions of the two cameras, while “small parallax” is given when a direction away from the front is viewed. In practice, the camera arrangement determining unit 132 decreases the camera interval toward the periphery.
[0121]
FIG. 23 shows another example in which the camera arrangement determining unit 132 changes the parallax by changing the camera arrangement in accordance with an instruction from the distortion processing unit 136. Here, only the left camera of the two is moved, and the parallax changes as “normal parallax” → “medium parallax” → “small parallax” toward the periphery of the image. This method has a lower calculation cost than that of FIG. 22.
[0122]
FIG. 24 shows another example of a correction map. This map also changes the parallax: the vicinity of the center of the image is left at normal parallax, and the parallax is gradually reduced only in the remaining parallax correction area. FIG. 25 shows the camera positions that the camera arrangement determining unit 132 sets according to this map; only when the viewing direction deviates greatly from the front is the position of the left camera shifted toward the right camera, giving “small parallax”.
[0123]
FIG. 26 conceptually shows another example of a correction map. This map corrects the sense of distance from the viewpoint to the object, and to realize it the camera arrangement determining unit 132 adjusts the optical axis crossing distance of the two cameras. If the optical axis crossing distance is reduced toward the periphery of the image, objects there appear relatively receded in the far direction, which serves particularly to reduce the near parallax. To reduce the optical axis crossing distance, the camera arrangement determining unit 132 may change the optical axis direction of the cameras, or change the direction of only one of them. FIG. 27 shows the change of the optical axis crossing position, or optical axis crossing plane 210, when a two-dimensional image is generated with the map of FIG. 26; the closer to the edge of the image, the closer the optical axis crossing plane 210 is to the cameras.
[0124]
FIG. 28 shows another correction map related to the sense of distance, and FIG. 29 shows how the camera arrangement determining unit 132 changes the optical axis crossing plane 210 in accordance with an instruction from the distortion processing unit 136 following the map of FIG. 28. In this example, objects in the central area of the image are placed at their normal positions without correction, and the positions of objects are corrected only in the peripheral area. Accordingly, in FIG. 29, the optical axis crossing plane 210 does not change near the center of the image and approaches the cameras beyond a certain point. In FIG. 29, only the direction of the left camera is changed.
[0125]
FIGS. 30(a) to 30(f) show another distortion conversion by the distortion processing unit 136. Unlike the previous examples, instead of changing the camera positions, the three-dimensional space itself is distorted directly in the camera coordinate system. In FIGS. 30(a) to 30(f), the rectangular area is a top view of the original space and the hatched area is a top view of the space after conversion. For example, the point U in the original space of FIG. 30(a) is moved in the far direction by the conversion. In FIG. 30(a), the space is compressed in the depth direction, along the arrows, toward the peripheral part, so that a point such as W comes to be perceived at a distance closer to a certain constant sense of distance. As a result, the sense of distance becomes uniform at the periphery of the image and no object appears especially near there; this resolves the double-image problem and suits the user's physiology.
[0126]
FIGS. 30(b), 30(c), 30(d), and 30(e) all show modified examples of conversions that bring the sense of distance toward a constant value at the periphery of the image, and FIG. 30(f) shows an example in which all points are converted in the far direction.
[0127]
FIG. 31 shows the principle for realizing the conversions of FIG. 30. The rectangular parallelepiped space 228 contains the spaces subject to the projection processing of the first camera 22 and the second camera 24. The view volume of the first camera 22 is determined by its angle of view, the front projection plane 230, and the rear projection plane 232, and that of the second camera 24 by its angle of view, the front projection plane 234, and the rear projection plane 236. The distortion processing unit 136 applies the distortion conversion to the rectangular parallelepiped space 228, with the origin at its center. In the multi-lens case, the conversion principle is the same with more cameras added.
[0128]
FIG. 32 shows an example of the distortion conversion, which employs a reduction conversion in the Z direction only. In practice, the processing is applied to each object in the space. FIG. 33 expresses this conversion as a parallax correction map: the parallax is normal near the center, decreases as the absolute value of X increases, and vanishes at X = ±A. Since the reduction is performed only in the Z direction, the conversion formula is as follows.
[Expression 1]
X′ = X, Y′ = Y, Z′ = Sz × Z
Here Sz is the reduction ratio derived below.
The conversion is described with reference to FIG. 34. First, consider the range X ≧ 0 and Z ≧ 0. When the point (X0, Y0, Z0) is moved to the point (X0, Y0, Z1) by the reduction process, the reduction ratio Sz is
Sz = Z1/Z0 = Z2/B
The coordinates of C are (X0, Y0, 0) and the coordinates of D are (X0, Y0, B).
E is the intersection of the straight line through C and D with the boundary plane. Writing its coordinates as (X0, Y0, Z2), Z2 is obtained as follows.
[0129]
Z = B − X × B/A (plane)
X = X0, Y = Y0 (straight line)
Z2 = B − X0 × B/A
Therefore,
Sz = Z2/B = 1 − X0/A
and, for general X,
Sz = 1 − X/A
If the same calculation is performed for the other ranges of X and Z, the following results are obtained and the conversion can be verified.
[0130]
When X ≧ 0: Sz = 1 − X/A
When X < 0: Sz = 1 + X/A
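As an illustration, the verified conversion amounts to only a few lines of code. The sketch below assumes points lying in the space with |X| ≦ A; the function name is illustrative.

    def reduce_z(point, A):
        # Reduction conversion in the Z direction only: Sz = 1 - |X|/A.
        X, Y, Z = point
        Sz = 1.0 - X / A if X >= 0 else 1.0 + X / A
        return (X, Y, Sz * Z)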
FIG. 35 shows another example of the distortion conversion. Strictly speaking, taking into account that the space is projected radially from the camera, reduction processing in the X-axis and Y-axis directions is combined as well. Here, the midpoint of the two cameras is taken as the representative camera position, and each point is reduced along the line connecting it to that position. The conversion formula is as follows.
[Expression 2]
X′ = Sx × X, Y′ = Sy × Y, Z′ = Sz × Z
FIG. 36 verifies this conversion. Again, consider the range X ≧ 0 and Z ≧ 0. When the point (X0, Y0, Z0) is moved to the point (X1, Y1, Z1) by the reduction process, the reduction ratios are
Sx = X1/X0, Sy = Y1/Y0, Sz = Z1/Z0
Since E is the intersection of a plane and a straight line, Sx, Sy, and Sz can be obtained in the same manner as above.
[0131]
If the converted space is represented by a set of planes as above, the processing changes discontinuously at the boundaries where the planes meet, and this may in some cases cause a sense of incongruity. In that case the boundary may be formed by curved surfaces, or the space may be bounded by a single curved surface; the calculation merely changes to one that finds the intersection E of a curved surface and a straight line.
[0132]
In the above example, the reduction ratio is the same everywhere on the straight line CD, but weighting may be applied; for example, a weighting function G(L) of the distance L from the camera may be applied to Sx, Sy, and Sz.
[0133]
FIGS. 37 to 40 show the processing by the distortion processing unit 174 of the third stereoscopic image processing apparatus 100 and its principle.
FIG. 37 shows the depth map of an image with depth information input to the third stereoscopic image processing apparatus 100. Here the depth range is assumed to take values from K1 to K2, with near depth expressed as positive and far depth as negative.
[0134]
FIG. 38 shows the relationship between the original depth range 240 and the converted depth range 242: the depth approaches a constant value toward the periphery of the image. The distortion processing unit 174 converts the depth map in accordance with this correction. The same applies when parallax is provided in the vertical direction. Since this conversion, too, is a reduction only in the Z direction, it can be expressed by the following equation.
[Equation 3]
Z′ = Sz × Z
Sz is classified according to the value of X:
When X ≧ 0: Sz = 1 − 2X/L
When X < 0: Sz = 1 + 2X/L
By this conversion, a new depth map having the new elements shown in FIG. 39 is generated.
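A sketch of this depth-map conversion for a map stored as a two-dimensional array follows. Taking X as the signed horizontal distance from the image center and L as the image width is an interpretation, not stated above.

    import numpy as np

    def distort_depth_map(depth):
        # Reduce depth toward zero as |X| grows: Sz = 1 - 2|X|/L,
        # which covers both branches of the classification above.
        h, w = depth.shape
        x = np.arange(w) - (w - 1) / 2.0      # signed X per column
        Sz = 1.0 - 2.0 * np.abs(x) / w
        return depth * Sz[np.newaxis, :]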
[0135]
FIG. 40 shows another distortion conversion principle for the depth map. More strictly, since the space is observed radially from the user 10, reduction processing in the X-axis and Y-axis directions is combined as well; here the center between the two eyes is taken as the observation position. The specific processing is the same as in the case of FIG. 35. The original depth map holds only Z values, but when this calculation is performed, X and Y values are also obtained. The Z value is converted into a pixel shift amount in the X or Y direction, and the X and Y values may be held as offset values for those shifts.
[0136]
In either case, the original image, with its depth map converted by the distortion processing unit 174, is input to the two-dimensional image generation unit 178, where synthesis processing shifted in the horizontal direction is performed so as to obtain an appropriate parallax. Details are described later.
[0137]
FIGS. 41 to 51 show the processing of the position shift unit 160 of the second stereoscopic image processing apparatus 100 and of the two-dimensional image generation unit 178 of the third stereoscopic image processing apparatus 100, which can be understood as an extension of it.
FIG. 41 shows the principle by which the position shift unit 160 shifts the synthesis position of two parallax images. As shown in the figure, the positions of the right eye image R and the left eye image L coincide in the initial state. When the left eye image L is shifted relatively to the right, as in the upper part of the figure, the parallax at near points increases and the parallax at far points decreases. Conversely, when the left eye image L is shifted relatively to the left, as in the lower part of the figure, the parallax at near points decreases and the parallax at far points increases.
[0138]
This is the essence of parallax adjustment by shifting parallax images. One of the images may be shifted, or both may be shifted in opposite directions. It follows from this principle that the adjustment is applicable to any stereoscopic display method that uses parallax, with or without glasses, and similar processing is possible for multi-view images and for vertical parallax.
[0139]
FIG. 42 shows the shift processing at the pixel level. The left eye image 200 and the right eye image 202 both contain a first square 250 and a second square 252. The first square 250 has near parallax, which, expressed as a positive number, is “6 pixels”; the second square 252 has far parallax, which, expressed as a negative number, is “−6 pixels”. Denote these parallax amounts F2 and F1, respectively.
[0140]
On the other hand, suppose the appropriate parallax of the display device owned by the user is J1 to J2. The position shift unit 160 shifts the synthesis start positions of the two images by (J2 − F2) pixels. FIG. 43 shows the state after the shift. With F1 = −6 and F2 = 6, and J1 = −5 and J2 = 4, the synthesis start positions are shifted by −2 pixels, that is, in the direction that moves the whole image in the far direction. As shown in FIG. 43, the final parallax amounts are E1 = −8 and E2 = 4, so at least the limit parallax in the near direction is observed. In general, double images in the near direction are more disturbing than those in the far direction, and the subject is usually placed in the near direction, so it is desirable to keep at least the near-direction parallax within the limit. A processing example follows; a sketch of this decision as code appears after the list.
1. When the near point is outside the limit parallax and the far point is within it, the near point is shifted to the limit parallax point; however, the processing is stopped if the far-point parallax reaches the interocular distance.
2. When both the near point and the far point are outside the limit parallax, the near point is shifted to the limit parallax point; however, the processing is stopped if the far-point parallax reaches the interocular distance.
3. When both the near point and the far point are within the limit parallax, no processing is performed.
4. When the near point is within the limit parallax and the far point is outside it, the far point is shifted to the limit parallax point; however, if the near point reaches its limit parallax point during the process, the processing is stopped.
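The four cases can be coded compactly. The sketch below takes the current far/near maximum parallaxes F1 and F2, the appropriate range J1 to J2, and the interocular distance e, all in pixels with near positive and far negative; the function name and the exact clamping choices are illustrative assumptions.

    def synthesis_shift(F1, F2, J1, J2, e):
        near_out = F2 > J2
        far_out = F1 < J1
        if not near_out and not far_out:
            return 0                  # case 3: both already within limits
        if near_out:                  # cases 1 and 2: bring the near point in
            shift = J2 - F2           # negative; whole image moves far-ward
            if F1 + shift < -e:       # stop if the far-point parallax would
                shift = -e - F1       # exceed the interocular distance
            return shift
        shift = J1 - F1               # case 4: bring the far point in
        if F2 + shift > J2:           # stop when the near point reaches its
            shift = J2 - F2           # limit parallax point
        return shift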
[0141]
FIG. 44 shows the image-edge loss caused by shifting the synthesis position. Here the shift amount between the left eye image 200 and the right eye image 202 is one pixel, so a missing portion 260 one pixel wide appears at the right end of the left eye image 200 and at the left end of the right eye image 202. The image edge adjustment unit 168 then compensates the number of horizontal pixels by duplicating the pixel line at the image edge, as shown in the figure.
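A sketch of this compensation by duplication, for a strip of a given width, follows; which edge is missing depends on the shift direction, and the array layout is an assumption.

    import numpy as np

    def fill_missing_edge(image, side, width=1):
        # Duplicate the last valid pixel line into the missing strip.
        out = image.copy()
        if side == "right":
            out[:, -width:] = out[:, -width - 1:-width]
        else:  # "left"
            out[:, :width] = out[:, width:width + 1]
        return out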
[0142]
As another method, the missing portion 260 may be displayed in a specific color such as black or white, or may be hidden. Clipping or padding may also be performed so that the result matches the size of the initial image. Alternatively, the initial image may be made larger than the actual display size in advance so that the missing portion 260 does not affect the display.
[0143]
FIG. 45 shows the flow of manual parallax adjustment by the second stereoscopic image processing apparatus 100. As shown in the figure, left and right images are first created manually as a parallax image (S10) and distributed over a network or another route (S12). The second stereoscopic image processing apparatus 100 receives them (S14) and, in the example of this figure, first synthesizes and displays the images as they are, without shifting (S16); that is, the case is considered in which the appropriate parallax has not yet been acquired or the position shift unit 160 is not operated. The user then instructs parallax adjustment on the stereoscopically displayed parallax image via the stereoscopic effect adjusting unit 112, and the position shift unit 160 receives this in the “manual adjustment mode”, adjusts the image synthesis position, and displays the result (S18). S10 and S12 form the procedure 270 of the image creator, and the steps from S14 onward form the procedure 272 of the second stereoscopic image processing apparatus 100. Although not shown, if the shift amount is recorded in the header and referred to in subsequent syntheses, the labor of readjustment can be saved.
[0144]
FIG. 46 shows the flow of automatic adjustment by the second stereoscopic image processing apparatus 100. The left-and-right image generation (S30) and image distribution (S32) in the image creator's procedure 270 are the same as in FIG. 45, and the image reception (S34) in the procedure 272 of the second stereoscopic image processing apparatus 100 is likewise the same. Next, the parallax already present between the parallax images, in particular the maximum parallax, is detected by the matching unit 158 of the parallax amount detection unit 150 (S36), while the appropriate parallax, in particular the limit parallax, is acquired from the parallax information holding unit 120 (S38). Thereafter, the position shift unit 160 shifts the synthesis position of the images so as to satisfy the limit parallax by the processing described above (S40), and stereoscopic display is performed through the processing of the parallax writing unit 164, the image edge adjustment unit 168, and the format conversion unit 116 (S42).
[0145]
FIG. 47 shows the flow of yet another automatic adjustment by the second stereoscopic image processing apparatus 100. In the image creator's procedure 270, after the left and right images are generated (S50), the maximum parallax is detected at that point (S52) and recorded in the header of one of the viewpoint images of the parallax image (S54). The detection may be performed by corresponding-point matching, but when the creator generates the parallax image manually, the maximum parallax is naturally known during editing and may simply be recorded. The image is then distributed (S56).
[0146]
On the other hand, in the procedure 272 of the second stereoscopic image processing apparatus 100, the image reception (S58) is the same as in FIG. 46. Next, the maximum parallax described above is read from the header by the header inspection unit 156 of the parallax amount detection unit 150 (S60), while the limit parallax is acquired from the parallax information holding unit 120 (S62); the subsequent steps S64 and S66 are the same as steps S40 and S42 of FIG. 46. With this method there is no need to calculate the maximum parallax, an appropriate stereoscopic effect can be realized for the image as a whole, and since the shift amount can be recorded in the header there is no risk of damaging the original image itself. Although not shown, if the maximum parallax detected in FIG. 46 is recorded in the header, subsequent processing can follow the procedure of FIG. 47.
[0147]
Note that the same processing can be performed in the multi-view case: it suffices to apply it to the parallax amounts between adjacent viewpoint images. In practice, however, the shift amount of the synthesis position may be determined by regarding the largest of the parallaxes among the plural viewpoint images as the “maximum parallax” of the whole set.
[0148]
The header information may be placed in at least one of the multi-viewpoint images; when the multi-viewpoint images are combined into one image, the header of that combined image may be used.
[0149]
In some cases, images that have already been combined are distributed. In that case, the images may first be separated by an inverse conversion, the synthesis-position shift amount calculated, and the images recombined, or any processing that yields the same result may be performed.
[0150]
FIGS. 48 to 51 show the processing that shifts the synthesis position for an image with depth information; it is performed by the two-dimensional image generation unit 178 of the third stereoscopic image processing apparatus 100. FIGS. 48 and 49 show, respectively, the plane image 204 and the depth map that constitute the image with depth information. Here, near depth is expressed as positive and far depth as negative. The objects are a first square 250, a second square 252, and a third square 254, with depths “4”, “2”, and “−4”, respectively: the first square 250 is placed nearest, the second square 252 at an intermediate near position, and the third square 254 farthest.
[0151]
Based on the original plane image 204, the two-dimensional image generation unit 178 first shifts each pixel by the value of the depth map to generate the image for the other viewpoint. If the left eye image is the reference, the original plane image 204 becomes the left eye image as it is. In the right eye image 202, the first square 250 is shifted 4 pixels to the left, the second square 252 is shifted 2 pixels to the left, and the third square 254 is shifted 4 pixels to the right, as shown in FIG. 50. The image edge adjustment unit 168 fills the pixel-information missing portion 260 caused by the movement of the objects with neighboring pixels judged to be background with parallax “0”.
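A minimal sketch of this viewpoint generation for a grayscale plane image and an integer depth map follows. The hole-filling here simply copies the nearest already-written pixel on the left, a simplification of filling from zero-parallax background; all names are illustrative.

    import numpy as np

    def generate_right_eye(plane, depth):
        # Shift each pixel by its depth value: near (positive) moves left,
        # far (negative) moves right, as in FIG. 50.
        h, w = plane.shape
        right = np.zeros_like(plane)
        filled = np.zeros((h, w), dtype=bool)
        for y in range(h):
            for x in range(w):
                nx = x - int(depth[y, x])
                if 0 <= nx < w:
                    right[y, nx] = plane[y, x]
                    filled[y, nx] = True
        for y in range(h):            # fill holes left by moved objects
            for x in range(1, w):
                if not filled[y, x]:
                    right[y, x] = right[y, x - 1]
        return right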
[0152]
Subsequently, the two-dimensional image generation unit 178 calculates depths that satisfy the appropriate parallax. If the depth range is K1 to K2 and the depth value of each pixel is denoted Gxy, the depth map of FIG. 49 gives these values Gxy. Further, suppose the appropriate parallax of the display device owned by the user is J1 to J2. In this case, the depth value Gxy of each pixel of the depth map is converted as follows into a new depth value Fxy.
[0153]
Fxy = J1 + (Gxy − K1) × (J2 − J1)/(K2 − K1)
In the above example, with K1 = −4, K2 = 4, J1 = −3, and J2 = 2, the depth map of FIG. 49 is converted into the depth map of FIG. 51. That is, “4” is converted to “2”, “2” to “1”, and “−4” to “−3”. Intermediate values between K1 and K2 are converted to values between J1 and J2; for example, the second square 252 has Gxy = 2 and Fxy = 0.75. When Fxy is not an integer, rounding, or processing that reduces the near parallax, may be applied.
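A sketch of this linear conversion over a whole depth map follows; choosing floor as the "processing that reduces the near parallax" is one possible reading, and the function name is illustrative.

    import numpy as np

    def remap_depth(G, K1, K2, J1, J2):
        # Fxy = J1 + (Gxy - K1)(J2 - J1)/(K2 - K1), then made integer.
        F = J1 + (G - K1) * (J2 - J1) / (K2 - K1)
        return np.floor(F).astype(int)  # floor errs toward smaller near parallax

    # With K1=-4, K2=4, J1=-3, J2=2: 4 -> 2 and -4 -> -3; the intermediate
    # 2 -> 0.75 becomes 1 if rounded, 0 if floored as above.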
[0154]
The above conversion formula is an example of linear conversion; a weighting function F(Gxy) of Gxy may additionally be applied, and various nonlinear conversions are also conceivable. It is also possible to generate new left and right images from the original plane image 204 by shifting the objects in opposite directions. In the multi-view case, similar processing may be repeated to generate multi-viewpoint images.
[0155]
The above is the configuration and operation of the stereoscopic image processing apparatus 100 according to the embodiment. Although described as an apparatus, the stereoscopic image processing apparatus 100 may be a combination of hardware and software, or software alone. In that case it is convenient to make an arbitrary part of the stereoscopic image processing apparatus 100 into a library callable from various programs; the programmer can then skip programming that requires knowledge of stereoscopic display. For the user, the operations related to stereoscopic display, that is, the GUI, become common regardless of the software or content, and since the set information can be shared by other software, the trouble of resetting is saved.
[0156]
Note that sharing information among a plurality of programs is useful not only for processing related to stereoscopic display; the various programs can determine the state of an image by referring to that information. An example of information to be shared is the information acquired by the information acquisition unit 118 of the stereoscopic image processing apparatus 100 described above. This information may be held in a recording unit (not shown) or in the correction map holding unit 140.
[0157]
FIGS. 52 to 54 show examples in which the above-described stereoscopic image processing apparatus 100 is used as a library. FIG. 52 shows the usage of the stereoscopic display library 300. The stereoscopic display library 300 is referred to by function calls from a plurality of programs A 302, B 304, C 306, and so on. The parameter file 318 stores the user's appropriate parallax in addition to the information described above. The stereoscopic display library 300 is used by a plurality of apparatuses A 312, B 314, C 316, and so on, via an API (application program interface) 310.
[0158]
Examples of the program A 302 and the like include games, so-called Web3D three-dimensional applications, three-dimensional desktop screens, three-dimensional maps, parallax-image viewers for two-dimensional images, and viewers for images with depth information. Some games, of course, use coordinates differently, but the stereoscopic display library 300 can handle this as well.
[0159]
On the other hand, examples of the apparatus A 312 and the like include any stereoscopic display device that uses parallax, such as binocular or multi-lens parallax barrier systems, shutter glasses systems, and polarized glasses systems.
[0160]
FIG. 53 shows an example in which the stereoscopic display library 300 is incorporated in three-dimensional data software 402. The three-dimensional data software 402 includes a program main body 404, the stereoscopic display library 300 for realizing an appropriate parallax for it, and a photographing instruction processing unit 406. The program main body 404 communicates with the user via the user interface 410. Following a user instruction, the photographing instruction processing unit 406 virtually photographs a given scene during operation of the program main body 404. The photographed image is recorded in the image recording device 412 and also output to the stereoscopic display device 408.
[0161]
For example, assume that the three-dimensional data software 402 is game software. In that case the user can play the game while experiencing an appropriate stereoscopic effect through the stereoscopic display library 300. If the user wants to keep a record during the game, for example on achieving a complete victory in a fighting game, an instruction is given to the photographing instruction processing unit 406 via the user interface 410 and the scene is recorded. At this time, using the stereoscopic display library 300, a parallax image is generated so as to have an appropriate parallax when played back later on the stereoscopic display device 408, and this is recorded in an electronic album or the like in the image recording device 412. Since the recording is made as a two-dimensional parallax image, the three-dimensional data of the program main body 404 itself does not leak out, which also allows copyright protection to be considered.
[0162]
FIG. 54 shows an example in which the three-dimensional data software 402 of FIG. 53 is incorporated into a network-based system 430.
The game machine 432 is connected to the server 436 and the user terminal 434 via a network (not shown). The game machine 432 is for arcade games and includes a communication unit 442, the three-dimensional data software 402, and a stereoscopic display device 440 that displays the game locally. The three-dimensional data software 402 is that shown in FIG. 53. The parallax image displayed on the stereoscopic display device 440 from the three-dimensional data software 402 is set optimally for the stereoscopic display device 440 in advance. The parallax adjustment by the three-dimensional data software 402 is used when an image is transmitted to the user via the communication unit 442, as described later. Note that the device used for this transmission need only have the functions of adjusting parallax and generating stereoscopic images; it does not necessarily have to be capable of stereoscopic display itself.
[0163]
The user terminal 434 includes a communication unit 454, a viewer program 452 for viewing stereoscopic images, and a stereoscopic display device 450, of any size and type, for displaying stereoscopic images locally. The viewer program 452 incorporates the stereoscopic image processing apparatus 100.
[0164]
The server 436 includes a communication unit 460, an image holding unit 462 that records images virtually photographed by the user in connection with the game, and a user information holding unit 464 that records the user's appropriate parallax information, e-mail address, and other personal information in association with the user. The server 436 functions, for example, as an official game site and records scenes the users like during play, as well as moving or still images of famous matches. Stereoscopic display is possible for both moving and still images.
[0165]
In the above configuration, image photographing proceeds, for example, as follows. The user first performs stereoscopic display on the stereoscopic display device 450 of the user terminal 434, acquires the appropriate parallax through the function of the stereoscopic image processing apparatus 100, and notifies the server 436 of it via the communication unit 454, where it is stored in the user information holding unit 464. This appropriate parallax is a general-purpose description unrelated to the hardware of the stereoscopic display device 450 owned by the user.
[0166]
The user then plays a game on the game machine 432 at an arbitrary time. Meanwhile, the stereoscopic display device 440 performs stereoscopic display using the initially set parallax or a parallax manually adjusted by the user. When the user wishes to record an image during play or replay, the stereoscopic display library 300 built into the three-dimensional data software 402 of the game machine 432 acquires the user's appropriate parallax from the user information holding unit 464 of the server 436 via the two communication units 442 and 460, generates a parallax image accordingly, and stores the parallax image of the virtually photographed scene in the image holding unit 462, again via the two communication units 442 and 460. If the user later returns home and downloads the parallax image to the user terminal 434, stereoscopic display with the desired stereoscopic effect is obtained. In this case, too, the parallax can be adjusted manually by the stereoscopic image processing apparatus 100 included in the viewer program 452.
[0167]
As described above, according to this application example, the programming related to stereoscopic effect, which would otherwise have to be done for each display device and for each user, is concentrated in the stereoscopic image processing apparatus 100 and the stereoscopic display library 300, so programmers need not worry about any of the complicated requirements of stereoscopic display. This applies not only to game software but to any software that uses stereoscopic display, removing restrictions on the development of content and applications that use stereoscopic display; their spread can therefore be promoted dramatically.
[0168]
In particular, for games and other applications that inherently possess three-dimensional CG data, coding accurate stereoscopic display has been difficult, and in many cases the data were consequently not used for stereoscopic display at all. The stereoscopic image processing apparatus 100 and the stereoscopic display library 300 according to the embodiment remove such adverse effects and can enrich stereoscopic display applications.
[0169]
In FIG. 54, the user's appropriate parallax is registered in the server 436, but the user may instead bring an IC card or the like on which that information is recorded when using the game machine 432. Scores for the game and favorite images may also be recorded on this card.
[0170]
The present invention has been described based on the embodiments. The embodiments are exemplifications, and it will be understood by those skilled in the art that various modifications of the combinations of the constituent elements and processing steps are possible, and that such modifications are also within the scope of the present invention. Some examples follow.
[0171]
The first stereoscopic image processing apparatus 100 can perform highly accurate processing when three-dimensional data are input, but the three-dimensional data may also be reduced to an image with depth information, from which a parallax image is generated using the third stereoscopic image processing apparatus 100; in some cases this has a lower calculation cost. Similarly, when a plurality of viewpoint images is input, a depth map can be created by high-accuracy corresponding-point matching, and a parallax image may then be generated using the third stereoscopic image processing apparatus 100.
[0172]
In the first stereoscopic image processing apparatus 100, the temporary camera arrangement unit 130 is configured as part of the stereoscopic image processing apparatus 100, but it may instead be preprocessing performed outside the apparatus, because the processing up to the temporary placement of the cameras can be done without regard to the appropriate parallax. Similarly, any processing unit constituting the first, second, or third stereoscopic image processing apparatus 100 can be taken outside the apparatus, and those skilled in the art will understand that this degree of freedom in configuring the stereoscopic image processing apparatus 100 is permitted.
[0173]
In the embodiment, the case where the parallax control is performed in the horizontal direction has been described, but the same processing can also be performed in the vertical direction.
[0174]
A unit for enlarging character data may be provided for use during operation of the stereoscopic display library 300 or the stereoscopic image processing apparatus 100. For example, with a parallax image of two horizontal viewpoints, the horizontal resolution of the image visible to the user is halved, which can reduce the readability of characters; stretching characters to twice their width in the horizontal direction is then effective. When there is vertical parallax, it is likewise useful to stretch characters in the vertical direction.
[0175]
An “in-operation display unit” that puts characters or a mark such as “3D” into the displayed image during operation of the stereoscopic display library 300 or the stereoscopic image processing apparatus 100 may also be provided. The user can then tell whether the image is one whose parallax can be adjusted.
[0176]
A unit for switching between stereoscopic display and normal display may be provided. This unit includes a GUI, and when the user clicks a predetermined button the display switches from stereoscopic display to ordinary two-dimensional display, and vice versa.
[0177]
The information acquisition unit 118 need not acquire information only by user input; it may also acquire information automatically, for example by a function such as plug and play.
[0178]
In the embodiment, a method of deriving E and A is used, but a method that fixes these and derives other parameters is also possible; the choice of variables is free.
[0179]
Another expression method for stereoscopic display is now proposed. With ordinary planar image display, expression in the depth direction has limits in terms of realism, particularly where “an object passes through a certain interface”, and it is difficult for the observer to recognize that an interface actually separating spaces exists at the surface of the window. Therefore, as described below, a stereoscopic image display device can display an object three-dimensionally so that a real entity such as the screen or its frame coincides with an interface belonging to the object represented in the image; such display gives rise to a new expression method. In general, the display screen and its surrounding frame are visually recognized, so a display method that uses them like a window is conceivable, and for this it is necessary to be able to designate an interface between spaces, or a plate-like object, at that surface. In that case, the optical axis crossing position D is designated in the positional relationship shown in FIG. 18.
[0180]
In the positional relationship of the imaging system shown in FIG. 18, when the near limit parallax and the far limit parallax in the basic expression space T are P and Q, respectively, the relational expressions
E : S = P : A
E : (S + T) = Q : (T − A)
were obtained. Solving these relational expressions for E gives
E = PS/A
E = Q(S + T)/(T − A)
By selecting the smaller of these two values of E, a stereoscopic image whose parallax stays within the appropriate range is obtained.
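When D, and hence A, is fixed by the designated interface, the two expressions yield two candidate camera separations, and the smaller one keeps both limits satisfied. A sketch, with an illustrative function name:

    def camera_separation(P, Q, S, T, A):
        # E from the near-limit condition and from the far-limit condition;
        # the smaller value keeps the parallax within the appropriate range.
        E_near = P * S / A
        E_far = Q * (S + T) / (T - A)
        return min(E_near, E_far)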
[0181]
FIG. 55 shows a state in which an image composed from three-dimensional data is displayed on the display screen 400. In this image, one glass surface 401 of the aquarium 410 coincides with the display screen 400, and the fish 301 is represented as swimming in the aquarium 410. If processing is performed so that the space behind the display screen 400 is the far space and the space in front of it is the near space, the fish 301 normally appears to swim in the far space, as shown in FIG. 56. Occasionally, as shown in FIG. 57, an expression such as “the fish 301 breaks through the display screen 400 and appears in the near space” can be produced. Further, when the fish 301 passes through the display screen 400, an effect such as “spray splashes up around the point of passage and the interface regenerates afterward” can be added. As another example, an expression such as “since there is no water in the near space in front of the display screen, the fish 301 begins to suffocate after swimming there for a while, passes through the interface, that is, the display screen 400, again, and returns to the far space” is also possible.
[0182]
It is not always necessary to regenerate the broken interface when an object passes through it: the interface may remain broken afterward, or the interface may deform under the impact of the object without letting it pass, or only the shock may be transmitted, or an electric discharge may be applied as an effect on the image. Clearly, various expressions of the interaction between the interface and the object are possible. The interface may be a bare surface, or a plate-like object such as glass or a thin object such as paper may be placed there. Further, the interface need not coincide exactly with the display screen; it may be in its vicinity. Clearly, a planar image cannot convey such expression effects to the observer adequately. In particular, when the original data from which the stereoscopic image starts are three-dimensional data, editing for expressing effects like those described above is easy.
[0183]
Such an expression, in which an interface of the displayed object coincides with the display screen, can be generated by the method shown in FIG. 58. That is, the virtual aquarium 410 is placed in a three-dimensional space, and two images with parallax are generated from the two virtual cameras 430 and 440 placed to its left, with the optical axis crossing position of the two virtual cameras 430 and 440 made to coincide with one surface of the aquarium. Such an image can also be photographed as shown in FIG. 59: two cameras 430 and 440 are placed above an actual water tank 410 to photograph it, with their optical axis crossing position made to coincide with the water surface.
[0184]
FIG. 60 shows the configuration of a fourth stereoscopic image processing apparatus 100 for realizing the above processing. This stereoscopic image processing apparatus 100 is the first stereoscopic image processing apparatus 100 shown in FIG. 11 with an object designating unit 180 added to its stereoscopic effect adjusting unit 112. The object designating unit 180 performs processing for positioning the interface of an object designated by the user near, or exactly at, the display screen. Here the user is assumed to be a producer of stereoscopic images, performing this processing when producing or editing them; the user may, however, also be an observer.
[0185]
First, the processing procedure of the stereoscopic image processing apparatus 100 shown in FIG. 60 is described. The object designating unit 180 receives, from the user via a predetermined input device such as a mouse, the designation of the object whose surface should correspond to the optical axis crossing plane of the two virtual cameras 430 and 440, and notifies the parallax control unit 114. The parallax control unit 114, more specifically the camera arrangement determining unit 132, adjusts the cameras so that the designated surface of the object becomes the optical axis intersection plane of the two virtual cameras 430 and 440. The other operations may be the same as those of the stereoscopic image processing apparatus 100 shown in FIG. 11. Information indicating that it is to be displayed near the display screen is added to the object thus determined; at display time this information is read out as appropriate to determine the optical axis crossing distance D, and the inter-camera distance E is determined by the processing described above.
[0186]
In addition, another expression method is proposed. When a plurality of objects is to be displayed on the display screen, it is not always necessary to keep every object within the appropriate parallax; for effective display, some objects may, under certain conditions, for example for a certain period, be displayed outside the appropriate parallax condition. One method for this was described above: determining the basic expression space from static objects only. More generally, each object may be provided with information for discriminating whether it is to be expressed within the basic expression space (hereinafter also simply called “identification information”); an object to be expressed in the basic expression space is also called a “basic expression space calculation object”. The basic expression space may then be determined as needed on the basis of this identification information.
[0187]
If the identification information can be changed as needed, conditions for excluding an object from the appropriate parallax can be set flexibly. For example, if the identification information specifies the time during which the object is excluded from the appropriate parallax condition, the object can automatically return to the appropriate parallax range after the specified time has elapsed.
[0188]
A method of temporarily excluding some objects from the appropriate parallax condition when displaying them is described below. For example, in the first stereoscopic image processing apparatus 100 shown in FIG. 11, the camera arrangement determining unit 132 corrects the temporarily set camera parameters according to the appropriate parallax; its function may be extended as follows. That is, the camera arrangement determining unit 132 reads the identification information attached to each object and determines the camera parameters in a manner that reflects it.
[0189]
In addition, yet another expression method is proposed. When the front and back of the basic expression space, that is, the front projection plane that is the near limit and the rear projection plane that is the far limit, are determined by objects, an object can no longer be expressed as moving into the space in front of or behind those planes. FIG. 61 shows, for convenience, the depth direction, in particular the basic expression space T, of an image displayed by the fourth stereoscopic image processing apparatus 100. A front projection plane 310 is set on the left side of the drawing and a rear projection plane 312 on the right side, and the basic expression space T is defined between them. Within the basic expression space T, a house 350 is represented as a stationary object on the front projection plane 310 side, and a tree 370 on the rear projection plane 312 side. A bird 330, a dynamic object, moves forward through the space above these two stationary objects. While the bird 330 moves within the range of the basic expression space T, its movement can be expressed; however, once the bird 330 reaches the front projection plane 310 or the rear projection plane 312, it stays on that plane, fixed at the maximum parallax, and cannot be expressed as moving any further forward or backward in real space. If the object can nevertheless be expressed as if it keeps moving, the sense of reality of the object can be maintained.
[0190]
As described above, a process that removes a dynamic object from the scope of the basic expression space T is conceivable. However, unless a special effect of the kind described above is intended, such removal may make the user feel uncomfortable, and it is often preferable to express objects within the range of the basic expression space T.
[0191]
Therefore, as shown in FIG. 62, a region where no object exists is included in the basic expression space T. In FIG. 62, an empty space is provided, as part of the basic expression space T, in front of the stationary house 350 at the front, so that the bird 330, a dynamic object, can move in front of the house 350. FIG. 63 further provides an empty space, also as part of the basic expression space T, behind the stationary tree 370 placed at the rear. Thereby, even when the bird 330 moves forward from behind and passes the position corresponding to the front surface of the house 350, it remains within the range of the basic expression space T and is expressed with appropriate parallax, so the observing user feels no discomfort regarding the movement.
[0192]
Also, as shown in FIG. 64, a moving object 390 may be formed as the target for calculating parallax, in a form that includes not only the bird 330 but also the spaces in front of and behind it. When the foremost surface of the moving object 390 reaches the front projection plane 310, only the bird 330 is moved further. In that case, for example, by making the bird 330 move more slowly than its original speed, the moment at which the bird 330 reaches the front projection plane 310 and its subsequent movement can no longer be expressed can be delayed.
[0193]
As shown in FIG. 65, after the moving object 390 passes the front projection plane 310, the bird 330 may be moved within the space included in advance. As a result, the maximum parallax is determined by the moving object 390, and since the bird 330 only gradually approaches that maximum parallax, it can continue to move forward in real space. This can be realized by deciding whether to enable or disable the movement according to the position of the object, that is, the bird 330. The moving speed may be set to the originally assumed speed, a faster speed, or a slower speed; giving the speed this flexibility enables various expressions. For example, slowing the movement as the bird approaches the end of the moving object 390 expresses forward motion while preventing the amount of parallax from becoming excessively large in the front-rear direction.
[0194]
Also, if another object appears in front of or behind it, the maximum parallax then depends on that object, so the bird 330 is returned little by little to its original position within the moving object 390.
[0195]
Next, the principle of preventing a sudden change in parallax while changing the maximum parallax will be described with reference to FIGS. 17 and 18 described above. As mentioned above,
tan (φ / 2) = M tan (θ / 2) / L
E: S = P: A
P = 2 (S + A) tan (φ / 2)
From these equations, the parallax amount M on the near side of an object can be expressed as
M = LEA / (2S(A + S)tan(θ/2))
Here, when the object moves forward and the camera settings are left unchanged, A increases and S decreases, so the amount of parallax increases.
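As a rough numerical illustration of this relation, the following sketch evaluates M for a given camera setting (the function name and all numeric values are hypothetical, not taken from the embodiment):

```python
import math

def near_parallax(L, E, A, S, theta):
    # M = L*E*A / (2*S*(A+S)*tan(theta/2)); L and the result are in pixels
    return L * E * A / (2.0 * S * (A + S) * math.tan(theta / 2.0))

theta = math.radians(40)                              # assumed field angle
print(near_parallax(1920, 6.5, 10.0, 100.0, theta))   # before the object advances
print(near_parallax(1920, 6.5, 20.0, 90.0, theta))    # A larger, S smaller: M grows
```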
[0196]
When the object moves forward, M becomes M′, S becomes S′, and A becomes A′, so that
M′ = LEA′ / (2S′(A′ + S′)tan(θ/2))
with
M < M′
[0197]
If E and A′ are changed to E″ and A″ in the camera settings so that
M″ = LE″A″ / (2S′(A″ + S′)tan(θ/2))
satisfies the relationship
M < M″ < M′
then a sudden change in the amount of parallax can be prevented when stereoscopically displaying an object moving toward the observer. Only one of E and A′ may be changed; in that case, M″ is expressed as
M″ = LE″A′ / (2S′(A′ + S′)tan(θ/2))
or
M″ = LEA″ / (2S′(A″ + S′)tan(θ/2))
[0198]
To prevent a sudden change in the amount of parallax when the object moves toward the back, it suffices to satisfy the relationship
M > M″ > M′
[0199]
The same applies to the parallax amount N on the far side. With
N = LE(T − A) / (2(T + S)(A + S)tan(θ/2))
N′ and N″ are obtained similarly as
N′ = LE(T − A′) / (2(T + S′)(A′ + S′)tan(θ/2))
N″ = LE″(T − A″) / (2(T + S′)(A″ + S′)tan(θ/2))
Here, if the relationship
N > N″ > N′
is satisfied, a sudden change in the amount of parallax can be prevented for the movement of the object toward the observer, and if the relationship
N < N″ < N′
is satisfied, a sudden change in the amount of parallax can be prevented for the movement of the object toward the back.
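One possible way to satisfy these orderings is to move the camera interval only part of the way toward the value implied by the new object position. Since M″ scales linearly with E″ when only the interval is changed, an intermediate interval yields an intermediate parallax; a minimal sketch, in which the fractional weighting is an assumption:

```python
def eased_interval(E, M_old, M_new, weight=0.5):
    """Return E'' giving a near-side parallax between M_old and M_new.
    With A' fixed, M'' = (E''/E) * M_new, so aiming at a value between
    the two keeps M_old < M'' < M_new for an approaching object."""
    M_target = M_old + weight * (M_new - M_old)
    return E * M_target / M_new
```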
[0200]
The configuration of the stereoscopic image processing apparatus 100 that realizes the expression methods shown in FIGS. 61 to 65 will now be described. It can be realized by the first stereoscopic image processing apparatus 100 shown in FIG. 11; however, when correcting the temporarily set camera parameters according to the appropriate parallax, the camera arrangement determining unit 132 has the additional function of reading, from the original data, information on the range to be included in the basic expression space calculation and information on changes in the parallax amount of each object, and of reflecting them in the camera parameters. This information may be included in the original data itself, or may be held, for example, in the parallax information holding unit 120.
[0201]
In the embodiment, when the appropriate-parallax processing determines that the parallax is too large relative to the correct parallax state in which, for example, a sphere looks correct, the parallax of the stereoscopic image is processed to become smaller. The sphere then looks like a shape flattened in the depth direction, but the discomfort caused by such display is generally small: since people are usually accustomed to seeing flat images, they often feel no discomfort as long as the parallax lies between zero and the correct parallax.
[0202]
On the other hand, if the appropriate-parallax processing determines that the parallax of the stereoscopic image is too small relative to the parallax state in which the sphere looks correct, the parallax is processed to become larger. The sphere then looks like a shape swollen in the depth direction, and a person may feel considerable discomfort at such display.
[0203]
When a single object is displayed stereoscopically, the above phenomenon is particularly likely to cause discomfort, and the tendency is clearly recognized when displaying objects commonly seen in real life, such as buildings and vehicles. Therefore, to reduce the discomfort, correction needs to be applied to processing that increases the parallax.
[0204]
When a stereoscopic image is generated from three-dimensional data, the parallax can be adjusted relatively easily by changing the arrangement of the cameras. A procedure for correcting the parallax will be described based on FIGS. 66 to 71. This correction can be performed by any of the first to fourth stereoscopic image processing apparatuses 100 described above; here it is assumed that a stereoscopic image is generated from three-dimensional data by the first stereoscopic image processing apparatus 100 shown in FIG. 11. The correction processing can also be realized by the fifth and sixth stereoscopic image processing apparatuses 100 described later.
[0205]
FIG. 66 shows a state in which an observer views a stereoscopic image on the display screen 400 of a certain stereoscopic image display apparatus 100. The screen size of the display screen 400 is L, the distance between the display screen 400 and the observer is d, and the interocular distance is e. The near limit parallax M and the far limit parallax N have been obtained in advance by the stereoscopic effect adjusting unit 112, and the range between them is the appropriate parallax. Here, only the near limit parallax M is shown for ease of understanding, and the maximum pop-out amount m is determined from this value; the pop-out amount m is the distance from the display screen 400 to the near point. Note that the units of L, M, and N are pixels, so unlike the other parameters such as d, m, and e, a predetermined conversion formula is needed to express them in the same unit system.
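For reference, the pixel-to-length conversion alluded to here follows from similar triangles. A minimal sketch, in which the pixel pitch and the sample numbers are assumptions:

```python
def popout_amount(M_px, pitch, d, e):
    # on-screen parallax converted to length units
    p = M_px * pitch
    # near point: p / e = m / (d - m), hence m = d * p / (e + p)
    return d * p / (e + p)

print(popout_amount(26, 0.25, 600.0, 65.0))  # about 55, in the same units as d and e
```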
[0206]
At this time, in order to display the sphere 21, it is assumed that the camera arrangement is determined by the camera arrangement determination unit 132 of the parallax control unit 114 as shown in FIG. 67, with reference to the nearest and farthest placement points of the sphere 21. The optical axis crossing distance of the two cameras 22 and 24 is D, and the camera interval is Ec. To make the parameters easy to compare, the coordinate system is enlarged or reduced so that the field width of the cameras at the optical axis crossing distance matches the screen size L. In this example, the camera interval Ec equals the interocular distance e, and the optical axis crossing distance D is smaller than the observation distance d. Thus, in this system, the observer effectively views the scene from the camera positions of FIG. 67, as shown in FIG. 68. When the sphere 21 is observed on the stereoscopic image display apparatus 100 based on images generated by such a photographing system, the sphere 21 is seen extending in the depth direction over the entire appropriate parallax range, as shown in FIG. 69.
[0207]
A method for determining, using this principle, whether a stereoscopic image needs correction will be described below. FIG. 70 shows a state in which the nearest placement point of a sphere at distance A from the display screen 400 is photographed with the camera arrangement of FIG. 67. At this time, the maximum parallax M corresponding to the distance A is given by the two straight lines connecting each of the two cameras 22 and 24 with the point at distance A. Further, FIG. 71 shows the camera interval E1 necessary to obtain the parallax M of FIG. 70 when the optical axis crossing distance of the two cameras 22 and 24 is d. This can be regarded as a conversion in which all imaging-system parameters other than the camera interval coincide with the observation-system parameters. FIGS. 70 and 71 yield the following relationships.
M : A = Ec : (D − A)
M : A = E1 : (d − A)
Ec = E1(D − A) / (d − A)
E1 = Ec(d − A) / (D − A)
If E1 is larger than the interocular distance e, it is determined that correction is necessary so that the parallax becomes smaller. Since E1 may simply be set equal to the interocular distance e, Ec may be corrected as in the following equation.
Ec = e(D − A) / (d − A)
[0208]
The same applies to the farthest placement point. In FIGS. 72 and 73, when the distance between the nearest and farthest placement points of the sphere 21 is T, the depth of the basic expression space, the following hold.
N : (T − A) = Ec : (D + T − A)
N : (T − A) = E2 : (d + T − A)
Ec = E2(D + T − A) / (d + T − A)
E2 = Ec(d + T − A) / (D + T − A)
If E2 is larger than the interocular distance e, it is likewise determined that correction is necessary. Since E2 may be set equal to the interocular distance e, Ec may be corrected as in the following equation.
Ec = e(D + T − A) / (d + T − A)
[0209]
Eventually, if the smaller of the two values of Ec obtained from the nearest and farthest placement points is selected, the parallax becomes too large for neither the near nor the far placement. The cameras are set by converting the selected Ec back to the original coordinate system of the three-dimensional space.
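A small sketch of this check and correction, assuming the scaled coordinate system of the text with d > A and D > A:

```python
def corrected_camera_interval(Ec, e, D, d, A, T):
    """Clamp the camera interval so that the parallax is excessive at
    neither the nearest point (distance A in front of the screen) nor
    the farthest point (T - A behind it). D is the optical axis crossing
    distance, d the viewing distance, e the interocular distance."""
    near_bound = e * (D - A) / (d - A)
    far_bound = e * (D + T - A) / (d + T - A)
    return min(Ec, near_bound, far_bound)
```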
[0210]
More generally, the camera interval Ec may be set so as to satisfy the two expressions
Ec < e(D − A) / (d − A)
Ec < e(D + T − A) / (d + T − A)
In FIGS. 74 and 75, the upper limit of the camera interval Ec is the interval obtained when the two cameras are placed on the two optical axes K4, which connect the two cameras 22 and 24 placed at the observation distance d with the interocular distance e to the nearest point of the object, or on the two optical axes K5, which connect them to the farthest point. That is, the two cameras 22 and 24 may be arranged so as to be contained between whichever pair of optical axes is narrower, K4 in FIG. 74 or K5 in FIG. 75.
[0211]
Here, the correction was performed only on the camera interval, without changing the optical axis crossing distance. However, the optical axis crossing distance may be changed to change the position of the object, or both the camera interval and the optical axis crossing distance may be changed.
[0212]
Correction is also required when a depth map is used. Suppose the depth map values represent the shift amounts of points in pixels, and the initial values, generally the values described in the original data, realize optimal stereoscopic vision. Then the appropriate-parallax processing need not be applied when the range of depth map values would have to be enlarged; it should be applied only when the range must be reduced, that is, when the parallax needs to be made smaller.
[0213]
If the initial value of the parallax is set small, the maximum allowable value may be held in the header area of the image, and the appropriate-parallax processing may be performed so as to stay within that maximum. In these cases, hardware information regarding the appropriate distance is required, but higher-quality processing can be realized compared with the hardware-independent processing described above. The above processing can be used not only when the parallax is set automatically but also when it is set manually.
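As an illustration of the rule that the depth map range is only ever reduced, a sketch assuming zero-centred per-pixel shifts and a header-supplied limit:

```python
import numpy as np

def limit_depth_range(shift_map, max_allowed):
    # shift_map: per-pixel parallax shifts in pixels, centred on zero
    peak = float(np.abs(shift_map).max())
    if peak <= max_allowed:
        return shift_map              # enlarging the range is never done here
    return shift_map * (max_allowed / peak)
```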
[0214]
In addition, the limit of parallax at which an observer feels discomfort varies with the image. In general, in an image with little change in pattern or color and with conspicuous edges, crosstalk becomes noticeable when the parallax is increased; crosstalk is also conspicuous in an image with a large luminance difference on both sides of its edges. That is, when the image to be displayed stereoscopically, that is, the parallax image or viewpoint image, contains few high-frequency components, the user tends to feel discomfort when viewing it. Therefore, it is preferable to frequency-analyze the image by a technique such as the Fourier transform and to correct the appropriate parallax according to the distribution of frequency components obtained from the analysis. That is, for an image with a large amount of high-frequency components, correction is performed so that the parallax is larger than the appropriate parallax.
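One way to quantify the amount of high-frequency components is the share of spectral energy above a cutoff frequency; the cutoff and the scaling policy below are assumptions, since the text does not fix them:

```python
import numpy as np

def high_freq_ratio(gray, cutoff=0.25):
    """Share of 2-D FFT power beyond a normalized radial frequency.
    gray: 2-D float array (a grayscale image)."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    return float(power[r > cutoff].sum() / power.sum())

def frequency_adjusted_parallax(M, gray, low=0.05, shrink=0.7):
    # images poor in high frequencies are crosstalk-prone: reduce parallax
    return M * shrink if high_freq_ratio(gray) < low else M
```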
[0215]
Furthermore, crosstalk is conspicuous in images with little motion. Whether a file holds a moving image or a still image can often be determined by examining the file name extension. When it is determined to be a moving image, the state of motion may be detected by a known motion detection method such as motion vectors, and the appropriate parallax may be corrected according to that state. That is, for an image with little motion, correction is performed so that the parallax becomes smaller than the original parallax, while no correction is applied to an image with much motion; alternatively, when the motion is to be emphasized, correction may be made so that the parallax becomes larger than the original parallax. Correcting the appropriate parallax is only an example: any predetermined parallax range can be corrected, the depth map can be corrected, and the shift amount of the synthesis position of the parallax images can be corrected as well.
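A crude frame-difference measure can stand in for the motion-vector detection mentioned here; the thresholds and factors below are assumptions, and a real implementation would use proper motion estimation:

```python
import numpy as np

def motion_level(prev_frame, curr_frame):
    # mean absolute luminance difference as a rough motion measure
    return float(np.abs(curr_frame.astype(np.float64)
                        - prev_frame.astype(np.float64)).mean())

def motion_adjusted_parallax(M, motion, still_thresh=2.0,
                             shrink=0.8, emphasize=None):
    if motion < still_thresh:
        return M * shrink                     # near-still: crosstalk-prone
    return M * emphasize if emphasize else M  # optionally emphasize motion
```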
[0216]
Further, these analysis results may be recorded in the header area of the file, and the stereoscopic image processing apparatus may read the header and use it when displaying the stereoscopic image from the next time.
[0217]
In addition, the amount of high-frequency components and the distribution of motion may be ranked by the creator or a user of the image through actual stereoscopic viewing, or a ranking method may be used in which a plurality of evaluators view the image stereoscopically and the average of their rankings is adopted.
[0218]
In addition, the appropriate parallax need not be observed strictly at all times, and the camera parameters need not always be recalculated: they may be recalculated only at regular time intervals or at scene changes. This is particularly effective for an apparatus with low processing capability. For example, when camera parameters are calculated at regular intervals while generating a stereoscopic image from three-dimensional data, the parallax control unit 114 of the first stereoscopic image processing apparatus 100 may use an internal timer to instruct the camera arrangement determination unit 132 to recalculate the camera parameters at regular intervals. As this internal timer, the reference clock of the CPU that performs the arithmetic processing of the stereoscopic image processing apparatus 100 may be used, or a dedicated timer may be provided separately.
[0219]
FIG. 76 shows the configuration of the fifth stereoscopic image processing apparatus 100, which calculates the appropriate parallax according to the state of the image. It newly provides an image determination unit 190 in the first stereoscopic image processing apparatus 100 shown in FIG. 11; since the other configurations and operations are the same, the differences will mainly be described. The image determination unit 190 includes a frequency component detection unit 192, which analyzes the frequency components of the image to determine the amount of high-frequency components and notifies the parallax control unit 114 of the parallax suited to the image, and a scene determination unit 194, which, when the original data is a moving image, notifies the parallax control unit 114 of the timing for calculating the camera parameters by detecting scene changes or motion in the image. Scene changes may be detected using a known method.
[0220]
When the original data is a moving image, the processing load of the frequency component detection unit 192 increases if the appropriate parallax is adjusted constantly according to the amount of high-frequency components in the image, and there is a concern that the cost of the stereoscopic image processing apparatus 100 may rise if an arithmetic processing device matched to that load is used. As described above, the appropriate parallax need not be observed strictly, so the frequency components of the image may be analyzed only when the image changes greatly, such as at a scene change, based on the detection result of the scene determination unit 194. The processing load of the image determination unit 190 can thereby be reduced.
[0221]
When a plurality of virtual cameras are arranged in a three-dimensional space and a parallax image is generated for each of them, regions containing no object information may occur in the parallax images. In the following, the principle by which such regions arise, and a method for eliminating them, are described, taking as an example the case where a stereoscopic image is generated from three-dimensional data. FIG. 77 shows the relationship between the temporary camera position S (Xs, Ys, Zs), the angle of view θ, and the first to third objects 700, 702, and 704 set by the producer who creates the three-dimensional data.
[0222]
The temporary camera position S (Xs, Ys, Zs) becomes the center of the group of virtual cameras when the respective parallax images are generated based on the plurality of virtual cameras (hereinafter also referred to as the camera group center position S). The first object 700 serves as the background. Here, the creator sets the angle of view θ and the camera group center position S so that the second and third objects 702 and 704 fall within the angle of view θ and so that object information of the first object 700, which is the background image, exists throughout the angle of view θ.
[0223]
Next, a predetermined program determines the optical axis crossing position A (Xa, Ya, Za), which is the reference for the near and far positions, so that the desired parallax is obtained as shown in FIG. 78, together with the parameters of the two virtual cameras 722 and 724, specifically the camera positions and the respective optical axes. At this time, if the angle of view θ is kept equal to the previously determined value, then at the camera positions of the two virtual cameras 722 and 724, depending for example on the size of the first object serving as the background image, first and second object-zero areas 740 and 742 in which no object information exists are generated, as shown in FIG. 78.
[0224]
The first object-zero area 740 corresponds to an angle α and the second object-zero area 742 to an angle β, and no object information exists within these angular ranges. Therefore, as shown in FIG. 79, the angle of view θ may be adjusted so that α and β are eliminated; the larger of α and β is the value to be removed from the angle of view θ. To keep the optical axis directions unchanged, the same amount must be removed from both the left and the right of the angle of view θ, so the new angle of view θ1 is obtained as θ1 = θ − 2α or θ1 = θ − 2β. However, since α and β cannot always be obtained directly from the parallax images, the angle of view θ may instead be narrowed little by little, confirming each time whether any area without object information remains in the parallax images. The presence or absence of such an area can be confirmed in practice by whether data to be input exists for every pixel on the display screen. Moreover, the adjustment for making object information exist at all pixels is not limited to the angle of view θ; the camera interval E and the optical axis crossing position A may also be changed.
[0225]
FIG. 80 is a flowchart showing the angle-of-view adjustment processing. This processing can be realized by the first stereoscopic image processing apparatus 100 shown in FIG. 11. First, when original data serving as the starting point of a stereoscopic image is input to the apparatus, the temporary camera placement unit 130 determines the camera group center position S (S110). Subsequently, the camera arrangement determining unit 132 determines the camera angle of view θ based on the camera group center position S (S112), determines the camera interval E (S114), and determines the optical axis crossing position A of the virtual cameras (S116). Further, the camera arrangement determining unit 132 performs coordinate conversion processing on the original data based on the camera interval E and the optical axis crossing position A (S118), and determines whether object information exists at all pixels on the display screen (S120).
[0226]
If any pixel lacks object information (N in S120), a correction that slightly narrows the angle of view θ is made (S122), the process returns to S114, and the processing from S114 to S120 is repeated until object information exists at all pixels. However, when the adjustment is made only by correcting the angle of view θ, the determination of the camera interval E in S114 and of the optical axis crossing position A in S116 may be skipped. When object information exists at all pixels (Y in S120), the angle-of-view adjustment processing ends.
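The inner loop of this flowchart can be sketched as follows, where render_views and all_pixels_covered are hypothetical stand-ins for the S114 to S118 processing and the per-pixel check of S120, and the shrink factor is an assumption:

```python
def adjust_view_angle(theta, render_views, all_pixels_covered,
                      shrink=0.995, min_theta=1e-3):
    """Narrow theta slightly (S122) and re-run the S114..S118 steps
    until every display pixel receives object information (S120)."""
    views = render_views(theta)            # S114-S118 for this theta
    while not all_pixels_covered(views):
        theta *= shrink                    # S122: slightly narrow the angle
        if theta < min_theta:
            break                          # give up rather than loop forever
        views = render_views(theta)
    return theta, views
```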
[0227]
In the above-described embodiments, stereoscopic images generated mainly from three-dimensional data have been described. Hereinafter, a method for expressing a stereoscopic image from real (live-action) images will be described. The difference from starting with three-dimensional data is that, with a real image, the depth T of the basic expression space is not given by the data as a modeling concept; instead, it can be understood as the depth range T within which display with appropriate parallax is possible.
[0228]
As shown in FIGS. 17 and 18, six kinds of parameters are required in the camera setting for generating a stereoscopic image: the camera interval E; the distance A from the front projection plane 30, which is the front of the basic expression space, to the optical axis crossing plane; the angle of view θ; the distance S from the camera placement plane, that is, the viewpoint plane 208, to the front projection plane 30; the distance D of the optical axis crossing plane 210 from the viewpoint plane 208; and the depth range T. Among these, the following relational expressions hold.
E = 2 (S + A) tan (θ / 2) · (SM + SN + TN) / (LT)
A = STM / (SM + SN + TN)
D = S + A
Therefore, if three of the six parameters E, A, θ, S, D, and T are designated, the remaining ones can be calculated. In general, which parameters to designate is free, but in the embodiment described above θ, S, and T are designated and E, A, and D are calculated. If θ or S is changed automatically, the magnification changes, and an expression intended by the programmer or the photographer may be lost, so automatically determining these is often undesirable. T can be regarded as a parameter expressing the limits of the expression range and is preferably determined in advance. With three-dimensional data, the cost of changing any of the parameters is about the same; with live action, however, it differs: depending on the camera structure, the price varies greatly and the operability changes as well, so it is desirable to change which parameters are designated according to the application.
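Written out, the forward computation from the designated θ, S, and T, together with the appropriate-parallax limits M and N and the screen width L (all three in pixels), is a direct transcription of the relational expressions:

```python
import math

def camera_setting_from_tst(theta, S, T, M, N, L):
    # A = S*T*M / (S*M + S*N + T*N)
    A = S * T * M / (S * M + S * N + T * N)
    # E = 2*(S + A)*tan(theta/2) * (S*M + S*N + T*N) / (L*T)
    E = 2.0 * (S + A) * math.tan(theta / 2.0) * (S * M + S * N + T * N) / (L * T)
    D = S + A
    return E, A, D
```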
[0229]
FIG. 81 shows the relationship between a stereoscopic photograph photographing apparatus 510, which takes stereoscopic photographs at amusement facilities, photo studios, and the like, and a subject 552. The stereoscopic photographing apparatus 510 includes a camera 550 and the stereoscopic image processing apparatus 100. Here the shooting environment is fixed: the position of the camera 550 and the position of the subject 552 are determined in advance, and θ, S, and T are given as parameters. This imaging system corresponds to the example shown in FIG. 18 with an actual camera 550 substituted. Two lenses 522 and 524 are provided in the single camera 550, so the camera 550 alone can capture the two parallax images that serve as the base of a stereoscopic image.
[0230]
FIG. 82 shows the configuration of the sixth stereoscopic image processing apparatus 100, which performs this processing. This apparatus is obtained by replacing the parallax amount detection unit 150 of the stereoscopic image processing apparatus 100 shown in FIG. 12 with a camera control unit 151. The camera control unit 151 includes a lens interval adjustment unit 153 and an optical axis adjustment unit 155.
[0231]
The lens interval adjustment unit 153 adjusts the camera interval E, more precisely the lens interval E, by adjusting the positions of the two lenses 522 and 524. The optical axis adjustment unit 155 adjusts D by changing the optical axis directions of the two lenses 522 and 524. The subject 552 inputs the appropriate parallax information of the stereoscopic image display device kept at home or elsewhere, through a portable recording medium such as a memory card or through communication means such as the Internet. The information acquisition unit 118 receives the input appropriate parallax and notifies the camera control unit 151. Upon notification, the camera control unit 151 calculates E, A, and D and adjusts the lenses 522 and 524 so that the camera 550 shoots with appropriate parallax. This is realized because the stereoscopic display device on which the subject will be viewed and the stereoscopic photographing apparatus 510 share processing through the library.
[0232]
If the subject is to be positioned at the display screen during display, D and A may also be determined, and the subject photographed while positioned at the distance D. In this case, E may be calculated separately for the near and far appropriate parallax limits, and the smaller value selected. T may be larger than the range occupied by the subject; if there is a background, T may be determined so as to include the background.
[0233]
The appropriate parallax information need not necessarily be checked on the stereoscopic image display device owned by the user who is the subject. For example, a desired stereoscopic effect may be selected on a representative stereoscopic image display device at the shooting site; this selection can be made through the stereoscopic effect adjusting unit 112. Alternatively, the user may simply choose among items such as "on screen / distant / close" and "stereoscopic effect: large / medium / small", and camera parameters held correspondingly in advance in the parallax information holding unit 120 may be used. The optical axis crossing position may be changed by a mechanical structure, or it may be realized by changing the range used as the image on a high-resolution CCD (Charge Coupled Device); the function of the position shift unit 160 may be used for this processing.
[0234]
FIG. 83 shows a state in which a camera 550 that can be moved into places people cannot enter is installed, the camera 550 is operated remotely using a controller 519, and the captured image is observed on a stereoscopic image display device 511. The stereoscopic image display device 511 incorporates the stereoscopic image processing apparatus 100 having the configuration shown in FIG. 82.
[0235]
The camera 550 has a mechanism that can automatically adjust the lens interval E, and it also has an optical or electronic zoom function, which determines θ. The amount of parallax changes with this zoom operation: in general, the farther away the subject is photographed, the smaller the angle formed between the optical axes from the two viewpoints, so at a given lens interval E the parallax becomes small and the stereoscopic effect poor. It is therefore necessary to change camera settings such as the lens interval E and the zoom amount appropriately. Here, camera settings in such cases are controlled automatically, which greatly reduces the complexity of camera setting. The controller 519 may also be used to adjust camera settings.
[0236]
When the operator first operates the optical or electronic zoom using the controller 519, θ is determined. Next, the camera 550 is moved so that the subject to be photographed is displayed at the center of the stereoscopic display device 511. The camera 550 focuses on the subject by its autofocus function and simultaneously acquires the distance to it; in the initial state, this distance is taken as D. That is, the camera 550 is automatically set so that the subject appears to be positioned near the display screen. The range of T can be changed manually, and the operator designates in advance the depth-direction distribution over which the front-to-back relationships of objects are to be captured. Thus θ, D, and T are determined, whereupon E, A, and S are determined from the three relational expressions shown above, and the camera 550 is adjusted appropriately and automatically. In this example, since S is determined afterwards, it is uncertain which range T will finally cover, so T should be set somewhat large.
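When θ, D, and T are the three designated values, eliminating A = D − S from the relation for A leaves a quadratic in S, so the remaining values also follow in closed form. A sketch, with the derivation being ours but built only from the same three relational expressions:

```python
import math

def camera_setting_from_tdt(theta, D, T, M, N, L):
    """Solve S from S**2 + (T - D)*S - D*T*N/(M + N) = 0 (positive root),
    then recover A and E from the relational expressions in the text."""
    S = ((D - T) + math.sqrt((T - D) ** 2 + 4.0 * D * T * N / (M + N))) / 2.0
    A = D - S
    E = 2.0 * (S + A) * math.tan(theta / 2.0) * (S * M + S * N + T * N) / (L * T)
    return E, A, S
```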
[0237]
If the subject is to be displayed at the edge of the screen, the subject is first displayed at the center, a predetermined button is pressed to fix the focus and D, and the orientation of the camera 550 is then changed. If the focus and D can also be changed manually, the depth position of the subject can be changed freely.
[0238]
FIG. 84 shows an example of shooting with the stereoscopic image photographing apparatus 510, which includes the sixth stereoscopic image processing apparatus 100 shown in FIG. 82. The appropriate parallax of the stereoscopic display device owned by the photographer is input to the camera 550 in advance, through a recording medium such as a portable memory or through communication means such as the Internet. Here, a camera of simple structure, available at a relatively low price, is assumed as the camera 550: the camera interval E, the optical axis crossing distance D, and the angle of view θ are fixed, and A, S, and T are determined from the three relational expressions shown above. Since the appropriate range of distances to the subject can be calculated from these values, the distance to the subject can be measured in real time, and whether it is appropriate can be reported to the photographer by a message, a lamp color, or the like. The distance to the subject may be acquired by a known technique such as the distance measurement of an autofocus function.
[0239]
As described above, the combination of which camera parameter is used as a variable or constant is free, and there are various forms according to the application. In addition to the above, the camera 550 may be attached to various devices such as a microscope, a medical endoscope, and a portable terminal.
[0240]
Note that if the parallax is optimized for a specific stereoscopic display device, stereoscopic viewing may be difficult with other stereoscopic display devices. However, in general, the performance of the device is improved, and it is rare that the parallax is too large for the next stereoscopic display device to be purchased. Rather, it is important to perform the adjustment described above in order to avoid a risk that stereoscopic viewing becomes difficult regardless of the performance of the stereoscopic display device due to inadequate setting of the photographing device. Here, the stereoscopic display device includes a stereoscopic image processing device for realizing stereoscopic viewing.
[0241]
The appropriate parallax obtained by the stereoscopic effect adjusting unit 112 of the first to sixth stereoscopic image processing apparatuses 100 is a parameter determined by the user while actually viewing stereoscopically on a specific stereoscopic image processing apparatus 100, and that appropriate parallax is maintained in the apparatus thereafter. This stereoscopic effect adjustment reflects two factors: the "image separation performance" specific to the stereoscopic display device and the "physiological limit" specific to the observer. Image separation performance is an objective factor representing the ability to separate the multiple viewpoint images. On a stereoscopic display device with low separation performance, crosstalk is easily perceived even at small parallax, and when many observers perform the adjustment, the range of appropriate parallax is narrow on average; conversely, when the image separation performance is high, crosstalk is rarely perceived even when a large parallax is applied, and the range of appropriate parallax tends to be wide on average. The physiological limit, by contrast, is a subjective factor: even when the image separation performance is very high and the images are completely separated, the parallax range that causes no discomfort differs from observer to observer, and this appears as variation in the appropriate parallax on the same stereoscopic image processing apparatus 100.
[0242]
The image separation performance, also called the degree of separation, can be determined by measuring the illuminance of a reference image 572 while moving an illuminometer 570 in the horizontal direction at the optimum observation distance, as shown in FIG. 85. In the two-view case, for example, all white is displayed in the left-eye image and all black in the right-eye image. If the images were completely separated, the illuminance at the positions where the right-eye image is seen would be zero, so the image separation performance can be obtained by measuring the degree to which the white of the left-eye image leaks there. The graph at the right end shows an example of the measurement results. Since this measurement is almost equivalent to measuring the density of moire, the image separation performance can also be measured by capturing a moire image at a distance where the moire is observed, as shown in FIG. 86, and analyzing its density.
[0243]
Image separation performance can be measured in the same manner, by measuring leakage light, even for glasses-type stereoscopic display devices and the like. In practice, the value measured with both left and right images all black may be taken into account as the background. The image separation performance may also be determined as the average of ranking evaluations by a large number of observers.
[0244]
As described above, the image separation performance of a stereoscopic display device can be given an objective criterion such as a numerical rank. For example, if the rank and the appropriate parallax of one stereoscopic display device 450 are known, the appropriate parallax can be converted to match the rank of another stereoscopic display device 440. In addition, a stereoscopic display device has parameters that are eigenvalues, such as the screen size, the pixel pitch, and the optimum observation distance, and information on these parameters is also used in converting the appropriate parallax.
[0245]
Hereinafter, an example of converting the appropriate parallax will be described parameter by parameter, with reference to FIGS. 87 and 88. Here the appropriate parallax is assumed to be held as N/L and M/L, where M is the near limit parallax, N is the far limit parallax, and L is the screen size. Expressing the values as ratios makes differences in pixel pitch between stereoscopic display devices ignorable; accordingly, in the figures used below the pixel pitches are assumed to be equal for ease of explanation.
[0246]
First, the conversion for a difference in screen size will be described. As shown in FIG. 87, it is preferable to process the image so that the absolute value of the parallax does not change regardless of the screen size; that is, the stereoscopic expression range in the front-rear direction is kept the same. Suppose the screen size becomes a times larger, going from the state shown at the top of the figure to the state shown at the bottom. Converting N/L to N/(aL) and M/L to M/(aL) then realizes appropriate parallax even when the screen sizes differ. The figure shows the nearest placement point as an example.
[0247]
Next, the conversion for a difference in observation distance will be described. As shown in FIG. 88, when the optimum observation distance d becomes b times longer, it is preferable to make the absolute value of the parallax b times larger as well; that is, the parallax angle seen by the eyes is kept constant. Converting N/L to bN/L and M/L to bM/L therefore realizes appropriate parallax even when the optimum observation distances differ. This figure, too, shows the nearest placement point as an example.
[0248]
Finally, taking the factor of image separation performance into account will be described. Let the rank r of image separation performance be an integer of 0 or more, with 0 meaning performance so poor that no parallax can be applied. When the image separation performance of the first stereoscopic display device is r0 and that of the second stereoscopic display device is r1, set c = r1/r0, and convert N/L to cN/L and M/L to cM/L. Appropriate parallax can thereby be realized even on stereoscopic display devices with different image separation performance. The formula for deriving c shown here is an example, and c may be derived from other formulas.
[0249]
When all of the above conversions are performed, N/L is eventually converted to bcN/(aL) and M/L to bcM/(aL). This conversion is applicable to both horizontal and vertical parallax. The conversions of appropriate parallax described above can be realized by the configurations shown in FIGS. 52, 53, and 54.
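The whole conversion chain fits in a few lines; the rank-ratio formula for c is the example given in the text:

```python
def convert_appropriate_parallax(NL, ML, a, b, r0, r1):
    """Carry the stored ratios N/L and M/L from one display to another:
    screen size a times larger, optimum viewing distance b times longer,
    separation ranks r0 (source) and r1 (target), with c = r1 / r0."""
    c = r1 / r0
    return b * c * NL / a, b * c * ML / a

# e.g. twice the screen size, same distance, equal separation ranks:
# both ratios are simply halved.
print(convert_appropriate_parallax(0.02, 0.03, a=2.0, b=1.0, r0=3, r1=3))
```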
[0250]
The front and back surfaces of the basic expression space may also be determined using a Z buffer. The Z buffer is a hidden-surface-removal technique with which a depth map of the object group as seen from the camera is obtained, and the minimum and maximum of these Z values may be used as the positions of the frontmost and rearmost surfaces. As processing, a step of obtaining the Z values from the position of the virtual camera is added; since this step does not require the final resolution, the processing time can be shortened by reducing the number of pixels. With this method, hidden portions are ignored, so the appropriate parallax range can be used effectively, and cases with multiple objects are also easy to handle.
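A sketch of this use of the Z buffer; the clear value marking pixels where no object was drawn, and the downsampling step, are assumptions:

```python
import numpy as np

def expression_space_from_zbuffer(zbuf, clear_value):
    """Front/back of the basic expression space from a Z buffer rendered
    from the virtual camera. Pixels still holding the clear value saw no
    object and are ignored; hidden surfaces never enter the buffer.
    Assumes at least one object pixel is present."""
    small = zbuf[::4, ::4]                # coarse sampling keeps this cheap
    visible = small[small != clear_value]
    return float(visible.min()), float(visible.max())
```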
[0251]
Further, when generating a stereoscopic image from three-dimensional data, the parallax control unit 114 may control the parameters related to camera arrangement that are set for generating the parallax images so that, when they are changed, their variation stays within a predetermined threshold. Likewise, when generating a stereoscopic moving image from a two-dimensional moving image to which depth information has been given, the parallax control unit 114 may control the variation of the maximum or minimum value contained in the depth information, as the two-dimensional moving image progresses, so that it stays within a previously provided threshold. The thresholds used for these controls may be held in the parallax information holding unit 120.
[0252]
When a stereoscopic image is generated from three-dimensional data and the basic expression space is determined from the objects present in the field of view, rapid movement, frame-in, or frame-out of an object may change the size of the basic expression space abruptly, and the parameters related to camera arrangement may fluctuate greatly as a result. When this fluctuation is larger than a predetermined threshold, it may be permitted only up to the threshold. A similar inconvenience is conceivable when a stereoscopic image is generated from a two-dimensional moving image with depth information, if the maximum or minimum amount of parallax is determined from the maximum or minimum depth; a threshold may likewise be imposed on this variation.
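Clamping the frame-to-frame variation to a threshold covers both cases; a minimal sketch:

```python
def clamp_step(current, previous, threshold):
    """Limit the per-frame change of a camera-arrangement parameter
    (or a depth extreme) to within +/- threshold."""
    step = current - previous
    if step > threshold:
        step = threshold
    elif step < -threshold:
        step = -threshold
    return previous + step
```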
[0253]
[Effects of the invention]
The present invention has the following effects.
1. A stereoscopic image that readily suits human physiology can be generated or displayed.
2. Even when the image to be displayed changes, a stereoscopic image appropriate for the user can be generated or displayed.
3. The stereoscopic effect of a stereoscopic display can be adjusted with a simple operation.
4. The burden on the programmer in creating content or applications capable of appropriate stereoscopic display can be reduced.
5. The user's effort to optimize the stereoscopic display is reduced.
6. Stereoscopic-effect adjustment and head tracking information, which are not covered by the ordinary plug-and-play function, can be easily realized even for devices that in principle cannot be plug-and-play, such as a parallax barrier attached afterwards.
[Brief description of the drawings]
FIG. 1 is a diagram illustrating a positional relationship among a user, a screen, and a playback object 14 that are ideally viewed stereoscopically.
FIG. 2 is a diagram illustrating an example of an imaging system that realizes the state of FIG. 1;
FIG. 3 is a diagram illustrating another example of an imaging system that realizes the state of FIG. 1;
FIG. 4 is a diagram illustrating another example of an imaging system that realizes the state of FIG. 1.
FIG. 5 is a diagram illustrating a model coordinate system used in the first stereoscopic image processing apparatus.
FIG. 6 is a diagram showing a world coordinate system used for the first stereoscopic image processing apparatus.
FIG. 7 is a diagram illustrating a camera coordinate system used for the first stereoscopic image processing apparatus.
FIG. 8 is a diagram showing a view volume used for the first stereoscopic image processing apparatus.
FIG. 9 is a diagram showing a coordinate system after perspective transformation of the view volume of FIG. 8.
FIG. 10 is a diagram illustrating a screen coordinate system used in the first stereoscopic image processing apparatus.
FIG. 11 is a configuration diagram of a first stereoscopic image processing apparatus.
FIG. 12 is a configuration diagram of a second stereoscopic image processing apparatus.
FIG. 13 is a configuration diagram of a third stereoscopic image processing apparatus.
FIGS. 14A and 14B are diagrams illustrating a left-eye image and a right-eye image displayed by the stereoscopic effect adjustment unit of the first stereoscopic image processing apparatus, respectively.
FIG. 15 is a diagram illustrating a plurality of objects having different parallaxes displayed by the stereoscopic effect adjusting unit of the first stereoscopic image processing apparatus.
FIG. 16 is a diagram illustrating an object whose parallax changes, which is displayed by the stereoscopic effect adjusting unit of the first stereoscopic image processing apparatus.
FIG. 17 is a diagram illustrating a relationship between a camera angle of view, an image size, and parallax when appropriate parallax is realized.
FIG. 18 is a diagram illustrating a positional relationship of the imaging system that realizes the state of FIG. 17.
FIG. 19 is a diagram illustrating a positional relationship of an imaging system that realizes the state of FIG. 17;
FIG. 20 is a diagram illustrating a camera arrangement when a multi-viewpoint image is generated with appropriate parallax.
FIG. 21 is a diagram illustrating a parallax correction map used by the distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 22 is a diagram showing a camera viewpoint when generating a parallax image in accordance with the parallax correction map of FIG. 21.
FIG. 23 is a diagram showing another camera viewpoint when generating a parallax image in accordance with the parallax correction map of FIG. 21.
FIG. 24 is a diagram illustrating a parallax correction map used by the distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 25 is a diagram showing a camera viewpoint when generating a parallax image according to the parallax correction map of FIG. 24;
FIG. 26 is a diagram illustrating a sense of distance correction map used by the distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 27 is a diagram showing a camera viewpoint when generating a parallax image according to the sense of distance correction map of FIG. 26.
FIG. 28 is a diagram showing another sense of distance correction map used by the distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 29 is a diagram illustrating a camera viewpoint when generating a parallax image according to the sense of distance correction map of FIG. 28.
FIGS. 30(a) to 30(f) are top views of parallax distributions in three-dimensional space obtained as a result of processing by the distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 31 is a diagram illustrating a principle of processing by a distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 32 is a diagram specifically illustrating the process of FIG. 31.
FIG. 33 is a diagram specifically showing the processing of FIG. 31.
FIG. 34 is a diagram specifically illustrating the processing of FIG. 31.
FIG. 35 is a diagram illustrating another example of processing by the distortion processing unit of the first stereoscopic image processing apparatus.
FIG. 36 is a diagram specifically illustrating the processing of FIG. 35.
FIG. 37 is a diagram showing a depth map.
FIG. 38 is a diagram illustrating an example of processing by a distortion processing unit of the third stereoscopic image processing apparatus.
FIG. 39 is a diagram illustrating a depth map generated by processing by the distortion processing unit of the third stereoscopic image processing apparatus.
FIG. 40 is a diagram illustrating another example of processing by the distortion processing unit of the third stereoscopic image processing apparatus.
FIG. 41 is a diagram illustrating an example of processing by a two-dimensional image generation unit of the second stereoscopic image processing apparatus.
FIG. 42 is a diagram illustrating an example of a parallax image.
FIG. 43 is a diagram illustrating a parallax image in which the synthesis position is shifted by the two-dimensional image generation unit of the second stereoscopic image processing apparatus.
FIG. 44 is a diagram illustrating processing of an image edge adjustment unit of the second stereoscopic image processing apparatus.
FIG. 45 is a diagram illustrating processing of the second stereoscopic image processing apparatus.
FIG. 46 is a diagram illustrating another process of the second stereoscopic image processing apparatus.
FIG. 47 is a diagram illustrating another process of the second stereoscopic image processing apparatus.
FIG. 48 is a diagram illustrating a planar image to which a depth map is added.
FIG. 49 is a diagram showing a depth map.
FIG. 50 is a diagram illustrating a state in which a parallax image is generated based on a depth map by a two-dimensional image generation unit of a second stereoscopic image processing apparatus.
FIG. 51 is a diagram illustrating a depth map corrected by the two-dimensional image generation unit of the second stereoscopic image processing apparatus.
FIG. 52 is a diagram illustrating a manner in which the stereoscopic image processing apparatus according to the embodiment is used as a library.
FIG. 53 is a configuration diagram in which a stereoscopic display library is incorporated into three-dimensional data software.
FIG. 54 is a diagram illustrating a manner in which a stereoscopic display library is used in a network-based system.
FIG. 55 is a diagram showing a state in which an image composed of three-dimensional data is displayed on the display screen.
FIG. 56 is a diagram showing another state in which an image composed of three-dimensional data is displayed on the display screen.
FIG. 57 is a diagram showing another state in which an image composed of three-dimensional data is displayed on the display screen.
FIG. 58 is a diagram showing a method for matching the interface of an object to be displayed with the display screen.
FIG. 59 is a diagram showing another state in which an image is taken with the optical axis crossing positions of two virtual cameras aligned with one surface of the aquarium.
FIG. 60 is a block diagram of a fourth stereoscopic image processing apparatus.
FIG. 61 is a diagram illustrating a convenient basic expression space T regarding an image displayed by the fourth stereoscopic image processing apparatus.
FIG. 62 is a diagram representing a region where no object exists by including it in the basic representation space T.
FIG. 63 is a diagram representing a region where no object exists by including it in the basic representation space T.
FIG. 64 is a diagram illustrating a state in which a moving object is formed so as to include not only a bird but also a space before and after it as a target for calculating parallax;
FIG. 65 is a diagram illustrating a state in which a bird 330 is moved in a previously included space after the moving object has passed the front projection plane.
FIG. 66 is a diagram showing a state where an observer observes a stereoscopic image on the display screen.
FIG. 67 is a diagram illustrating a camera arrangement determined by a camera arrangement determining unit.
FIG. 68 is a diagram illustrating a state in which an observer observes a parallax image obtained with the camera arrangement in FIG. 67.
FIG. 69 is a diagram illustrating a state in which an observer at the position illustrated in FIG. 66 observes the display screen of an image in which appropriate parallax is obtained with the camera arrangement of FIG. 67.
FIG. 70 is a diagram showing a state in which the nearest point of a sphere located at a distance A from the display screen is photographed with the camera arrangement shown in FIG. 67.
FIG. 71 is a diagram illustrating the relationship between the optical axis crossing distance of the two cameras and the camera interval E1 necessary for obtaining the parallax illustrated in FIG. 70.
FIG. 72 is a diagram showing a state in which the farthest point of a sphere located at a distance T − A from the display screen is photographed with the camera arrangement shown in FIG. 67.
FIG. 73 is a diagram showing the relationship between the optical axis crossing distance of the two cameras and the camera interval E2 necessary for obtaining the parallax shown in FIG. 72.
FIG. 74 is a diagram illustrating a relationship between camera parameters required for setting the parallax of a stereoscopic image within an appropriate parallax range.
FIG. 75 is a diagram illustrating a relationship of camera parameters required for setting the parallax of a stereoscopic image within an appropriate parallax range.
FIG. 76 is a block diagram of a fifth stereoscopic image processing apparatus.
FIG. 77 is a diagram illustrating the relationship between the temporary camera position, the angle of view, and the first to third objects set by a producer who creates three-dimensional data.
FIG. 78 is a diagram showing a state in which two virtual cameras are arranged based on the temporary camera position determined in FIG. 77.
FIG. 79 is a diagram illustrating a state in which the camera arrangement is adjusted so that an area where no object information exists does not occur.
FIG. 80 is a diagram illustrating a view angle adjustment process.
FIG. 81 is a diagram illustrating a relationship between a subject and a stereoscopic photography device that takes a stereoscopic photograph at an amusement facility or a photo studio.
FIG. 82 is a diagram illustrating a configuration of a sixth stereoscopic image processing apparatus.
FIG. 83 is a diagram illustrating a state in which the camera is operated by remote control and the captured image is observed on the stereoscopic image display device.
FIG. 84 is a diagram illustrating an example of shooting by a stereoscopic image capturing device incorporating the sixth stereoscopic image processing device.
FIG. 85 is a diagram showing a state in which image resolution is measured with an illuminometer.
FIG. 86 is a diagram showing a moire image used for measuring image resolution.
FIG. 87 is a diagram illustrating a conversion example of appropriate parallax.
FIG. 88 is a diagram illustrating another conversion example of appropriate parallax.
FIG. 89 is a diagram illustrating a table used for simplified determination of parallax and the basic representation space.
[Explanation of symbols]
10 user, 12 screen, 14 playback object, 20 real object, 22, 24, 26, 28 cameras, 30 front projection plane, 32 rear projection plane, 100 stereoscopic image processing device, 112 stereoscopic effect adjustment unit, 114, 152, 170 parallax control unit, 116 format conversion unit, 118 information acquisition unit, 122 instruction acquisition unit, 124 parallax identification unit, 132 camera placement determination unit, 136, 174 distortion processing unit, 140, 176 correction map holding unit, 142 two-dimensional image generation unit, 150 parallax amount detection unit, 151 camera control unit, 156 header inspection unit, 158 matching unit, 160 position shift unit, 164 parallax writing unit, 168 image edge adjustment unit, 178 two-dimensional image generation unit, 180 object designation unit, 190 image determination unit, 192 frequency component detection unit, 194 scene determination unit, 210 optical axis intersection plane, 300 stereoscopic display library, 400 display screen, 402 three-dimensional data software, 406 shooting instruction processing unit, 430 network-based system, 432 game machine, 434 user terminal, 436 server, 452 viewer program, 510 stereoscopic photography device.

Claims (23)

1. A stereoscopic image processing apparatus comprising:
    an instruction acquisition unit that, when stereoscopic images based on a plurality of viewpoint images are displayed with varying parallaxes, acquires the user's responses as to whether each displayed stereoscopic image is acceptable;
    a parallax identifying unit that identifies, based on the acquired responses, an appropriate parallax as the parallax the user can tolerate; and
    a parallax control unit that, when a stereoscopic image different from those stereoscopic images is displayed, processes the different stereoscopic image with the identified appropriate parallax so that the different stereoscopic image is acceptable to the user (the first sketch following the claims illustrates this flow).
2. The apparatus according to claim 1, wherein the another stereoscopic image is a stereoscopic image generated starting from three-dimensional data, and the parallax control unit determines a plurality of viewpoints for generating that stereoscopic image in accordance with the appropriate parallax.
3. The apparatus according to claim 2, wherein the parallax control unit determines the interval between the plurality of viewpoints and the intersection position of the optical axes along which an object is viewed from those viewpoints (see the second sketch following the claims).
4. The apparatus according to claim 1, wherein the parallax control unit performs control so that the appropriate parallax is realized for a predetermined basic three-dimensional space to be displayed.
5. The apparatus according to any one of claims 1 to 3, wherein the parallax control unit performs control so that the appropriate parallax is realized between the coordinates of the nearest object and the coordinates of the farthest object in the three-dimensional space.
6. The apparatus according to claim 1, wherein the images constituting the another stereoscopic image are a plurality of two-dimensional images to which parallax has already been given, and the parallax control unit determines the amount of horizontal shift of the plurality of two-dimensional images in accordance with the appropriate parallax (see the third sketch following the claims).
7. The apparatus according to claim 1, wherein the image constituting the another stereoscopic image is a planar image to which depth information has been given, and the parallax control unit adjusts the depth information of the planar image in accordance with the appropriate parallax.
8. The apparatus according to claim 1, further comprising a parallax holding unit that records the appropriate parallax,
    wherein the parallax control unit reads the appropriate parallax at a predetermined timing and performs the processing using the read value as an initial value.
9. The apparatus according to any one of claims 1 to 8, further comprising:
    an object designation unit that accepts, from the user, designation of a predetermined object included in the stereoscopic image; and
    an optical axis intersection position setting unit that makes the position of the designated object correspond to the intersection position of the optical axes associated with each of the plurality of viewpoint images, and sets the intersection position of the optical axes so that the designated object is expressed near the position of the display screen on which the stereoscopic image is displayed.
10. The apparatus according to claim 9, wherein the designated object has a predetermined interface, and
    the optical axis intersection position setting unit makes the intersection position of the optical axes correspond to the interface.
11. The stereoscopic image processing apparatus according to claim 9, further comprising a designation information adding unit that associates with the object optical axis correspondence information describing that the designated object is made to correspond to the intersection position of the optical axes and that the object is to be expressed near the position of the display screen.
12. The stereoscopic image processing apparatus according to claim 11, wherein the optical axis intersection position setting unit acquires the optical axis correspondence information, makes the intersection position of the optical axes correspond to the object described in the acquired optical axis correspondence information, and expresses the object thus associated near the position of the display screen on which the stereoscopic image is displayed.
13. The apparatus according to claim 1, further comprising:
    an identification information acquisition unit that acquires identification information that is associated with the image data used when generating the stereoscopic image and that indicates whether an object included in the stereoscopic image is to be expressed within the basic representation space containing the objects to be stereoscopically displayed; and
    a parallax control unit that reflects the amount of parallax on the object based on the acquired identification information.
14. The apparatus according to claim 1, wherein the parallax control unit further controls the parallax so that it does not exceed the parallax of the range in which the width-to-depth ratio of an object expressed in the stereoscopic image is perceived correctly by human eyes.
15. The apparatus according to claim 1, further comprising:
    a camera placement setting unit that, when original data serving as the starting point of the stereoscopic image is input, sets the placement of a plurality of virtual cameras for generating the plurality of viewpoint images;
    an object area determination unit that determines whether an area containing no information of the objects to be displayed occurs in the viewpoint image generated for each of the virtual cameras; and
    a camera parameter adjustment unit that, when such an area exists, adjusts at least one of the angle of view of the virtual cameras, the camera interval, and the intersection position of the optical axes so that no area lacking object information occurs (see the fourth sketch following the claims).
16. The apparatus according to claim 1, wherein, when the stereoscopic image is generated starting from three-dimensional data and a parameter related to the camera arrangement set for generating the parallax images is changed, the parallax control unit controls the camera parameter so that the change of the parameter falls within a predetermined threshold (see the fifth sketch following the claims).
17. The apparatus according to any one of claims 1 to 16, wherein, when the stereoscopic image is generated starting from a two-dimensional moving image to which depth information has been given, the parallax control unit performs control so that the variation of the maximum or minimum depth value contained in the depth information, occurring as the two-dimensional moving image progresses, falls within a predetermined threshold.
18. The apparatus according to claim 1, further comprising an image determination unit that analyzes the stereoscopic image in units of scenes.
19. A stereoscopic image processing method comprising the steps of:
    displaying to a user a plurality of stereoscopic images with different parallaxes;
    identifying an appropriate parallax as a parallax acceptable to the user, based on the user's responses as to whether the plurality of stereoscopic images displayed with the various parallaxes are acceptable; and
    when another stereoscopic image different from the plurality of stereoscopic images is displayed, processing the other stereoscopic image with the appropriate parallax identified for the plurality of stereoscopic images so that the other stereoscopic image is acceptable to the user.
20. The method according to claim 19, wherein each of the steps is implemented as a function of a library for stereoscopic display, and the functions of the library can be called from a plurality of programs.
21. The method according to claim 19 or 20, further comprising the step of changing the moving speed, in the near or far direction, of an object to be expressed within the basic representation space containing the objects to be stereoscopically displayed in the stereoscopic image.
22. The method according to any one of claims 19 to 21, further comprising the step of adjusting the appropriate parallax by expressing an object to be expressed within the basic representation space containing the objects to be stereoscopically displayed so that it falls within a predetermined parallax range, while setting at least one of the frontmost surface and the rearmost surface of the basic representation space to a position where no object exists.
23. A computer program for causing a computer to execute the steps of:
    displaying to a user a plurality of stereoscopic images with different parallaxes;
    identifying an appropriate parallax as a parallax acceptable to the user, based on the user's responses as to whether the plurality of stereoscopic images displayed with the various parallaxes are acceptable; and
    when another stereoscopic image different from the plurality of stereoscopic images is displayed, processing the other stereoscopic image with the appropriate parallax identified for the plurality of stereoscopic images so that the other stereoscopic image is acceptable to the user.
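
First sketch (claims 1, 19 and 23). The claimed flow (show test stereoscopic images at varying parallaxes, collect the user's accept/reject responses, keep the largest tolerated value as the appropriate parallax, then force that parallax onto other images) can be outlined as follows. This is a minimal Python illustration: the callback names, the candidate values and the monotone-tolerance assumption are hypothetical, since the claims define processing units, not an API.

def identify_limit(candidates, show_stereo_pair, user_accepts):
    """Return the largest parallax (in pixels) the user tolerates, or None.

    Assumes tolerance is monotone: once one value is rejected, larger
    values would be rejected too.
    """
    tolerated = None
    for p in sorted(candidates):
        show_stereo_pair(p)      # display a test stereoscopic image at parallax p
        if not user_accepts(p):  # instruction acquisition: the user's yes/no answer
            break
        tolerated = p            # parallax identification: remember the accepted value
    return tolerated

def clamp_parallax(p, near_limit, far_limit):
    """Parallax control: force another image's parallax into the identified
    range (near-side parallax taken as positive, far-side as negative)."""
    return max(-far_limit, min(near_limit, p))

if __name__ == "__main__":
    # Stub I/O: pretend the user tolerates near-side parallax up to 20 px.
    near = identify_limit([5, 10, 15, 20, 25],
                          show_stereo_pair=lambda p: None,
                          user_accepts=lambda p: p <= 20)
    print(near)                          # -> 20
    print(clamp_parallax(37, near, 30))  # -> 20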
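
Second sketch (claims 2 and 3). For a stereoscopic image generated from three-dimensional data, the appropriate parallax is converted into a camera interval E and an optical-axis intersection distance d. The sketch assumes a pinhole model in which a point at distance z appears with screen parallax p(z) = f*E*(1/z - 1/d), with f the focal length in pixels; this model and the numbers are illustrative assumptions, not the patent's own derivation (which is developed around FIGS. 70 to 75).

def camera_parameters(z_near, z_far, p_near, p_far, focal_px):
    """Solve p(z_near) = p_near and p(z_far) = -p_far for the camera
    interval E and the optical-axis intersection distance d.

    z_near, z_far: nearest and farthest object distances (any length unit)
    p_near, p_far: appropriate parallax limits in pixels (both positive)
    """
    assert 0 < z_near < z_far and p_near > 0 and p_far > 0
    interval = (p_near + p_far) / (focal_px * (1.0 / z_near - 1.0 / z_far))
    inv_d = 1.0 / z_near - p_near / (focal_px * interval)
    return interval, 1.0 / inv_d

if __name__ == "__main__":
    E, d = camera_parameters(z_near=2.0, z_far=10.0,
                             p_near=20.0, p_far=30.0, focal_px=1000.0)
    print(E, d)  # -> 0.125 and about 2.94: the intersection plane falls between the objects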
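
Third sketch (claim 6). When the source images already carry parallax, parallax control reduces to a horizontal shift: moving the two views apart or together adds a constant to every pixel's disparity. NumPy and zero padding at the exposed borders are assumptions made here; border handling of this kind is the business of the image edge adjustment unit (reference numeral 168 in the symbol list).

import numpy as np

def hshift(img, dx):
    """Shift an H x W (x C) image dx pixels to the right (negative = left),
    filling the vacated columns with zeros."""
    out = np.zeros_like(img)
    w = img.shape[1]
    if dx >= 0:
        out[:, dx:] = img[:, :w - dx]
    else:
        out[:, :w + dx] = img[:, -dx:]
    return out

def apply_parallax_offset(left, right, delta_px):
    """Add delta_px to every pixel's disparity (x_left - x_right) by
    shifting the two views in opposite directions, half each way."""
    half = delta_px // 2
    return hshift(left, half), hshift(right, -(delta_px - half))

if __name__ == "__main__":
    left = np.arange(12.0).reshape(3, 4)
    shifted_left, shifted_right = apply_parallax_offset(left, left.copy(), 2)
    print(shifted_left)   # columns moved one pixel right
    print(shifted_right)  # columns moved one pixel left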
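
Fourth sketch (claim 15). One way to picture the camera parameter adjustment is to keep each virtual camera's view frustum inside the horizontal extent over which object information exists, here by narrowing a shared angle of view; adjusting the camera interval or the optical-axis intersection position would work analogously. The flat-scene geometry, the coordinate convention and every name below are assumptions for illustration only.

import math

def max_half_angle(camera_x, z, left_edge, right_edge):
    """Largest half angle of view (radians) for a camera at horizontal
    position camera_x whose frustum at depth z must stay inside
    [left_edge, right_edge]; the camera is assumed between the edges."""
    return min(math.atan2(camera_x - left_edge, z),
               math.atan2(right_edge - camera_x, z))

def adjust_angle_of_view(camera_xs, z, left_edge, right_edge, wanted_half_angle):
    """Shrink the shared angle of view until no viewpoint image can contain
    an area without object information."""
    limit = min(max_half_angle(x, z, left_edge, right_edge) for x in camera_xs)
    return min(wanted_half_angle, limit)

if __name__ == "__main__":
    # Two virtual cameras 0.125 apart; object information spans x in [-1, 1] at depth 2.
    half = adjust_angle_of_view([-0.0625, 0.0625], 2.0, -1.0, 1.0,
                                wanted_half_angle=math.radians(30.0))
    print(math.degrees(half))  # capped near 25 degrees so both frustums stay covered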
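
Fifth sketch (claims 16 and 17). Both claims amount to rate limiting: whether the controlled quantity is a camera parameter or the maximum or minimum depth value of a depth-annotated moving image, its frame-to-frame change is clamped to a predetermined threshold. The threshold and the sample sequence below are arbitrary illustration values.

def rate_limited(values, max_step):
    """Yield the sequence with successive changes clamped to +/- max_step,
    so the stereoscopic effect cannot jump between frames."""
    current = None
    for target in values:
        if current is None:
            current = target  # the first frame passes through unchanged
        else:
            current += max(-max_step, min(max_step, target - current))
        yield current

if __name__ == "__main__":
    per_frame_depth_max = [100, 160, 90, 95, 200]
    print(list(rate_limited(per_frame_depth_max, 20)))
    # -> [100, 120, 100, 95, 115]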
JP2003003761A 2002-03-27 2003-01-09 Stereoscopic image processing method and apparatus Expired - Fee Related JP3749227B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2002087493 2002-03-27
JP2003003761A JP3749227B2 (en) 2002-03-27 2003-01-09 Stereoscopic image processing method and apparatus

Applications Claiming Priority (25)

Application Number Priority Date Filing Date Title
JP2003003761A JP3749227B2 (en) 2002-03-27 2003-01-09 Stereoscopic image processing method and apparatus
PCT/JP2003/003791 WO2003081921A1 (en) 2002-03-27 2003-03-27 3-dimensional image processing method and device
EP11161295.8A EP2357838B1 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
EP11161315A EP2357839A3 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
EP20110161303 EP2381691B1 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
EP11161320A EP2357840A3 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
EP11161269A EP2362670B1 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
EP03715473A EP1489857B1 (en) 2002-03-27 2003-03-27 3-dimensional image processing method and device
EP11161284A EP2357837A3 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
EP20110161281 EP2357836B1 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
CN038071592A CN1643939B (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
EP11161340A EP2387248A3 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
KR1020047015105A KR100812905B1 (en) 2002-03-27 2003-03-27 3-dimensional image processing method and device
EP11161274A EP2357835A3 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
EP11161329.5A EP2357841B1 (en) 2002-03-27 2003-03-27 Method and apparatus for processing three-dimensional images
US10/949,528 US8369607B2 (en) 2002-03-27 2004-09-27 Method and apparatus for processing three-dimensional images
US12/976,262 US8131064B2 (en) 2002-03-27 2010-12-22 Method and apparatus for processing three-dimensional images
US12/986,551 US8724886B2 (en) 2002-03-27 2011-01-07 Method and apparatus for processing three-dimensional images
US12/986,509 US8417024B2 (en) 2002-03-27 2011-01-07 Method and apparatus for processing three-dimensional images
US12/986,471 US8577127B2 (en) 2002-03-27 2011-01-07 Method and apparatus for processing three-dimensional images
US12/986,453 US8254668B2 (en) 2002-03-27 2011-01-07 Method and apparatus for processing three-dimensional images
US12/986,491 US8879824B2 (en) 2002-03-27 2011-01-07 Method and apparatus for processing three-dimensional images
US12/986,530 US8577128B2 (en) 2002-03-27 2011-01-07 Method and apparatus for processing three-dimensional images
US13/088,752 US8472702B2 (en) 2002-03-27 2011-04-18 Method and apparatus for processing three-dimensional images
US13/283,361 US8559703B2 (en) 2002-03-27 2011-10-27 Method and apparatus for processing three-dimensional images

Publications (2)

Publication Number Publication Date
JP2004007395A JP2004007395A (en) 2004-01-08
JP3749227B2 true JP3749227B2 (en) 2006-02-22

Family

ID=30446074

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003003761A Expired - Fee Related JP3749227B2 (en) 2002-03-27 2003-01-09 Stereoscopic image processing method and apparatus

Country Status (1)

Country Link
JP (1) JP3749227B2 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101313740B1 (en) * 2007-10-08 2013-10-15 Stereopia Co., Ltd. OSMU (One Source Multi Use)-type Stereoscopic Camera and Method of Making Stereoscopic Video Content thereof
JP2009246625A (en) * 2008-03-31 2009-10-22 Fujifilm Corp Stereoscopic display apparatus, stereoscopic display method, and program
JP4962674B1 (en) * 2009-04-03 2012-06-27 Sony Corporation Information processing apparatus, information processing method, and program
JP4915456B2 (en) 2009-04-03 2012-04-11 Sony Corporation Information processing apparatus, information processing method, and program
JP5409107B2 (en) 2009-05-13 2014-02-05 Nintendo Co., Ltd. Display control program, information processing apparatus, display control method, and information processing system
US20120189208A1 (en) * 2009-09-16 2012-07-26 Pioneer Corporation Image processing apparatus, image processing method, image processing program, and storage medium
JP5405264B2 (en) 2009-10-20 2014-02-05 Nintendo Co., Ltd. Display control program, library program, information processing system, and display control method
EP2355526A3 (en) 2010-01-14 2012-10-31 Nintendo Co., Ltd. Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method
US9693039B2 (en) 2010-05-27 2017-06-27 Nintendo Co., Ltd. Hand-held electronic device
CN103098479A (en) * 2010-06-30 2013-05-08 FUJIFILM Corporation Image processing device, method and program
WO2012046369A1 (en) * 2010-10-07 2012-04-12 Panasonic Corporation Image capturing device, disparity adjustment method, semiconductor integrated circuit, and digital camera
JP4956658B2 (en) * 2010-10-12 2012-06-20 Sharp Corporation 3D image conversion device and 3D image display device
JP5594067B2 (en) * 2010-11-02 2014-09-24 Sony Corporation Image processing apparatus and image processing method
JP5725902B2 (en) 2011-02-24 2015-05-27 Nintendo Co., Ltd. Image processing program, image processing apparatus, image processing method, and image processing system
JP2012174237A (en) 2011-02-24 2012-09-10 Nintendo Co Ltd Display control program, display control device, display control system and display control method
JP2012244396A (en) * 2011-05-19 2012-12-10 Sony Corp Image processing apparatus, image processing method, and program
JP5181083B1 (en) 2012-01-19 2013-04-10 Panasonic Corporation Stereoscopic image display control device, stereoscopic image display control method, and program
US9591290B2 (en) * 2014-06-10 2017-03-07 Bitanimate, Inc. Stereoscopic video generation
JP5955373B2 (en) * 2014-12-22 2016-07-20 Mitsubishi Electric Corporation 3D stereoscopic display device and 3D stereoscopic display signal generation device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8723933B2 (en) 2008-08-12 2014-05-13 Sony Corporation Three-dimensional image correction device, three-dimensional image correction method, three-dimensional image display device, three-dimensional image reproduction device, three-dimensional image provision system, program, and recording medium
KR20110059531 (en) 2009-11-27 2011-06-02 Sony Corporation Image processing apparatus, image processing method and program
JP2011113363A (en) * 2009-11-27 2011-06-09 Sony Corp Image processor, image processing method, and program
EP2345993A2 (en) 2009-11-27 2011-07-20 Sony Corporation Image processing apparatus, image processing method and program
US9098907B2 (en) 2009-11-27 2015-08-04 Sony Corporation Image processing apparatus, image processing method and program
US9934575B2 (en) 2009-11-27 2018-04-03 Sony Corporation Image processing apparatus, method and computer program to adjust 3D information based on human visual characteristics
US8630480B2 (en) 2010-11-29 2014-01-14 Sony Corporation Image processing apparatus, display apparatus, image processing method and image processing program
EP2509325A2 (en) 2011-04-08 2012-10-10 Sony Corporation Image processing apparatus, image processing method, and program

Also Published As

Publication number Publication date
JP2004007395A (en) 2004-01-08

Similar Documents

Publication Publication Date Title
US8228327B2 (en) Non-linear depth rendering of stereoscopic animated images
CA2752691C (en) Systems, apparatus and methods for subtitling for stereoscopic content
KR100990416B1 (en) Display apparatus, image processing apparatus and image processing method, imaging apparatus, and recording medium
JP5679978B2 (en) Stereoscopic image alignment apparatus, stereoscopic image alignment method, and program thereof
JP4508878B2 (en) Video filter processing for stereoscopic images
US7528830B2 (en) System and method for rendering 3-D images on a 3-D image display screen
CN1144157C (en) System and method for creating 3D models from 2D sequential image data
JP4707368B2 (en) Stereoscopic image creation method and apparatus
US9445072B2 (en) Synthesizing views based on image domain warping
US6496598B1 (en) Image processing method and apparatus
JP2006507764A (en) Critical alignment of parallax images for autostereoscopic display
US20110109720A1 (en) Stereoscopic editing for video production, post-production and display adaptation
JP4517664B2 (en) Image processing apparatus and method, program, and recording medium
JP4351996B2 (en) Method for generating a stereoscopic image from a monoscope image
EP2262273A2 (en) Image data creation device, image data reproduction device, and image data recording medium
US20050219239A1 (en) Method and apparatus for processing three-dimensional images
US7596259B2 (en) Image generation system, image generation method, program, and information storage medium
JP2012227924A (en) Image analysis apparatus, image analysis method and program
JP2007141228A (en) Virtual view specification and synthesis in free viewpoint
CN103997599B (en) Image processing equipment, image pick up equipment and image processing method
JP2007096951A (en) Multi-viewpoint image creating apparatus, method, and program
US20070035530A1 (en) Motion control for image rendering
JP2008090617A (en) Device, method and program for creating three-dimensional image
US20110228051A1 (en) Stereoscopic Viewing Comfort Through Gaze Estimation
JP4879326B2 (en) System and method for synthesizing a three-dimensional image

Legal Events

Date Code Title Description
A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20041109

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20050106

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20050802

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20050901

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20051028

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20051115

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20051130

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20081209

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20091209

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20101209

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20111209

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20121209

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20131209

Year of fee payment: 8

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees