WO2004107764A1 - Image Display Device and Program - Google Patents
- Publication number
- WO2004107764A1 (PCT/JP2004/007185)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- fade
- image data
- eye
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/341—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
Definitions
- the present invention relates to an image display device and a program that enable an observer to perform stereoscopic vision, and are suitable for use, for example, when a fade-in or fade-out function is provided.
- a method has also been proposed in which two images are broadcast as a single-channel image so that the receiver can perform stereoscopic viewing (see Patent Document 2). If a video file consisting of two videos is created, a stereoscopic video can be generated when the file is opened.
- Patent Document 2 JP-A-10-174064
- Patent Document 3 JP-A-7-170451
- the present invention changes the parallax generated by the right-eye image and the left-eye image to produce a display effect in which the display object moves away during fade-out and approaches the viewer during fade-in.
- the invention according to claim 1 is an image display device that displays a right-eye image and a left-eye image on a display screen, comprising display control means for controlling the display of the right-eye image and the left-eye image on the display screen, wherein the display control means includes means for controlling the arrangement of the right-eye image and the left-eye image on the display screen so that the two images separate in a predetermined direction over time during a fade-out process.
- the invention according to claim 2 is the image display device according to claim 1, wherein the display control means further includes means for controlling the right-eye image and the left-eye image so that they shrink from their regular size with the passage of time during the fade-out process.
- the invention according to claim 3 is the image display device according to claim 1 or 2, wherein, when an empty area occurs in the display area for the left-eye image or the display area for the right-eye image during the fade-out process, the next left-eye image or right-eye image is assigned to this empty area.
- the invention according to claim 4 is an image display device that displays a right-eye image and a left-eye image on a display screen, comprising display control means for controlling the display of the right-eye image and the left-eye image on the display screen, wherein the display control means includes means for controlling the arrangement of the right-eye image and the left-eye image on the display screen so that the two images approach each other in a predetermined direction over time during a fade-in process.
- the invention according to claim 5 is the image display device according to claim 4, wherein the display control means further includes means for controlling the right-eye image and the left-eye image so that they are enlarged toward their regular size with the lapse of time during the fade-in process.
- the invention according to claim 6 is a program for providing a computer with a three-dimensional stereoscopic image display function of displaying a right-eye image and a left-eye image on a display screen, the program comprising a display control process for controlling the display of the right-eye image and the left-eye image on the display screen, wherein the display control process includes a process of controlling the arrangement of the right-eye image and the left-eye image on the display screen so that the two images separate in a predetermined direction as time passes during a fade-out process.
- the invention according to claim 7 is the program according to claim 6, wherein the display control process further includes a process of controlling the right-eye image and the left-eye image so that they shrink from their regular size as time passes during the fade-out process.
- the invention according to claim 8 is the program according to claim 6 or 7, wherein, when an empty area occurs in the display area for the left-eye image or the display area for the right-eye image during the fade-out process, the next left-eye image or right-eye image is allocated to the empty area.
- the invention according to claim 9 is a program for providing a computer with a three-dimensional stereoscopic image display function of displaying a right-eye image and a left-eye image on a display screen, the program comprising a display control process for controlling the display of the right-eye image and the left-eye image, wherein the display control process includes a process of controlling the arrangement of the right-eye image and the left-eye image on the display screen so that the two images approach each other from a predetermined direction over time during a fade-in process.
- the invention according to claim 10 is the program according to claim 9, wherein the display control process further includes a process of controlling the right-eye image and the left-eye image so that they are enlarged to their regular size with time during the fade-in process.
- the present invention also includes a mode that provides a transition effect such that, when display targets are managed as objects, each object gradually disappears from the screen or emerges on the screen.
- the present invention is an image display device for displaying, as a three-dimensional image, original image data in which display targets are managed as objects, the device comprising: object designating means for specifying, from among the objects, an object to be faded in or faded out; transition effect setting means for setting a transition effect for the specified object; stereoscopic image data generating means for generating stereoscopic image data by combining the object given the transition effect with the other objects; and display means for displaying the generated stereoscopic image data.
- the object designating means may be configured to include means for determining the front-back relationship of each object, and to specify the object to be faded in or faded out based on the result of the determination.
- the objects can be sequentially deleted from the foremost object.
- the transition effect setting means in the present invention can be configured to set the transmittance of an object designated as a fade-in or fade-out target according to the progress of the transition effect.
- the stereoscopic image data generating means according to the present invention can be configured to thin out the display pixels of the specified object according to the set transmittance and, at the thinned-out pixel positions, fill in the object behind it. With this configuration, for example, at the time of fade-out, the object to be erased gradually disappears while the object behind it gradually emerges, providing a transition effect with a stereoscopic effect and a sense of reality.
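The transmittance-based thinning described above can be sketched as follows. This is a minimal one-dimensional model with illustrative names (`composite_with_transmittance`, `front`, `back`); the patent does not specify the thinning pattern, so a seeded random pattern is assumed here.

```python
import random

# Sketch: a fraction `transmittance` of the front object's display
# pixels is thinned out, and the thinned positions show the object
# behind. At transmittance 0 the front object is opaque; at 1 it has
# fully faded out and only the back object remains.
def composite_with_transmittance(front, back, transmittance, seed=0):
    """Return a pixel row mixing front and back per the transmittance."""
    rng = random.Random(seed)  # deterministic thinning pattern
    out = []
    for f, b in zip(front, back):
        # replace the front pixel with probability `transmittance`
        out.append(b if rng.random() < transmittance else f)
    return out

front = ["F"] * 8
back = ["B"] * 8
print(composite_with_transmittance(front, back, 0.0))  # all front pixels
print(composite_with_transmittance(front, back, 1.0))  # all back pixels
```

Stepping the transmittance from 0 toward 1 over successive frames yields the gradual erase-and-reveal effect the text describes.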
- the color of the designated object can be made lighter or darker according to the progress of the transition effect. Thereby, the sense of presence at the time of fade-in or fade-out can be further improved.
- the present invention can also be expressed as a program that gives the functions of the above-described device or each unit to a computer.
- the following inventions each capture the present invention as a program.
- the invention according to claim 14 is a program for providing a computer with a function of displaying, as a stereoscopic image, original image data in which display targets are managed as objects, the program comprising: object designation processing for specifying, from among the objects, an object to be faded in or faded out; transition effect setting processing for setting a transition effect for the specified object; stereoscopic image data generation processing for generating stereoscopic image data by combining the object given the transition effect with the other objects; and display processing for displaying the generated stereoscopic image data.
- the object designation processing determines the front-back relationship of each object, and specifies the object to be faded in or faded out based on the result of the determination.
- the transition effect setting processing includes processing for setting a transmittance for the specified object, and the stereoscopic image data generation processing includes processing for thinning out the display pixels of the specified object in accordance with the set transmittance and embedding the data of the object behind it at the thinned-out pixel positions.
- the present invention also includes a mode in which the display plane is pseudo-rotated in the front-rear direction, whereby the currently displayed image disappears from the front surface toward the back surface, and the image to be displayed next emerges from the back surface toward the front surface.
- the display plane in a predetermined rotation state is viewed from each stereoscopic viewpoint; the geometric figure information as seen from that viewpoint is obtained by arithmetic processing, or geometric figure information for each viewpoint previously obtained by arithmetic processing is read from storage means, and the image to be displayed (the currently displayed image or the image to be displayed next) is fitted to the geometric figure for each viewpoint; the viewpoint images are then combined to form one display image.
- the display plane changes every moment due to the pseudo-rotation, and the image on the display plane is viewed stereoscopically. It is therefore possible to give the viewer motion and a three-dimensional effect at the same time, realizing a fade-in/fade-out operation with a sense of realism through their synergy.
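The geometry of the pseudo-rotation can be sketched with a simple projection model. The cosine scaling and the function name `projected_width` are assumptions for illustration: when a plane rotates about its vertical axis by angle θ, its projected width shrinks roughly with |cos θ|, reaching zero edge-on at 90° and full width again at 180° (where the next image faces the viewer).

```python
import math

def projected_width(width, theta_deg):
    """Projected width of a display plane rotated theta_deg about a
    vertical axis, as seen by a viewer facing the plane at 0 degrees."""
    return abs(width * math.cos(math.radians(theta_deg)))

print(projected_width(100, 0))    # full width: current image faces the viewer
print(projected_width(100, 90))   # ~0: edge-on, current image has vanished
print(projected_width(100, 180))  # full width: next image faces the viewer
```

Evaluating this per viewpoint (each eye sees the plane from a slightly different angle) would give the per-viewpoint geometric figure information the text refers to.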
- the invention according to claim 17 relates to an image display device comprising: geometric figure providing means for providing, when the display plane is pseudo-rotated in the front-rear direction, the geometric figure information seen when the display plane in a predetermined rotation state is viewed from each viewpoint assumed in advance; image size changing means for changing the size of the image for each viewpoint according to the geometric figure information of that viewpoint; and display image generating means for generating a display image by synthesizing the viewpoint images.
- the invention according to claim 18 is based on the image display device according to claim 17, wherein the image size changing means is configured such that, when the image for each viewpoint is provided as image data for three-dimensional stereoscopic display, image data for two-dimensional display is constructed from the image data for each viewpoint, and the image for each viewpoint is obtained based on the image data for two-dimensional display.
- the invention of claim 19 is the image display device according to claim 17 or 18, wherein the processing by the image size changing means and the processing by the display image generating means are executed on the image of each viewpoint being displayed until the pseudo-rotation angle reaches 90°, and are executed on the image of each viewpoint to be displayed next from 90° until the pseudo-rotation angle reaches 180°.
- in the invention of claim 20, the geometric figure providing means stores the geometric figure information of each viewpoint in association with the rotation angle, and sets the geometric figure information of each viewpoint when the display plane is pseudo-rotated in the front-rear direction based on the geometric figure information stored in the storage means.
- the invention of claim 21 is a program for providing an image display function to a computer, the program comprising: geometric figure providing processing for providing, when the display plane is pseudo-rotated in the front-rear direction, the geometric figure information seen when the display plane in a predetermined rotation state is viewed from each viewpoint assumed in advance; image size changing processing for changing the size of the image for each viewpoint according to the geometric figure information of that viewpoint; and display image generating processing for generating a display image by synthesizing the viewpoint images.
- in the invention according to claim 22, when the image for each viewpoint is provided as image data for three-dimensional stereoscopic display, image data for two-dimensional display is constructed from the image data for each viewpoint, and the image for each viewpoint is acquired based on the image data for two-dimensional display.
- the invention according to claim 23 is the program according to claim 21 or 22, wherein the image size changing processing and the display image generating processing are performed on the image of each viewpoint being displayed until the pseudo-rotation angle reaches 90°, and are performed on the image of each viewpoint to be displayed next from 90° until the angle reaches 180°.
- the geometric figure providing processing stores the geometric figure information of each viewpoint in association with the rotation angle, and sets the geometric figure information of each viewpoint when the display plane is pseudo-rotated in the front-rear direction based on the stored geometric figure information.
- the image display device of the present invention also includes the following processing corresponding to fade-in/fade-out. That is, in an image display device that drives a display based on video data, the device comprises: means for generating synthesized video data by combining the pixel values of the currently displayed video data and the pixel values of the next video data to be displayed at a specified ratio; and display switching control means for specifying the ratio so that, when switching from a stereoscopic video to another stereoscopic video, from a stereoscopic video to a planar video, or from a planar video to a stereoscopic video, the pixel-value ratio of the currently displayed video data gradually decreases over a predetermined period of time and finally reaches 0%.
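The pixel-value mixing above amounts to a cross-fade whose weight on the current frame decays from 100% to 0%. The sketch below is a minimal one-dimensional model; the function names (`blend_frames`, `cross_fade`) and the linear decay schedule are illustrative assumptions, not the patent's implementation.

```python
def blend_frames(current, nxt, ratio):
    """Mix two equal-length pixel rows; `ratio` is the weight of `current`."""
    return [ratio * c + (1.0 - ratio) * n for c, n in zip(current, nxt)]

def cross_fade(current, nxt, steps):
    """Yield blended rows as the current frame's weight falls to 0%."""
    for i in range(steps + 1):
        yield blend_frames(current, nxt, 1.0 - i / steps)

frames = list(cross_fade([100, 200], [0, 0], steps=4))
print(frames[0])   # [100.0, 200.0] -> pure current frame
print(frames[-1])  # [0.0, 0.0]     -> pure next frame
```

Any monotonically decreasing schedule would satisfy the claim language; linear decay is simply the easiest to state.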
- similarly, in an image display device that drives a display based on video data, the image display device of the present invention may comprise: means for switching pixels from the pixel values of the currently displayed video data to the pixel values of the next video data to be displayed; and display switching control means for designating the switching pixels so that, when switching from a stereoscopic video to another stereoscopic video, from a stereoscopic video to a planar video, or from a planar video to a stereoscopic video, the ratio of the number of pixels of the currently displayed video data gradually decreases over a predetermined period of time and finally reaches 0%.
- the display switching control means may be configured to designate the switching pixels so that line-shaped or block-shaped areas on the screen increase in width or number.
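The pixel-count variant can be sketched as a wipe: whole pixels are switched to the next frame, here as a single line-shaped region whose width grows each step. The name `wipe_step` and the left-to-right direction are illustrative assumptions.

```python
def wipe_step(current, nxt, step, total_steps):
    """Replace the leftmost step/total_steps fraction of a pixel row
    with pixels of the next frame (a growing line-shaped region)."""
    boundary = len(current) * step // total_steps
    return nxt[:boundary] + current[boundary:]

cur, new = list("CCCCCCCC"), list("NNNNNNNN")
print("".join(wipe_step(cur, new, 0, 4)))  # CCCCCCCC
print("".join(wipe_step(cur, new, 2, 4)))  # NNNNCCCC
print("".join(wipe_step(cur, new, 4, 4)))  # NNNNNNNN
```

At the final step the current frame's pixel count has reached 0%, matching the claim's end condition; multiple interleaved stripes or growing blocks would satisfy the same wording.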
- whether switching from a stereoscopic image to another stereoscopic image, from a stereoscopic image to a planar image, or from a planar image to a stereoscopic image, the switching is not instantaneous but gradual, so the change in parallax is gradual and discomfort can be reduced.
- the program of the present invention causes a computer that drives a display based on video data to function as: means for combining the pixel values of the currently displayed video data with the pixel values of the next video data to be displayed at a specified ratio; and display switching control means for specifying the ratio so that the pixel-value ratio of the currently displayed video data gradually decreases over a predetermined period of time and finally reaches 0%.
- the program of the present invention also causes a computer that drives a display based on video data to function as display switching control means for designating switching pixels so that the ratio of the number of pixels of the currently displayed video data gradually decreases over a predetermined period of time and finally reaches 0%.
- the computer may also function as means for designating the switching pixels so that line-shaped or block-shaped areas on the screen increase in width or number.
- a transition effect is given to each object such that the object gradually disappears from the screen or gradually emerges on the screen.
- the synergistic effect of such an object-by-object transition effect and the stereoscopic effect can realize a realistic fade-in/fade-out operation.
- switching from a stereoscopic video to another stereoscopic video, switching from a stereoscopic video to a planar video, or switching from a planar video to a stereoscopic video is performed gradually.
- FIG. 1 shows a configuration of an image display device according to an embodiment of the present invention.
- the image display device includes an input device 101, a command input unit 102, a control unit 103, an image processing unit 104, a display control unit 105, a display device 106, a storage unit 107, a development memory 108, and a graphic memory 109.
- the input device 101 is provided with input means such as a mouse and a keyboard, and is used to input commands such as a command for configuring a reproduced image, a reproduction command, an image feed command, a fade-in command, and a fade-out command.
- the command input unit 102 sends various commands input from the input device 101 to the control unit 103.
- the control unit 103 controls each unit according to the input command transferred from the command input unit 102.
- the image processing unit 104 processes the right-eye image data and the left-eye image data expanded in the expansion memory 108 in response to a command from the control unit 103, and generates display image data constituting one screen. The generated display image data is stored in the graphic memory 109.
- the display control unit 105 sends the image data stored in the graphic memory 109 to the display device 106 in response to a command from the control unit 103.
- the display device 106 reproduces the image data received from the display control unit 105 on a display screen.
- the storage unit 107 is a database that stores a plurality of image files, and each image file stores a predetermined number of still image data.
- each still image data is composed of image data for the right eye and image data for the left eye for performing three-dimensional stereoscopic image display.
- the expansion memory 108 is composed of a RAM (Random Access Memory) and temporarily stores the still image data to be reproduced (the right-eye image data and the left-eye image data) read from the storage unit 107 by the image processing unit 104.
- the graphic memory 109 is configured by a RAM, and sequentially stores the image data for three-dimensional stereoscopic display generated by the image processing unit 104.
- when a reproduction command is input, the first still image data (the right-eye image data and the left-eye image data) among the still image data constituting the file is read by the image processing unit 104 and expanded on the expansion memory 108. Thereafter, the image processing unit 104 maps the right-eye image data and the left-eye image data onto the graphic memory 109 such that the right-eye image (R image) and the left-eye image (L image) are arranged on the screen as shown in FIG. 2.
- R indicates a display area (pixel) of the right-eye image on the screen
- L indicates a display area (pixel) of the left-eye image on the screen.
- the assignment of the display areas is determined according to the configuration of the three-dimensional filter. That is, display areas (pixels) for the right-eye image and the left-eye image are allocated such that, when the display image is viewed through the three-dimensional filter, the right-eye image is projected to the viewer's right eye and the left-eye image to the viewer's left eye.
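The allocation above can be sketched as follows, under the assumption (illustrative, not stated in the text) that the three-dimensional filter routes alternating pixel columns to the right and left eyes, so the composite row interleaves R and L columns:

```python
def interleave_columns(r_row, l_row):
    """Build one composite display row: even columns taken from the
    right-eye image, odd columns from the left-eye image, so the 3D
    filter can route each column to the matching eye."""
    out = []
    for x in range(len(r_row)):
        out.append(r_row[x] if x % 2 == 0 else l_row[x])
    return out

r = ["R0", "R1", "R2", "R3"]
l = ["L0", "L1", "L2", "L3"]
print(interleave_columns(r, l))  # ['R0', 'L1', 'R2', 'L3']
```

Other filter geometries (row interleaving, checkerboard) would change only the allocation pattern, not the principle.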
- the image data mapped on the graphic memory 109 is sent to the display device 106 by the display control unit 105 and reproduced on the display screen.
- when an image feed command is input, the right-eye image data and the left-eye image data of the next still image constituting the file are expanded on the expansion memory 108, and the same processing as described above is performed. Similarly, every time a feed command is input, the next right-eye image data and left-eye image data are developed on the expansion memory 108 and the above processing is executed. This allows the display device 106 to display the still images in the file in succession.
- Figure 3 shows the processing flow when such a fade-out command is input.
- DR1 and DL1 indicate the right-eye image data and the left-eye image data that are being reproduced and displayed, respectively
- DR2 and DL2 indicate the right-eye image data and the left-eye image data to be reproduced and displayed next, respectively.
- a shift amount SL is calculated from a preset fade-out speed (S101).
- the shift amount SL refers to the amount by which the right-eye image and the left-eye image are moved rightward and leftward, respectively, from their positions on the display screen.
- the left-eye image data DL1 is mapped into the left image data area on the graphic memory 109, shifted leftward by the shift amount SL (S102). Then, the next left-eye image data DL2 to be displayed is mapped to the left-eye data area remaining after the mapping (S103).
- the shift processing of the right-eye image data is executed similarly. That is, the right-eye image data DR1 is mapped to the right image data area on the graphic memory 109, shifted rightward by the shift amount SL (S104). Then, the next right-eye image data DR2 to be displayed is mapped to the right-eye data area remaining after the mapping (S105).
- the image data on the graphic memory 109 is transferred to the display device 106.
- as a result, the right-eye image and the left-eye image are separated from each other by several pixels in the left-right direction, the next right-eye and left-eye images are filled into the empty areas created by the separation, and the resulting image is displayed on the display device 106 (S106).
- such processing can be easily realized, for example, by expressing the relationship between the time (or the number of processing cycles) and the shift amount as a function, so that the shift amount changes with time.
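Expressing the shift amount as a function of the processing cycle can be sketched as below. The function names and the concrete speeds are illustrative assumptions; any monotonically increasing function of the cycle count would serve.

```python
def shift_amount(cycle, fade_speed_px=4):
    """Total shift SL (in pixels) after `cycle` processing cycles,
    for a fade-out at a constant preset speed."""
    return cycle * fade_speed_px

def shift_amount_quadratic(cycle, base_px=1):
    """An accelerating variant: the images separate faster over time."""
    return base_px * cycle * cycle

print(shift_amount(3))            # 12
print(shift_amount_quadratic(3))  # 9
```

The linear form corresponds to a fixed fade-out speed; the quadratic form gives a fade that starts slowly and accelerates.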
- FIG. 4 shows an example of image display during the above processing.
- (A) shows the display state before the fade-out command is input
- (b) shows the display state when the first processing cycle (S101 to S106) is executed after the fade-out command is input.
- (c) shows the display state when the second processing cycle is executed after the fade-out command is input. In each state, the synthesized image, the left-eye image (L image), and the right-eye image (R image) are shown for comparison.
- the display of the next still image is omitted in the composite images of FIG. 4.
- when the first processing cycle is executed as shown in (b), the L image slides to the left by a few pixels, and an empty area (hatched area) is generated at the right end of the L image display area.
- the corresponding part of the next L image is embedded in this area.
- the R image slides a few pixels to the right, and the corresponding part of the next R image is embedded in the left R image display area (hatched area).
- when the second processing cycle is executed, the L image and the R image are separated further as shown in (c), and the parallax between them also becomes larger. As a result, the same object on the L image and the R image is perceived as if it had retracted further in the depth direction.
- in this way, a fade-out operation is executed in which the still image being displayed gradually retracts in the depth direction while the next still image is gradually revealed.
- in the above description, the L image and the R image are slid in the left-right direction; this is based on the premise that the L image and the R image are set with horizontal parallax. If the parallax direction is, for example, vertical or diagonal, the slide is performed in that direction. If the L image and the R image are additionally moved in the same direction at the same time, they move while the parallax is maintained, so the stereoscopic vision itself is not affected and only the variety of transition effects is increased.
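One fade-out cycle (S101 to S106) can be sketched on one-dimensional pixel rows as follows. The function name `fade_out_cycle` and the row model are illustrative assumptions, not the patent's implementation: the L row slides left, the R row slides right, and the vacated edge is filled from the next image.

```python
def fade_out_cycle(l_row, r_row, next_l, next_r, shift):
    """Slide the L row left and the R row right by `shift` pixels,
    filling the vacated edge areas from the next still image."""
    # L slides left: a gap opens at the right end of the L display area
    new_l = l_row[shift:] + next_l[len(l_row) - shift:]
    # R slides right: a gap opens at the left end of the R display area
    new_r = next_r[:shift] + r_row[:len(r_row) - shift]
    return new_l, new_r

l1, r1 = list("LLLLLL"), list("RRRRRR")   # currently displayed L/R rows
l2, r2 = list("llllll"), list("rrrrrr")   # next still image's L/R rows
print(fade_out_cycle(l1, r1, l2, r2, shift=2))
```

Repeating the cycle with a growing shift reproduces the widening separation (and hence growing parallax) described above.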
- FIG. 5 shows a processing flow when a fade-in command is input.
- DR1 and DL1 are the right-eye image data and the left-eye image data that are being reproduced and displayed, respectively.
- DR2 and DL2 indicate the right-eye image data and the left-eye image data to be reproduced and displayed next, respectively.
- a shift amount SL is calculated from a preset fade-in speed (S111).
- the shift amount SL here refers to the amount by which the next R image and L image enter the display screen from the right and left, respectively.
- when the entry processing of the left-eye image data (S112 and S113) is completed, the entry processing of the right-eye image data is similarly executed. That is, a free area corresponding to the shift amount SL is secured at the right end of the right image data area on the graphic memory 109 (S114). Then, the next right-eye image data DR2 is mapped to this empty area (S115). In the right image data area other than the empty area, the previous right-eye image data DR1 is held as it is.
- the image data on the graphic memory 109 is transferred to the display device 106.
- an image in which the next L image and R image enter the displayed L image and R image by several pixels from the left and right directions is displayed on the display device 106 (S116).
- the processing of S111 to S116 is repeated until the next R image and L image are entirely displayed on the display screen (S117).
- the processing flow in FIG. 5 may be changed to return from S117 to S112. Alternatively, the shift amount SL can be reset when returning from S117 to S111; by doing so, a more dynamic fade-in operation can be performed.
- this can be easily realized by expressing the relationship between the time (or the number of processing cycles) and the shift amount as a function.
- in this case, the next L image and R image enter the display screen with the parallax gradually narrowing, so that it is possible to execute a fade-in operation in which the next still image gradually pops forward while its displayed portion gradually expands.
- in the above description, the L image and the R image are slid in the left-right direction, based on the premise that horizontal parallax is set for the L image and the R image. If the parallax direction is, for example, vertical or diagonal, the slide is performed from that direction. If the L image and the R image are additionally moved in the same direction at the same time, they move while maintaining the parallax, so the stereoscopic vision itself is not affected and only the variety of transition effects is increased.
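One fade-in cycle can be sketched as the mirror of the fade-out: the next L image enters from the left edge, the next R image from the right edge, overwriting the edges of the still-displayed images. The function name `fade_in_cycle` and the one-dimensional model are illustrative assumptions.

```python
def fade_in_cycle(l_row, r_row, next_l, next_r, entered):
    """Overlay `entered` pixels of the next L/R images at the outer
    screen edges, keeping the previous image in the remaining area."""
    # next L image enters from the left edge of the screen
    new_l = next_l[:entered] + l_row[entered:]
    # next R image enters from the right edge of the screen
    new_r = r_row[:len(r_row) - entered] + next_r[len(next_r) - entered:]
    return new_l, new_r

l1, r1 = list("LLLLLL"), list("RRRRRR")   # currently displayed L/R rows
l2, r2 = list("llllll"), list("rrrrrr")   # next still image's L/R rows
print(fade_in_cycle(l1, r1, l2, r2, entered=2))
```

As `entered` grows cycle by cycle, the parallax between the incoming L and R images narrows, which is what makes the next still image appear to pop forward.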
- when the still image contains a characteristic object at its center, a smooth fade-out effect can be obtained; however, when the characteristic object is at a position shifted from the center of the still image, the object is not projected to both the right and left eyes during the fade-out, so the above-mentioned fade-out effect, that is, a display effect in which the object retracts in the depth direction, is difficult to achieve.
- FIG. 6 shows an image display example at the time of fade-out according to the present embodiment.
- in this embodiment, the left-eye image (L image) and the right-eye image (R image) are reduced at a predetermined reduction ratio at the time of fade-out, and each reduced image is slid left or right until it touches the boundary of the display screen.
- an effective fade-out operation can be realized even for a still image that does not include a characteristic object in the center.
- the feeling of retraction in the depth direction can be further increased as compared with the case where they are simply separated.
- FIG. 7 shows the processing flow during this fade-out operation.
- DR1 and DL1 indicate the right-eye image data and the left-eye image data that are being reproduced and displayed, respectively
- DR2 and DL2 indicate the right-eye image data and the left-eye image data to be reproduced and displayed next, respectively.
- the reduction scale and the positions of the L and R images are calculated from the preset fade-out speed (S201).
- the arrangement positions of the L image and the R image are positions where the left boundary and the right boundary respectively contact the boundary of the display screen as described above.
- the reduction ratio R is set as a reduction ratio for the displayed L image and R image.
- the reduced left-eye image data DL1 is mapped to the area corresponding to the L image arrangement position set in S201 within the left image data area on the graphic memory 109 (S203). Then, the next left-eye image data DL2 to be displayed is mapped to the left-eye data area remaining after the mapping (S204).
- similarly, the reduced right-eye image data DR1 is mapped to the area corresponding to the R image arrangement position set in S201 (S205). Then, the next right-eye image data DR2 to be displayed is mapped to the right-eye data area remaining after the mapping (S206).
- the image data in the graphic memory 109 is transferred to the display device 106.
- a composite image as shown in the uppermost part of FIG. 6B is displayed on the display device 106 (S207).
- the processing cycle of S201 to S207 is executed for a predetermined number of cycles (S208).
- when the predetermined number of cycles has been executed, only the next left-eye image data DL2 and right-eye image data DR2 are mapped to the left-eye data area and the right-eye data area on the graphic memory 109 (S209). Then, the image data is transferred to the display device 106, and a composite image including only the next L image and R image is displayed on the display device 106 (S210).
- the reduction ratio R may be fixed to a predetermined value; if the fade-out effect is to be executed more dynamically, the reduction ratio R may be changed in each processing cycle (S201 to S207 above).
- the arrangement positions of the reduced L image and the R image may also be set so that the boundaries of the reduced images are separated from the boundary of the display screen.
- both the reduction ratio and the shift amount may be variably set. By combining the change in the reduction ratio and the change in the shift amount, more various fade-out processes can be realized.
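The reduce-and-slide variant can be sketched as follows. Nearest-neighbour shrinking and the names (`shrink_row`, `reduce_and_slide_left`) are illustrative assumptions: each cycle the displayed row is shrunk by the reduction ratio R and parked against the screen edge, and the remaining area shows the next image.

```python
def shrink_row(row, ratio):
    """Nearest-neighbour downscale of a pixel row by `ratio` (0 < ratio <= 1)."""
    n = max(1, round(len(row) * ratio))
    return [row[int(i * len(row) / n)] for i in range(n)]

def reduce_and_slide_left(l_row, next_l, ratio):
    """Shrink the L row, park it against the left screen boundary,
    and fill the remaining area with the next image's pixels."""
    shrunk = shrink_row(l_row, ratio)
    return shrunk + next_l[len(shrunk):]

row = list("LLLLLLLL")   # currently displayed L row
nxt = list("nnnnnnnn")   # next still image's L row
print(reduce_and_slide_left(row, nxt, 0.5))
```

The mirror operation (parking the shrunk R row against the right boundary) completes one cycle; shrinking the ratio further each cycle combines the reduction effect with the sliding effect, as the text suggests.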
- the L image and the R image are slid in the left and right directions, but this is based on the premise that the L image and the R image are set to have a parallax in the horizontal direction. Therefore, if the parallax direction is, for example, the vertical or diagonal direction, the slide is performed in that direction.
- Fig. 8 shows a processing flow when a fade-in command is input.
- the display operation at the time of this fade-in is the reverse of the fade-out operation in FIG. 7 described above: the next left-eye image and right-eye image gradually expand while their left and right boundaries remain in contact with the corresponding boundaries of the display screen.
- the image size S and the layout positions of the L and R images are calculated from the preset fade-in speed (S211).
- the arrangement positions of the L image and the R image are positions where the left boundary and the right boundary respectively contact the boundary of the display screen as described above.
- the size of the arrangement area of the L image and the R image is set according to the size of the image size S.
- the next left-eye image data DL2 and right-eye image data DR2 are processed to generate left-eye image data DL2 and right-eye image data DR2 of the image size S (S212).
- the left-eye image data DL2 of the image size S is mapped to the area in the left image data area on the graphic memory 109 that corresponds to the L image arrangement position set in S211 (S213).
- the left-eye image data DL1 is held in the left-eye data area other than the area used at the time of the mapping.
- when the processing for arranging the image data for the left eye is completed as described above, the processing for arranging the image data for the right eye is similarly executed. That is, the right-eye image data DR2 having the image size S is mapped to the area corresponding to the arrangement position of the R image set in S211 in the right image data area on the graphic memory 109 (S214). It should be noted that the right-eye image data DR1 is held as it is in the right image data area other than the area used for the mapping.
- the image data on the graphic memory 109 is transferred to the display device 106.
- a composite image in which the next L image and R image are reduced to a predetermined size and incorporated at the screen boundary positions is displayed on the display device 106 (S215).
- the processing cycle from S211 to S215 is repeated until only the next L image and R image are displayed on the display screen (S216).
- the image size S set in each processing cycle is enlarged by a predetermined ratio from the image size S one cycle before.
- the arrangement area of the L image and the R image is enlarged accordingly, compared to one cycle before.
- the parallax between the L image and the R image gradually decreases while the L image and the R image gradually enlarge. Compared to the case where the parallax between the images is simply reduced, the feeling of the still image jumping forward can be further enhanced. In addition, since the L image and the R image do not protrude from the screen, an effective fade-in operation can be realized even if a characteristic object is not included in the center of the still image.
- both the enlargement ratio and the shift amount may be variably set. More various fade-in processes can be realized by combining the change in the enlargement ratio and the change in the shift amount.
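- The fade-in enlargement described above can be sketched in the same spirit (a hedged illustration; the fixed per-cycle enlargement ratio and the starting size are assumed values, not those of the embodiment):

```python
def fadein_sizes(initial=0.1, ratio=1.5, full=1.0):
    """Yield the image size S for successive fade-in cycles.

    Each cycle enlarges S by a fixed ratio relative to the previous
    cycle; the fade-in ends when the next L/R images fill the display.
    """
    s = initial
    while s < full:
        yield s
        s *= ratio
    yield full  # final cycle: images at full display size

sizes = list(fadein_sizes())
# sizes grows monotonically and ends at the full display size
```

As the text notes, the L/R arrangement areas (and hence the parallax) would be updated together with S in each cycle.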
- in the above description, the left and right images are slid in the left and right directions. This is based on the premise that parallax in the horizontal direction is set for the L and R images. Therefore, if the parallax direction is, for example, vertical or diagonal, the images are slid in that direction. If the L image and the R image are moved simultaneously in the same direction, they move while maintaining the parallax, so that the stereoscopic vision itself is not affected and only the variety of transition effects is increased.
- fade-out and fade-in processing unique to 3D stereoscopic display is realized by gradually changing the display position of L image and R image, and the reduction ratio and enlargement ratio.
- processing methods that have conventionally been used for fade-out and fade-in in the field of 2D display, such as gradually darkening or brightening the image, or gradually displaying the image, may also be used in combination with the above.
- the interocular distance is set in advance, and the shift amount in each processing cycle is calculated so that the cumulative shift over the entire processing does not exceed the interocular distance; if it would be exceeded, this is realized by a method such as setting the shift amount to zero at that point.
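- The interocular-distance limit described above might be enforced as follows (a sketch; the pixel values in the example are hypothetical):

```python
def clamp_shift(total_shift, step, interocular_px):
    """Clamp a per-cycle slide step so the cumulative shift of the
    L/R images never exceeds the interocular distance.

    Returns the step actually applied this cycle; once the limit is
    reached, the applied step becomes zero, as suggested in the text.
    """
    if total_shift + step > interocular_px:
        return max(0, interocular_px - total_shift)
    return step

# Example with a 60-pixel interocular distance and 25-pixel steps.
applied = []
total = 0
for _ in range(4):
    s = clamp_shift(total, 25, 60)
    applied.append(s)
    total += s
# applied == [25, 25, 10, 0]; the cumulative total never exceeds 60
```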
- in the above embodiment, the present invention is applied to a so-called binocular image display device, but the present invention is also applicable to an image display device having a larger number of photographing viewpoints.
- FIG. 9 shows an example of image display when the invention relating to the fade-out processing of FIG. 3 is applied to a four-eye image display device.
- FIG. 9(a) shows the image display state of each viewpoint before the fade-out command is input.
- FIG. 9(b) shows the image display state of each viewpoint when the first processing cycle is executed after the fade-out command is input.
- FIG. 9(c) shows the image display state of each viewpoint when the second processing cycle is executed after the fade-out command is input.
- the slide amount (S1, S2, S3, S4) per cycle for each viewpoint is set as follows.
- the images at viewpoint 1 and viewpoint 4 disappear from the display screen before the images at viewpoint 2 and viewpoint 3. Therefore, if the viewer views the display screen from viewpoint 1 and viewpoint 2, for example, the image at viewpoint 1 disappears first, and an effective fade-out operation cannot be realized thereafter. Therefore, when the image of viewpoint 1 disappears, the image of viewpoint 2 is also made to disappear, and only the next images of viewpoint 1 and viewpoint 2 are displayed on the display screen. The same applies to the viewpoint 3 and viewpoint 4 images.
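- One possible assignment of per-viewpoint slide amounts consistent with the behavior described above is the following (the concrete values are purely illustrative, not the values of the embodiment):

```python
# Hypothetical per-cycle slide amounts for a four-eye display: the
# outer viewpoints (1 and 4) slide faster than the inner ones (2 and 3),
# so their images leave the display screen first, as described above.
SLIDE = {1: -2, 2: -1, 3: +1, 4: +2}  # pixels per cycle; sign = direction

def positions_after(cycle):
    """Horizontal offset of each viewpoint image after `cycle` cycles,
    starting from a centered position 0."""
    return {v: s * cycle for v, s in SLIDE.items()}

p = positions_after(10)
# viewpoints 1 and 4 have moved twice as far as viewpoints 2 and 3
```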
- at the time of fade-in, an image slide reverse to that at the time of the fade-out operation in Fig. 9 may be executed. Note that, as described above, the images at viewpoint 1 and viewpoint 4 disappear from the display screen before the images at viewpoint 2 and viewpoint 3 during fade-out; conversely, during fade-in, the images of viewpoint 2 and viewpoint 3 enter the display screen before the images of viewpoint 1 and viewpoint 4.
- when the image slide is to be combined with image reduction, the image size in each processing cycle is gradually reduced. At this time, in each processing cycle, the image size is set so that, for example, the boundaries of the images of viewpoint 1 and viewpoint 4 are in contact with the boundary of the display screen. In this case, since the slide of the images at viewpoints 2 and 3 is delayed relative to the images at viewpoints 1 and 4, their boundaries will always be away from the boundary of the display screen.
- at the time of fade-in, an image slide process that is the reverse of that at the time of the fade-out operation may be performed. That is, the image of each viewpoint (the image to be faded in) may be slid in the direction opposite to the above and gradually enlarged while entering the display screen.
- the slide processing of these four viewpoint images is performed, as in the two-viewpoint embodiment, by processing the display image data in the image processing unit 104 and by the mapping processing on the graphic memory 109.
- the storage unit 107 stores image data of each viewpoint. Then, the image data of each viewpoint is mapped to the viewpoint data area on the graphic memory 109 as it is or reduced to a predetermined size while being shifted by a predetermined amount. Thereby, the slide processing of the four viewpoint images is performed.
- the image displayed after the fade-out is the next still image constituting the image file, but it is of course possible to use another background image.
- the three-dimensional stereoscopic image display device can also be realized by adding the function shown in Fig. 1 to a personal computer or the like.
- a program for performing the functions shown in FIG. 1 is downloaded to a personal computer via a disk or via the Internet.
- the present invention can also be embodied as a program for providing such functions.
- FIG. 10 shows a configuration of an image display device according to another embodiment.
- the raw image data is CG (Computer Graphics) data
- the CG data is traced from a preset viewpoint to generate three-dimensional image data.
- the image display device includes an input device 201, a command input unit 202, a control unit 203, a format analysis unit 204, a transition effect control unit 205, a composite image generation unit 206, a display control unit 207, a display device 208, a storage unit 209, a development memory 210, and a graphic memory 211.
- the input device 201 includes input means such as a mouse and a keyboard, and is used for inputting commands such as "editing of the configuration of a reproduced image", a reproduction command, an image feed command, and a fade-in / fade-out command.
- the command input unit 202 sends various commands input from the input device 201 to the control unit 203.
- the control unit 203 controls each unit according to the input command transferred from the command input unit 202.
- the format analysis unit 204 analyzes the CG data of the playback target image, and identifies the number of objects included in the image, the arrangement position of each object, the context between the objects, and the like. Then, the identification result is sent to the transition effect control unit 205 and the composite image generation unit 206. The details of the processing in the format analysis unit 204 will be described later.
- the transition effect control unit 205 controls the execution of the transition effect process in response to the input of a fade-in / fade-out command from the input device 201.
- the details of the processing in the transition effect control unit 205 will be described later.
- the composite image generation unit 206 generates the left-eye image data and the right-eye image data from the CG data developed in the development memory 210, and maps them to the graphic memory 211. Further, when a transition effect command is input from the transition effect control unit 205, image data for the left eye and image data for the right eye to which the transition effect has been applied are generated and mapped to the graphic memory 211. The details of the processing in the composite image generation unit 206 will be described later.
- the display control unit 207 sends the image data stored in the graphic memory 211 to the display device 208 in response to a command from the control unit 203.
- the display device 208 reproduces the image data received from the display control unit 207 on a display screen.
- the storage unit 209 is a database that stores a plurality of image files, and each image file stores a predetermined number of still image data.
- each still image data is CG data in the present embodiment.
- the development memory 210 is configured by a RAM (Random Access Memory), and the still image data read from the storage unit 209 is developed thereon.
- the graphic memory 211 is configured by a RAM, and sequentially stores the image data for three-dimensional stereoscopic display generated by the composite image generating unit 206.
- with reference to FIG. 11, a method for defining an object using CG data and a process for arranging each object in a three-dimensional space will be described.
- the figure shows the processing principle when arranging three objects A to C in a three-dimensional space.
- the objects A to C are individually defined by a contour on a three-dimensional coordinate axis as shown in the upper part of FIG. 11 and an attribute (pattern, color, etc.) of the contour surface.
- Each object is placed in the three-dimensional space by positioning the origin of the coordinate axis of each object on the coordinate axis that defines the three-dimensional space as shown in the lower part of the figure.
- the format analysis unit 204 analyzes and discriminates the context (front-to-back relationship) of each object when the three-dimensional space is viewed from a predetermined stereoscopic viewpoint, using the CG data that defines the contour of each object. Then, information on the context is sent to the transition effect control unit 205 and the composite image generation unit 206 together with information on the number of objects included in the image and the arrangement position of each object.
- the composite image generation unit 206 traces the three-dimensional space from the viewpoint for the left eye (L) and the viewpoint for the right eye (R) as shown in the figure, and generates data for a left-eye image (L image) and data for a right-eye image (R image). Then, the left-eye image (L image) and the right-eye image (R image) are arranged on the screen as shown, for example, in the partially enlarged view in the upper center of the figure.
- the left-eye image data (L image data) and the right-eye image data (R image data) arranged in this way are mapped on the graphic memory 211.
- R indicates the display area (pixels) of the right-eye image on the screen
- L indicates the display area (pixels) of the left-eye image on the screen.
- the assignment of the display area is determined according to the configuration of the three-dimensional filter. That is, when viewing the display image through the three-dimensional filter, the display area of the R image and the L image is set so that the R image is projected to the viewer's right eye and the L image is projected to the viewer's left eye. (Pixels) are assigned.
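- As one concrete example of such a display-area assignment, a column-interleaved filter alternates L and R pixel columns (an assumed arrangement for illustration; the actual assignment depends on the construction of the three-dimensional filter):

```python
def interleave_lr(l_cols, r_cols):
    """Build one scanline of the composite image by assigning alternate
    pixel columns to the L image and the R image, matching a
    column-interleaved 3D filter.
    """
    assert len(l_cols) == len(r_cols)
    out = []
    for x in range(len(l_cols)):
        # even columns carry the L image, odd columns the R image
        out.append(l_cols[x] if x % 2 == 0 else r_cols[x])
    return out

line = interleave_lr(["L"] * 6, ["R"] * 6)
# line == ["L", "R", "L", "R", "L", "R"]
```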
- the composite image generation unit 206 performs a transparency process on the fade-in or fade-out target object specified by the transition effect control unit 205, and performs left-eye image data and right-eye image data.
- FIG. 13 shows a process of generating left-eye image data.
- FIG. 13(a) shows a state in which no transmittance is set for the sphere object.
- FIG. 13(b) shows a state in which the sphere object is set to be semi-transparent.
- FIG. 13(c) shows a state in which the sphere object is set to be fully transparent.
- when the sphere object is semi-transparent, the sphere and the background thereof are traced according to the transmittance of the sphere object to generate L image data.
- for example, if the transmittance of the sphere is set to 30%, 70% of the pixels in the sphere's area (thinned out evenly) take the image data obtained by tracing the sphere, and the remaining 30% take the image data obtained by tracing the object behind the sphere.
- if no object is located behind the sphere, the image data of the background image is used. When the sphere object is fully transparent, the L image data is generated by tracing only what lies behind the sphere object.
- the R image data is generated by tracing the sphere and the background thereof in accordance with the transmittance of the sphere object. Also, when transmittance is set for other objects, L image data and R image data are generated by the same processing.
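- The thinning-based tracing described above can be sketched as follows (a deliberately simplified model: real tracing works per ray, and the even-thinning pattern used here is only illustrative):

```python
def trace_with_transmittance(obj_px, bg_px, transmittance):
    """Mix object and background pixels by even thinning.

    With transmittance t, roughly a fraction t of the pixels in the
    object's area show what lies behind it, and the rest show the
    object itself, approximating the tracing described above.
    """
    keep_every = round(1 / transmittance) if transmittance > 0 else 0
    out = []
    for i in range(len(obj_px)):
        if keep_every and i % keep_every == 0:
            out.append(bg_px[i])   # this pixel is "transparent"
        else:
            out.append(obj_px[i])  # this pixel shows the object
    return out

mixed = trace_with_transmittance(["S"] * 10, ["B"] * 10, 0.3)
# indices 0, 3, 6, 9 show the background ("B") behind the sphere ("S")
```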
- the first still image data (CG data) among the still image data constituting the file is read out and developed in the development memory 210. Thereafter, the composite image generation unit 206 generates the right-eye image data and the left-eye image data from the read image data as described above. Then, the generated right-eye image data and left-eye image data are mapped on the graphic memory 211.
- the image data mapped on the graphic memory 211 is sent to the display device 208 by the display control unit 207 and reproduced on the display screen. Thereafter, when a still image transmission command is input from the input device 201, the next still image data (CG data) constituting the file is expanded on the expansion memory 210, and the same processing as described above is performed. Be executed. Similarly, each time a feed command is input, the next still image data is developed on the development memory 210, and the above processing is executed. As a result, the still images constituting the file are sequentially displayed on the display device 208.
- FIG. 14 shows a processing flow at the time of this fade-out processing.
- the transition effect control unit 205 extracts the objects existing on the screen and the context of each object as viewed from the L viewpoint and the R viewpoint, based on the analysis result received from the format analysis unit 204 (S301).
- the object located at the foremost position is set as a deletion target (S302). It should be noted that objects other than the object at the foremost position can also be set as objects to be deleted.
- the transition effect control unit 205 sets the transmittance of the object to be erased (S303), and sends the transmittance and the identification information of the object to be erased to the composite image generating unit 206.
- based on the transmittance and the identification information of the object to be erased, the composite image generation unit 206 traces the three-dimensional space from the L viewpoint and generates L image data (S304). At this time, objects that have already been completely erased are traced as fully transparent. Then, the generated L image data is mapped to the L image data area on the graphic memory 211 (S305).
- the three-dimensional space is traced from the R viewpoint to generate R image data (S306), and this is mapped to the R image data area on the graphic memory 211 (S307).
- when the mapping process on the graphic memory 211 is completed, the image data on the graphic memory 211 is transferred to the display device 208, whereby the composite image obtained by combining the L viewpoint image and the R viewpoint image is reproduced and displayed on the display device 208 (S308). Thereafter, it is determined whether or not the object to be erased has been completely erased (transmittance 100%). If it has not been completely erased, the process returns to S303, the transmittance is increased by one step, and the above processing is repeated.
- the processing from S303 to S308 is repeatedly executed until the object to be erased is completely erased (S309).
- when the object to be erased is completely erased, it is determined whether or not all the objects on the screen have been erased (S310). If NO, the process returns to S302, and a new object is set as the object to be erased.
- the object to be erased is, for example, the object located closest to the foreground when viewed from the L and R viewpoints, among the objects remaining on the screen. Then, when all the objects on the screen have been erased, the fade-out processing ends (S310).
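- The overall S302-S310 loop can be summarized as follows (a sketch; the 25% transmittance step and the object names are assumed values for illustration):

```python
def fade_out(objects, step=25):
    """Erase objects front-to-back by raising each one's transmittance
    in steps until it reaches 100%, mirroring the S302-S310 loop.

    `objects` is ordered front to back; returns the sequence of
    (object, transmittance) states that would be traced and displayed.
    """
    frames = []
    for obj in objects:            # frontmost remaining object first (S302)
        t = 0
        while t < 100:             # S303-S308 cycle
            t = min(100, t + step)
            frames.append((obj, t))  # trace + display at this step
    return frames

frames = fade_out(["front", "middle", "back"])
# each object passes through 25, 50, 75, 100 before the next one starts
```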
- FIG. 15 shows a processing flow at the time of this fade-in processing.
- the transition effect control unit 205 extracts the objects to appear on the screen and the context of each object as viewed from the L viewpoint and the R viewpoint (S321). Then, the object located at the innermost position is set as an appearance target (S322). It is also possible to set an object other than the object located at the innermost position as an appearance target. After that, the transition effect control unit 205 sets the transmittance of the appearance target object (S323), and sends the transmittance and the identification information of the appearance target object to the composite image generation unit 206.
- the composite image generation unit 206 traces the three-dimensional space from the L viewpoint based on the transmittance and the identification information of the appearance target object as described above, and generates L image data (S324). Then, the generated L image data is mapped to the L image data area on the graphic memory 211 (S325). Similarly, the three-dimensional space is traced from the R viewpoint to generate R image data (S326), and this is mapped to the R image data area on the graphic memory 211 (S327).
- when the mapping process on the graphic memory 211 is completed, the image data on the graphic memory 211 is transferred to the display device 208, whereby the composite image obtained by combining the L viewpoint image and the R viewpoint image is reproduced and displayed on the display device 208 (S328). Thereafter, it is determined whether the appearance target object has completely appeared (transmittance 0%); if not, the process returns to S323, the transmittance is reduced by one step, and the above processing is repeated.
- the processing from S323 to S328 is repeatedly executed until the appearance target object completely appears (S329). Then, when the appearance target object completely appears, it is determined whether or not all objects in the screen have appeared (S330). If NO, the process returns to S322, and a new object is set as the appearance target.
- the object to appear is, for example, the object located at the innermost position when viewed from the L viewpoint and the R viewpoint, among objects not displayed on the screen. When all objects appear on the screen, the fade-in process ends (S330).
- according to the above processing, the objects are erased or made to appear in order while the objects in each state remain stereoscopically viewable, so that a realistic fade-out / fade-in operation can be realized.
- in the above embodiment, the object is erased or made to appear by thinning out the display pixels; however, the color of the object to be erased or made to appear may instead be made lighter or darker in accordance with the transmittance.
- FIG. 16 shows a configuration of an image display device according to another embodiment.
- in this embodiment, the raw image data is MPEG data, and it is assumed that a background image and objects to be incorporated into the background image are prepared in advance for each stereoscopic viewpoint and stored in the storage unit.
- the image display device includes an input device 201, a command input unit 202, a control unit 203, a decode processing unit 221, a transition effect control unit 222, a composite image generation unit 223, a display control unit 207, and a display device. 208, a storage unit 224, a development memory 210, and a graphic memory 211.
- the configuration other than the decode processing unit 221, the transition effect control unit 222, the composite image generation unit 223, and the storage unit 224 is the same as the configuration in the above embodiment (see FIG. 10).
- the decode processing unit 221 decodes the MPEG data of the image to be reproduced, and expands the decoded image data in the development memory 210. In addition, it extracts the number of objects included in the image, the contour and arrangement position of each object, the front-back relationship between the objects, and the like, and sends the extraction result to the transition effect control unit 222 and the composite image generation unit 223. The details of the processing in the decode processing unit 221 will be described later.
- the transition effect control unit 222 controls the execution of the transition effect process in response to the input of the fade-in / fade-out command from the input device 201. The details of the processing in the transition effect control unit 222 will be described later.
- the composite image generation unit 223 generates left-eye image data and right-eye image data from the MPEG data expanded in the expansion memory 210, and maps these to the graphic memory 211. Further, when a transition effect command is input from the transition effect control unit 222, the image data for the left eye and the image data for the right eye to which the transition effect has been applied are generated and are mapped to the graphic memory 211. The details of the process in the composite image generation unit 223 will be described later.
- the storage unit 224 is a database that stores a plurality of image files, and each image file stores a predetermined number of still image data.
- each still image data is MPEG data, and is composed of image data for an L viewpoint and image data for an R viewpoint.
- the image data for the L viewpoint and the image data for the R viewpoint each include a background and data (described later) relating to an object incorporated thereon.
- FIG. 17 shows the processing when three objects A to C are combined.
- an area slightly wider than the object (hereinafter referred to as an "object area") is set for each of the objects A to C as shown in the figure. Normally, the portion of the object area excluding the object itself is made transparent. That is, control information for making the portion of the object area excluding the object transparent is attached to each object.
- the decode processing unit 221 decodes the image data for the L viewpoint and the R viewpoint read from the storage unit 224, obtains the background image data and the object image data for each viewpoint, and develops them in the development memory 210. At the same time, it extracts the contour information, attribute information, arrangement information, front-back order information, and the like, and sends them to the transition effect control unit 222 and the composite image generation unit 223.
- the composite image generation unit 223 synthesizes the background image of each viewpoint with the objects based on the contour information, attribute information, arrangement information, and order information received from the decode processing unit 221, and generates left-eye image data (L image data) and right-eye image data (R image data). Then, as in the first embodiment, the left-eye image (L image) and the right-eye image (R image) are arranged on the screen, and the L image data and the R image data are mapped on the graphic memory 211.
- further, when a transmittance is specified by the transition effect control unit 222, the composite image generation unit 223 performs the mapping in accordance with that transmittance.
- FIG. 18 shows a mapping process of the L image data and the R image data. Note that the figure illustrates an example in which a transmittance of 50% is set for the object B.
- the degree of overlapping of the contours is detected based on the contour information, arrangement information, and information on the order of the front and rear of the objects A and B extracted by the decoding processing unit 221.
- in this example, the object B is located in front.
- since the area of the object area of the object B outside its contour is set to be transparent, the image data of the object A behind it is preferentially mapped to the graphic memory 211 in that area. If the object A is not located in this area outside the contour, the image data of the background image is mapped to the graphic memory 211.
- within the contour of the object B, the image data of the object B is mapped on the graphic memory 211 at a ratio of one pixel per two pixels.
- the image data of the object A behind it is mapped to the remaining pixels.
- the pixels to which the image data of the object B are assigned are set according to the transmittance of the object B. For example, if the transmittance of the object B is changed from 50% to 80%, the number of pixels for allocating the image data of the object B is changed to the ratio of one pixel to five pixels.
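- The relation between transmittance and pixel-allocation ratio quoted above can be computed as follows (a sketch that reproduces the two quoted ratios; the function name is hypothetical):

```python
def allocation_period(transmittance_pct):
    """Map an object's transmittance to a pixel-allocation period.

    50% transmittance -> the object gets 1 pixel in 2; 80% -> 1 in 5.
    In other words, the opaque fraction (100 - t)% of the pixels
    carries the object's image data.
    """
    opaque = 100 - transmittance_pct
    if opaque <= 0:
        return 0  # fully transparent: no pixels allocated to the object
    return round(100 / opaque)

# the two ratios quoted in the text:
r50 = allocation_period(50)  # 1 pixel per 2
r80 = allocation_period(80)  # 1 pixel per 5
```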
- the first still image data among the still image data constituting the relevant file (the MPEG data for the L viewpoint and the R viewpoint) is read out and decoded.
- Image data (background image and object) for the L viewpoint and the R viewpoint obtained by this decoding are developed on the development memory 210.
- the contour information, attribute information, arrangement information, and front-back order information of each object extracted at the time of the decoding are sent to the transition effect control unit 222 and the composite image generation unit 223.
- the composite image generation unit 223 combines the background image data and the object image data for each of the L viewpoint and the R viewpoint based on the contour information, the attribute information, the arrangement information, and the order information, and generates L image data and R image data. Then, the generated L image data and R image data are mapped on the graphic memory 211.
- the image data mapped on the graphic memory 211 is sent to the display device 208 by the display control unit 207, and reproduced on the display screen.
- FIG. 19 shows a processing flow at the time of this fade-out processing.
- the transition effect control unit 222 extracts the objects existing on the screen and the context of each object based on the extraction information received from the decode processing unit 221 (S401). Then, the object located at the foremost position is set as a deletion target (S402). It should be noted that objects other than the object at the foremost position can also be set as objects to be deleted.
- the transition effect control unit 222 sets the transmittance of the object to be erased (S 403), and sends the transmittance and the identification information of the object to be erased to the composite image generating unit 223.
- the composite image generation unit 223 generates image data for L as described above based on the transmittance and the identification information of the object to be deleted (S404). Then, the generated L image data is mapped to the L image data area on the graphic memory 211 (S405). Similarly, R image data is generated (S406), and this is mapped to the R image data area on the graphic memory 211 (S407).
- when the mapping process on the graphic memory 211 is completed, the image data on the graphic memory 211 is transferred to the display device 208, whereby the combined image obtained by combining the L viewpoint image and the R viewpoint image is reproduced and displayed on the display device 208 (S408). Thereafter, it is determined whether the object to be erased has been completely erased (transmittance 100%). If it has not been completely erased, the process returns to S403, the transmittance is increased by one step, and the above processing is repeated.
- the processing from S403 to S408 is repeatedly executed until the object to be erased is completely erased (S409).
- when the object to be erased is completely erased, it is determined whether all the objects on the screen have been erased (S410). If NO, the process returns to S402, and a new object is set as an object to be erased.
- the object to be deleted is, for example, the object at the foremost position among the objects remaining on the screen. Then, when all objects on the screen have been erased, the fade-out processing ends (S410).
- FIG. 20 shows a processing flow at the time of this fade-in processing.
- the transition effect control unit 222 extracts the objects to be incorporated in the screen and the context of each object based on the extraction information received from the decode processing unit 221 (S421). Then, the object located at the innermost position is set as an appearance target (S422). Note that objects other than the object at the innermost position can also be set as appearance targets.
- the transition effect control unit 222 sets the transmittance of the appearance target object (S423), and sends the transmittance and the identification information of the appearance target object to the composite image generation unit 223.
- the composite image generation unit 223 generates L image data as described above based on the transmittance and the identification information of the appearance target object (S424). At this time, all objects that have not yet appeared are treated as fully transparent. Then, the generated L image data is mapped to the L image data area on the graphic memory 211 (S425). Similarly, R image data is generated (S426), and this is mapped to the R image data area on the graphic memory 211 (S427).
- Next, the image data on the graphic memory 211 is transferred to the display device 208, whereby the combined image obtained by combining the L viewpoint image and the R viewpoint image is reproduced and displayed on the display device 208 (S428). Thereafter, it is determined whether or not the appearance target object has completely appeared (transmittance 0%). If not, the process returns to S423, the transmittance is reduced by one step, and the above processing is repeated.
- The processing of S423 to S428 is repeatedly executed until the appearance target object completely appears (S429). When the appearance target object completely appears, it is determined whether or not all the objects have appeared on the screen (S430). If NO, the process returns to S422, and a new object is set as the appearance target.
- Here, the new object to appear is, for example, the object at the innermost position among the objects not yet displayed on the screen. When all the objects have appeared on the screen, the fade-in processing ends (S430).
- In this way, the objects in each state can be stereoscopically viewed while the objects are erased or made to appear in order, so that a realistic fade-out/fade-in operation can be realized.
- Alternatively, the object may be erased or made to appear by thinning out its display pixels.
- Also, the color of the object to be erased or made to appear may be made lighter or darker according to the transmittance.
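Both variants mentioned here, blending the object's color toward the background according to the transmittance and thinning out display pixels, can be sketched roughly as follows. The blending formula and the dither pattern are assumptions for illustration only:

```python
def blend(obj_rgb, bg_rgb, transmittance):
    """Blend an object pixel toward the background according to the
    transmittance (0 = fully visible, 100 = fully erased)."""
    a = transmittance / 100.0
    return tuple(round((1 - a) * o + a * b) for o, b in zip(obj_rgb, bg_rgb))

def thin_out(width, height, transmittance):
    """Alternative: decide per pixel whether the object is drawn at all,
    skipping a fraction of pixels proportional to the transmittance
    (deterministic dither pattern, assumed for illustration)."""
    keep = []
    for y in range(height):
        for x in range(width):
            keep.append(((x + y * 3) % 100) >= transmittance)
    return keep
```

At transmittance 0 every pixel is drawn with the object's own color; at 100 the object's pixels are entirely replaced by the background.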
- In the above embodiment, the present invention is applied to a so-called binocular image display device, but the present invention is also applicable to an image display device having a larger number of photographing viewpoints.
- In this case, the above processing is executed with an increased number of viewpoints.
- Also in this case, MPEG data corresponding to the number of viewpoints is prepared in advance for each still image and stored in the storage unit 224.
- Further, the three-dimensional stereoscopic image display device according to the present embodiment can also be realized by adding the functions of the configuration examples in each embodiment to a personal computer or the like.
- In this case, a program for executing the function of each configuration example is installed on the personal computer from a disk or downloaded via the Internet.
- That is, the present invention can be abstracted as a program for providing such a function.
- FIG. 21 shows a configuration of an image display device according to this embodiment.
- In this embodiment, the source image data is assumed to be two-dimensional image data, and three-dimensional image data is generated from the two-dimensional image data.
- The image display device includes an input device 301, a command input unit 302, a control unit 303, a transition effect control unit 304, a display plane generation unit 305, a parallax image generation unit 306, a display control unit 307, a display device 308, a storage unit 309, an expansion memory 310, and a graphic memory 311.
- the input device 301 includes input means such as a mouse and a keyboard, and is used for inputting commands such as "editing of the configuration of a reproduced image", a reproduction command, an image feed command, and a fade-in / fade-out command.
- the command input unit 302 sends various commands input from the input device 301 to the control unit 303.
- the control unit 303 controls each unit according to the input command transferred from the command input unit 302.
- the transition effect control unit 304 controls the execution of the display plane rotation process in response to the input of the fade-in / fade-out command from the input device 301.
- the display plane creation unit 305 obtains a geometric figure of the display plane when viewed from the left eye viewpoint and the right eye viewpoint according to the rotation angle input from the transition effect control unit 304. The processing in the display plane creation unit 305 will be described later in detail.
- The parallax image generation unit 306 generates left-eye image data and right-eye image data from the two-dimensional image data expanded in the expansion memory 310, and maps them to the graphic memory 311.
- At the time of fade-in/fade-out, the left-eye image data and the right-eye image data are compressed (non-linearly or linearly) so that the left-eye image and the right-eye image fit in the left-eye geometric figure and the right-eye geometric figure input from the display plane creation unit 305, and both compressed image data are mapped to the graphic memory 311. The processing at the time of fade-in/fade-out and the transition effect will be described later in detail.
- the display control unit 307 sends the image data stored in the graphic memory 311 to the display device 308 in response to a command from the control unit 303.
- the display device 308 reproduces the image data received from the display control unit 307 on a display screen.
- the storage unit 309 is a database that stores a plurality of image files, and each image file stores a predetermined number of still image data.
- each still image data is data for displaying a two-dimensional image in the present embodiment.
- the expansion memory 310 is configured by a RAM (Random Access Memory), and is used when temporarily storing still image data read out from the storage unit 309.
- the graphic memory 311 is configured by a RAM, and sequentially stores image data for three-dimensional stereoscopic display generated by the parallax image generation unit 306.
- When an image reproduction command for a predetermined file is input to the image display device, the first still image data among the still image data constituting the file is read out and expanded on the expansion memory 310. Thereafter, the parallax image generation unit 306 generates right-eye image data and left-eye image data from the read image data, and the right-eye image data and left-eye image data are mapped on the graphic memory 311 so that the right-eye image (R image) and the left-eye image (L image) are arranged on the screen as shown in FIG. 22.
- R indicates a display area (pixel) of the right-eye image on the screen
- L indicates a display area (pixel) of the left-eye image on the screen.
- The assignment of the display areas is determined according to the configuration of the three-dimensional filter. That is, the display areas (pixels) of the R and L images are allocated so that, when the displayed image is viewed through the three-dimensional filter, the R image is projected to the viewer's right eye and the L image is projected to the viewer's left eye.
- the image data mapped on the graphic memory 311 is sent to the display device 308 by the display control unit 307, and is reproduced on the display screen.
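The pixel allocation described above can be sketched as a column interleave of the two eye images. The assignment of L to even columns and R to odd columns is an assumption for illustration, since the actual allocation depends on the configuration of the three-dimensional filter:

```python
def interleave_columns(l_img, r_img):
    """Interleave left- and right-eye images column by column, matching a
    vertical-stripe 3D filter (L in even columns, R in odd columns is an
    assumed allocation)."""
    assert len(l_img) == len(r_img)
    out = []
    for l_row, r_row in zip(l_img, r_img):
        row = [l_row[x] if x % 2 == 0 else r_row[x] for x in range(len(l_row))]
        out.append(row)
    return out

combined = interleave_columns([["L"] * 4], [["R"] * 4])  # [['L','R','L','R']]
```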
- In the geometric figure generation process, the left-eye viewpoint L and the right-eye viewpoint R are assumed to be located in front of the display screen at a predetermined distance, as shown in FIG. 23(a). From this state, the display screen is sequentially rotated by the angle θ as shown in FIGS. 23(b) and (c), and the geometric figure of the display plane as seen from the left-eye viewpoint L and the right-eye viewpoint R in each rotation state is obtained by calculation.
- the L image plane and the R image plane shown in Fig. 23 schematically show the shapes of geometric figures when the display plane is viewed from the left-eye viewpoint L and the right-eye viewpoint R, respectively. .
- The L image plane and the R image plane have different shapes due to the parallax between the left-eye viewpoint L and the right-eye viewpoint R. Therefore, by applying the L image and the R image to the L image plane and the R image plane, respectively, a parallax is generated between the image projected to the left eye and the image projected to the right eye, and the image in the rotating state is viewed stereoscopically.
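The parallax between the two geometric figures can be illustrated with a simple pinhole projection of the rotating plane onto the screen. All coordinates, distances, and sizes below are illustrative assumptions, not values from this description:

```python
import math

def project_edges(theta_deg, eye_x, eye_dist=60.0, half_w=20.0, screen_z=0.0):
    """Project the two vertical edges of a display plane rotated by theta_deg
    around its vertical centre axis onto the screen plane z = screen_z, as
    seen from an eye at (eye_x, 0, -eye_dist)."""
    theta = math.radians(theta_deg)
    edges = []
    for s in (-1.0, 1.0):  # left and right edge of the plane
        ex = s * half_w * math.cos(theta)   # edge x after rotation
        ez = s * half_w * math.sin(theta)   # edge depth after rotation
        # intersect the ray eye -> edge with the screen plane
        t = (screen_z + eye_dist) / (ez + eye_dist)
        edges.append(eye_x + (ex - eye_x) * t)
    return edges

# At 0 degrees both eyes see the same undistorted rectangle ...
left_eye = project_edges(0.0, eye_x=-3.0)
right_eye = project_edges(0.0, eye_x=3.0)
# ... while at 30 degrees the two projections differ: this difference is
# the parallax that makes the rotating plane appear three-dimensional.
l30 = project_edges(30.0, eye_x=-3.0)
r30 = project_edges(30.0, eye_x=3.0)
```

Applying the full image to each eye's projected quadrilateral (rather than just its edges) yields the L image plane and R image plane of FIG. 23.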
- the original image data (two-dimensional image data) is compressed by 1/2 in the horizontal direction to generate L image data and R image data.
- Then, the images of the generated L image data and R image data are compressed (or expanded) in the vertical direction and the horizontal direction so as to fit in the L image plane and the R image plane generated by the display plane creation unit 305.
- the compressed L image data and R image data are mapped to corresponding positions of the L image data area and the R image data area on the graphic memory 311.
- The upper part of FIG. 24 schematically shows the original image data compressed horizontally by 1/2, and the lower part of FIG. 24 schematically shows a state in which the L image data and the R image data generated in this manner are mapped on the graphic memory 311 so as to fit in the L image plane and the R image plane, respectively.
- the L image plane and the R image plane are set to be maximum on the display plane.
- the maximum vertical length is made equal to the vertical length of the image display area (the fourth figure from the left in FIG. 24). Note that a portion whose vertical length would protrude beyond the image display area cannot actually be displayed, so it may be cut off.
- the magnification of the L image plane and the R image plane with respect to the original size (the size of the L image plane and the R image plane calculated according to Fig. 23) is the same. That is, when creating the L image plane and the R image plane in FIG. 24, the correspondence between the sizes of the L image plane and the R image plane in FIG. 23 is maintained.
- the plane whose vertical length is the largest as a whole is made equal in vertical length to the image display area.
- the planes for the L image and the plane for the R image are set so that the centers (rotation axes) in the vertical and horizontal directions coincide with each other.
- Background image data (for example, single-color data) is mapped to the free space generated on the graphic memory 311.
- FIG. 25 shows the processing flow when a fade-in/fade-out command is input.
- When a fade-in/fade-out command is input, first, the two-dimensional image data of the still image currently being reproduced is compressed by 1/2 in the horizontal direction to generate the L image data and the R image data, which are expanded on the expansion memory 310 (S501). Also, the two-dimensional image data of the next still image is read from the storage unit 309, compressed by 1/2 in the horizontal direction to generate L image data and R image data, and expanded on the expansion memory 310 (S502).
- Next, the rotation angle of the display plane is input from the transition effect control unit 304 to the display plane generation unit 305 (S503), and the L image plane and the R image plane (geometric figure information) corresponding to the rotation angle are calculated by the display plane generation unit 305 (S504).
- Here, the rotation angle of the display plane is set equal to the unit rotation angle in the first processing cycle, and is then gradually increased as the processing cycles proceed. If the speed of the fade-in/fade-out can be set as appropriate, the unit rotation angle is set in accordance with this speed. Further, the rotation angle increment may be changed for each processing cycle. As a result, the display effect at the time of fade-in/fade-out can be further improved.
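One way to derive the per-cycle rotation angle supplied in S503 from a selectable fade speed might look like this; the total angle, duration, and cycle time are assumed values:

```python
def rotation_schedule(total_deg=90.0, duration_s=1.5, cycle_s=0.1):
    """Cumulative rotation angle fed to the display-plane generation unit
    in each processing cycle (S503). The unit rotation angle is derived
    from the requested fade duration."""
    cycles = round(duration_s / cycle_s)
    unit = total_deg / cycles           # unit rotation angle per cycle
    return [unit * (i + 1) for i in range(cycles)]

angles = rotation_schedule()  # 6°, 12°, ..., 90° for the assumed defaults
```

A non-uniform increment (e.g. easing in and out) could be substituted to vary the rotation angle per cycle, as the text suggests.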
- Then, the selected L image data and R image data are non-linearly compressed so as to fit in the L image plane and the R image plane, respectively (S508).
- the compressed L image data is mapped to the L image data area on the graphic memory 311 (S509), and the background image data (for example, a single color) is mapped to the L image data area remaining after the mapping. (S510).
- the compressed R image data is mapped to the R image data area on the graphic memory 311 (S511), and the background image data is mapped to the R image data area remaining after the mapping (S512).
- In this way, since the image on the display plane can be stereoscopically viewed while the display plane is rotated in a simulated manner, a realistic fade-in/fade-out operation can be realized.
- In the above embodiment, the display plane is pseudo-rotated in the horizontal direction, but the display plane may be rotated in various directions such as the vertical direction, the horizontal direction, or a combination thereof.
- In this case, the display plane creation unit 305 performs arithmetic processing on the display plane in each rotation state according to the arithmetic processing principle of FIG. 23 to calculate the L image plane and the R image plane in each rotation state.
- In the above embodiment, the L image plane and the R image plane are calculated by the display plane creation unit 305 during the fade-in/fade-out processing. Alternatively, the L image plane and the R image plane corresponding to each rotation angle may be calculated and stored in advance, and during the fade-in/fade-out processing, the L image plane and the R image plane corresponding to the rotation angle of the current processing cycle may be read out and used.
- FIG. 26 shows a configuration example of the image display device in this case.
- a geometric plane information storage unit 305a that stores an L image plane and an R image plane corresponding to the rotation angle is provided.
- In this case, the display plane creation unit 305 reads the L image plane and the R image plane corresponding to the rotation angle input from the transition effect control unit 304 out of the geometric plane information storage unit 305a, and sends them to the parallax image generation unit 306.
- FIG. 27 shows the fade-in/fade-out processing flow in this case. In this processing flow, S504 in the processing flow of FIG. 25 is changed to S520. The other processing is the same as the processing flow of FIG. 25.
- In the above embodiment, the still image data stored in the storage unit 309 is two-dimensional data. Alternatively, three-dimensional still image data (left-eye image data and right-eye image data) may be stored in the storage unit 309.
- L image data and R image data corresponding to a still image to be reproduced are read out from the storage unit 309 and expanded in the expansion memory 310.
- In this case, the function of the parallax image generation unit 306 during normal reproduction differs from that of the above-described embodiment. That is, in this configuration, at the time of normal reproduction, the L image data and the R image data expanded in the expansion memory 310 are directly mapped to the corresponding areas on the graphic memory 311. The process of generating the L image data and the R image data from the two-dimensional image data, which was performed during normal reproduction in the above embodiment, is omitted.
- Further, at the time of fade-in/fade-out, the L image data and the R image data expanded in the expansion memory 310 may be directly subjected to non-linear compression processing so as to fit in the L image plane and the R image plane, and mapped on the graphic memory 311.
- However, since the L image data and the R image data originally have a parallax corresponding to the display of the stereoscopic image, if they are directly applied to the L image plane and the R image plane, the influence of the original parallax appears in the reproduced image, causing distortion in stereoscopic vision.
- This distortion can be suppressed by providing the parallax image generation unit 306 with a function of eliminating the original parallax.
- That is, two-dimensional image data is generated from the L image data and the R image data developed in the development memory 310, and L image data and R image data are reconstructed from the two-dimensional image data in the same manner as in the above-described embodiment.
- FIG. 28 shows the fade-in/fade-out processing flow in this case.
- In this processing flow, S501 and S502 in the processing flow of FIG. 25 are changed to S530 and S531.
- That is, whereas in S501 and S502 the L image data and the R image data are generated from two-dimensional image data and developed on the development memory 310, in this processing flow, first, in S530, the L image data and R image data of the next stereoscopic still image are developed on the development memory 310 (the L image data and R image data of the still image currently being played back are already developed on the development memory 310 at the time of normal playback). Then, in S531, L image data and R image data are reconstructed from the L image data and R image data of the still image currently being played back and of the still image to be played back next.
- Other processing is the same as the processing flow of FIG. 25 described above.
- Also in this embodiment, the present invention is applied to a so-called binocular image display device, but the present invention is also applicable to an image display device having a larger number of imaging viewpoints.
- FIG. 29 shows an example of generating a geometric figure when the present invention is applied to a four-eye image display device.
- FIG. 29(a) schematically shows the geometric figures obtained when the display plane is viewed from each viewpoint with the display plane not rotated, and FIG. 29(b) schematically shows the geometric figures obtained when the display plane is viewed from each viewpoint with the display plane rotated by a predetermined amount.
- In the above embodiment, a single color is used as the background image, but other background images can of course be used.
- Further, in the above embodiment, the L image data area and the R image data area on the graphic memory 311 are allocated so that the L image plane and the R image plane are maximized on the display plane. However, the method of allocating the L image data area and the R image data area on the graphic memory 311 is not limited to this. For example, the L image data area and the R image data area may be allocated on the graphic memory 311 so that the L image plane and the R image plane gradually become smaller on the display screen until the rotation angle reaches 90° (until the plane switches from the front to the back), and gradually become larger on the display screen as the rotation angle increases from 90° to 180°.
- Further, in the above embodiment, the present invention is applied to the display technique at the time of fade-in/fade-out, but the present invention can also be applied to display techniques other than fade-in/fade-out. For example, the present invention can be applied to a case where a display plane is pseudo-rotated in a three-dimensional space, or where a display plane is pseudo-fixed diagonally in a three-dimensional space to give a special effect to the image display.
- the embodiment of the present invention can be variously modified as appropriate within the scope of the technical idea of the present invention.
- the three-dimensional stereoscopic image display device according to the present embodiment can also be realized by adding the functions of the configuration examples in each embodiment to a personal computer or the like.
- In this case, a program for executing the function of each configuration example is installed on the personal computer from a disk or downloaded via the Internet.
- the present invention can be abstracted as a program for providing such a function.
- FIG. 30 shows an example of the architecture of a personal computer (image display device).
- The CPU 1 is connected to a north bridge 2 having a system control function and a south bridge 3 having interface functions such as a PCI bus and an ISA bus.
- A memory 4 is connected to the north bridge 2, and a video card 5 is connected to the north bridge 2 via an AGP (Accelerated Graphics Port).
- The south bridge 3 has a USB (Universal Serial Bus) interface 6, to which a hard disk drive (HDD) 7 and a CD-ROM device 8 are connected.
- FIG. 31 shows the configuration of a general video card 5.
- the VRAM (video memory) controller 5b controls writing / reading of drawing data to / from the VRAM 5a according to an instruction from the CPU 1 via the AGP.
- the DAC (D / A converter) 5c converts the digital video data from the VRAM controller 5b into an analog video signal, and supplies this video signal to the personal computer monitor 12 via the video buffer 5d.
- With this video card 5, in addition to normal video display processing (rendering processing), stereoscopic video display processing in which a right-eye video and a left-eye video are generated and alternately drawn in a vertical stripe shape can be performed.
- The personal computer has a network connection environment and can receive a file (for example, a document file, mail, an HTML file, an XML file, and the like) from a transmission device configured as a server or the like on the Internet. Further, in the personal computer, for example, by providing the monitor 12 with a liquid crystal barrier, it is possible to display both a two-dimensional image and a three-dimensional image. If the stereoscopic video consists of, for example, a right-eye video and a left-eye video alternately arranged in vertical stripes, a vertical stripe-shaped light-shielding region is formed in the liquid crystal barrier under the control of the CPU 1.
- When a stereoscopic video is to be displayed in a partial area on the screen (a window for file playback or a partial video portion in an HTML file), the CPU 1 can control the size and the formation position of the vertical stripe-shaped light-shielding area based on the display coordinates and the size of the window or the video portion.
- In a normal barrier, by contrast, barrier stripes are fixedly formed at a predetermined pitch.
- the personal computer is equipped with word processing software and browser software (viewer), and can open a file and display an image on the monitor 12.
- The personal computer generates synthesized video data by synthesizing the pixel value of the currently displayed video data and the pixel value of the next video data to be displayed at a specified ratio, and displays a three-dimensional image.
- A program is installed that specifies the ratio so that the pixel value ratio of the currently displayed video data gradually decreases over a predetermined time and finally reaches 0%, and the CPU 1 executes processing based on this program.
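A minimal sketch of this ratio-controlled synthesis, assuming a linear decay of the current image's weight (the text specifies only that the ratio decreases over a predetermined time until it reaches 0%):

```python
def crossfade(cur, nxt, steps=30):
    """Blend the current and next display pixel values at a ratio that
    decays linearly to 0% over `steps` frames (the frame count is an
    assumed value derived from the fade duration and refresh rate)."""
    frames = []
    for i in range(steps + 1):
        w = 1.0 - i / steps  # weight of the current image: 100% -> 0%
        frames.append(tuple(round(w * c + (1 - w) * n)
                            for c, n in zip(cur, nxt)))
    return frames

seq = crossfade((255, 0, 0), (0, 0, 255), steps=4)
```

In the device described here the blended values would be written to the VRAM 5a as the combined R (and L) pixel values for each frame.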
- the VRAM controller 5b controls the writing of the combined R pixel value (drawing data) to the VRAM 5a and the reading for display in response to instructions from the CPU 1.
- The writing of the combined R pixel values and the like into the VRAM 5a may be performed, for example, in order from the area corresponding to the upper horizontal line on the screen toward the areas corresponding to the lower horizontal lines, but is not limited to this.
- FIG. 32 is a schematic diagram showing an example of switching images from a two-dimensional image (2D) to a three-dimensional image (3D). In the above process, the current display image gradually becomes transparent and the next display image appears through it.
- Alternatively, the pixel value of the currently displayed video data may be changed to the pixel value of the next video data pixel by pixel, with the changed pixels selected at random or every predetermined number of pixels.
- In this case as well, the current display image gradually becomes transparent, and the next display image is displayed.
- If the changed pixels are selected in order from the area corresponding to the upper horizontal line on the screen toward the lower horizontal lines, the next display image appears to be displayed in a so-called wipe.
- In this case, the CPU 1 may perform a process of designating the switching pixels and drawing so that the pixel-count ratio of the currently displayed video data gradually decreases over a predetermined time (for example, 3 seconds) and finally reaches 0%.
- Alternatively, the CPU 1 may execute a process of designating the switching pixels so as to increase the width or the number of line-shaped or block-shaped areas on the screen and drawing the image.
- For example, for a plurality of vertical lines arranged at predetermined intervals on the screen, the pixel value of the corresponding address on the VRAM 5a is rewritten to the pixel value of the next display image. Then, the pixel value of the corresponding address on the VRAM 5a is rewritten to the pixel value of the next display image so that the vertical line width expands horizontally, and the area of the current display image gradually decreases over a predetermined time to finally reach 0%.
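The expanding-vertical-lines pattern just described can be sketched as a mask marking which pixels already show the next image; the line spacing and the growth rule are illustrative assumptions:

```python
def wipe_mask(width, height, step, spacing=8):
    """Mask for the expanding-vertical-lines wipe: True where the next
    display image has replaced the current one. Lines spaced `spacing`
    pixels apart widen by `step` pixels on each side per frame."""
    rows = []
    for _y in range(height):
        row = [(x % spacing) <= step or (x % spacing) >= spacing - step
               for x in range(width)]
        rows.append(row)
    return rows

def coverage(mask):
    """Fraction of the screen already showing the next image."""
    cells = [c for row in mask for c in row]
    return sum(cells) / len(cells)
```

Increasing `step` frame by frame makes the coverage grow from a sparse set of lines toward 100%, matching the requirement that the current image's area reaches 0% after a fixed time; the centered-block and staggered-region variants below differ only in the mask's shape.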
- Alternatively, in an area at the center of the screen, the pixel value of the corresponding address on the VRAM 5a is rewritten to the pixel value of the next display video. Then, the pixel value of the corresponding address on the VRAM 5a is rewritten to the pixel value of the next display image so that the area expands vertically and horizontally, and the area of the current display image gradually decreases over a predetermined time to finally reach 0%.
- Alternatively, for a plurality of regions of a predetermined vertical length arranged in a staggered manner on the screen, the pixel value of the corresponding address on the VRAM 5a is rewritten to the pixel value of the next display image. Then, the pixel value of the corresponding address on the VRAM 5a is rewritten to the pixel value of the next display image so that each region expands horizontally, and the region of the current display image gradually decreases over a predetermined time to finally reach 0%.
- Alternatively, for the vertical line area at the left end of the screen, the pixel value of the corresponding address on the VRAM 5a is rewritten to the pixel value of the next display image. Then, the pixel value of the corresponding address on the VRAM 5a is rewritten to the pixel value of the next display image so that the vertical line width expands to the right, and the area of the current display image gradually decreases over a predetermined time to finally reach 0%.
- In the above description, a personal computer has been taken as an example, but the image display device is not limited thereto; it may be configured as a digital broadcast receiving device capable of receiving a data broadcast (BML file) and displaying an image, or as a mobile phone having a network connection environment and an image display function. Further, in the above example, glasses-free stereoscopic viewing has been exemplified, but the present invention is not limited to this; for example, the left-eye and right-eye images alternately displayed by the liquid crystal shutter method may be gradually combined with the next image as described above.
- FIG. 1 is a diagram showing a configuration of a three-dimensional stereoscopic image display device according to an embodiment of the present invention.
- FIG. 2 is a diagram showing a composite state of images according to the embodiment of the present invention.
- FIG. 3 is a flowchart at the time of a fade-out operation according to the embodiment of the present invention.
- FIG. 4 is a diagram showing a display screen at the time of fade-out processing according to the embodiment of the present invention.
- FIG. 5 is a flowchart at the time of a fade-in operation according to the embodiment of the present invention.
- FIG. 6 is a diagram showing a display screen at the time of fade-in processing according to the embodiment of the present invention.
- FIG. 7 is a flowchart at the time of a fade-out operation according to the embodiment of the present invention.
- FIG. 8 is a flowchart at the time of a fade-in operation according to the embodiment of the present invention.
- FIG. 9 is a diagram showing a display screen at the time of fade-out processing according to the embodiment of the present invention.
- FIG. 10 is a diagram showing a configuration of a three-dimensional stereoscopic image display device according to an embodiment of the present invention.
- FIG. 11 is a diagram illustrating a method for synthesizing a CG image according to the embodiment of the present invention.
- FIG. 12 is a diagram showing a method for generating image data of each viewpoint according to the embodiment of the present invention.
- FIG. 13 is a diagram showing a method of generating image data of each viewpoint according to the embodiment of the present invention.
- FIG. 14 is a flowchart showing a process during a fade-out operation according to the embodiment of the present invention.
- FIG. 16 is a diagram showing a configuration of a three-dimensional stereoscopic image display device according to an embodiment of the present invention.
- FIG. 17 is a diagram illustrating a method for synthesizing an image according to the embodiment of the present invention.
- FIG. 18 is a diagram showing a method of generating image data of each viewpoint according to the embodiment of the present invention.
- FIG. 19 is a flowchart showing a process during a fade-out operation according to the embodiment of the present invention.
- FIG. 20 is a flowchart showing a process at the time of a fade-in operation according to the embodiment of the present invention.
- FIG. 21 is a diagram showing a configuration of a three-dimensional stereoscopic image display device according to an embodiment of the present invention.
- FIG. 22 is a diagram showing a composite state of images according to the embodiment of the present invention.
- FIG. 23 is a diagram for explaining a generation process of a geometric figure according to the embodiment of the present invention.
- FIG. 24 is a diagram for explaining image data compression processing according to the embodiment of the present invention.
- FIG. 25 is a flowchart showing processing during a fade-in/fade-out operation according to the embodiment of the present invention.
- FIG. 26 is a diagram showing a configuration of a three-dimensional stereoscopic image display device according to an embodiment of the present invention.
- FIG. 29 is a diagram for describing a generation process of a geometric figure according to the embodiment of the present invention.
- FIG. 30 is a block diagram showing an example of the architecture of a personal computer according to an embodiment of the present invention.
- FIG. 31 is a block diagram showing a configuration example of a video card according to an embodiment of the present invention.
- FIG. 32 is an explanatory diagram of image switching according to the embodiment of the present invention.
- FIG. 33 is a diagram related to the embodiment of the present invention, and FIG. 33 (a) and FIG. 33 (b) are explanatory diagrams of image switching.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04745337A EP1628490A1 (en) | 2003-05-27 | 2004-05-26 | Image display device and program |
US10/557,804 US20070236493A1 (en) | 2003-05-27 | 2004-05-26 | Image Display Apparatus and Program |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-149881 | 2003-05-27 | ||
JP2003149881A JP2004356772A (ja) | 2003-05-27 | 2003-05-27 | 三次元立体画像表示装置およびコンピュータに三次元立体画像表示機能を付与するプログラム |
JP2003-164599 | 2003-06-10 | ||
JP2003165043A JP2005004341A (ja) | 2003-06-10 | 2003-06-10 | 画像表示装置およびコンピュータに画像表示機能を付与するプログラム |
JP2003-165043 | 2003-06-10 | ||
JP2003164599A JP2005005828A (ja) | 2003-06-10 | 2003-06-10 | 画像表示装置およびコンピュータに画像表示機能を付与するプログラム |
JP2003336222A JP2005109568A (ja) | 2003-09-26 | 2003-09-26 | 映像表示装置及びプログラム |
JP2003-336222 | 2003-09-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004107764A1 true WO2004107764A1 (ja) | 2004-12-09 |
Family
ID=33494212
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/007185 WO2004107764A1 (ja) | Image display device and program |
Country Status (3)
Country | Link |
---|---|
US (1) | US20070236493A1 (ja) |
EP (1) | EP1628490A1 (ja) |
WO (1) | WO2004107764A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011259452A (ja) * | 2011-07-06 | 2011-12-22 | Fujifilm Corp | Stereoscopic image display device and stereoscopic image display method |
JP2011259453A (ja) * | 2011-07-06 | 2011-12-22 | Fujifilm Corp | Stereoscopic image display device and stereoscopic image display method |
WO2012056722A1 (ja) * | 2010-10-29 | 2012-05-03 | Fujifilm Corporation | Stereoscopic image display device, method, and program |
JP2012199927A (ja) * | 2012-04-16 | 2012-10-18 | Nikon Corp | Image playback device |
Families Citing this family (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060241864A1 (en) * | 2005-04-22 | 2006-10-26 | Outland Research, Llc | Method and apparatus for point-and-send data transfer within an ubiquitous computing environment |
AU2005203074A1 (en) * | 2005-07-14 | 2007-02-01 | Canon Information Systems Research Australia Pty Ltd | Image browser |
US20070145680A1 (en) * | 2005-12-15 | 2007-06-28 | Outland Research, Llc | Shake Responsive Portable Computing Device for Simulating a Randomization Object Used In a Game Of Chance |
KR101100212B1 (ko) | 2006-04-21 | 2011-12-28 | LG Electronics Inc. | Broadcast signal transmission method, broadcast signal reproduction method, broadcast signal transmission device, and broadcast signal reception device |
KR101319535B1 (ko) * | 2006-12-26 | 2013-10-21 | Samsung Electronics Co., Ltd. | Video signal processing device and control method thereof |
TWI331872B (en) * | 2006-12-29 | 2010-10-11 | Quanta Comp Inc | Method for displaying stereoscopic image |
KR101546828B1 (ko) * | 2008-06-10 | 2015-08-24 | LG Electronics Inc. | Display device |
KR100991804B1 (ko) * | 2008-06-10 | 2010-11-04 | MasterImage 3D Asia, LLC | Stereoscopic image generation chip for mobile devices, and stereoscopic image display method using the same |
KR101563626B1 (ko) * | 2008-11-18 | 2015-10-27 | Samsung Electronics Co., Ltd. | Display device and control method thereof |
RU2512135C2 (ru) * | 2008-11-18 | 2014-04-10 | Panasonic Corporation | Playback device, playback method, and program for stereoscopic playback |
US8335425B2 (en) * | 2008-11-18 | 2012-12-18 | Panasonic Corporation | Playback apparatus, playback method, and program for performing stereoscopic playback |
JP5515301B2 (ja) * | 2009-01-21 | 2014-06-11 | Nikon Corporation | Image processing device, program, image processing method, recording method, and recording medium |
CA2750211C (en) * | 2009-03-19 | 2015-06-16 | Lg Electronics Inc. | Method for processing three dimensional (3d) video signal and digital broadcast receiver for performing the processing method |
US20110063298A1 (en) * | 2009-09-15 | 2011-03-17 | Samir Hulyalkar | Method and system for rendering 3d graphics based on 3d display capabilities |
DE102009048834A1 (de) * | 2009-10-09 | 2011-04-14 | Volkswagen Ag | Method and display device for displaying information |
US8736673B2 (en) * | 2009-12-31 | 2014-05-27 | Stmicroelectronics, Inc. | Method and apparatus for viewing 3D video using a stereoscopic viewing device |
EP2355526A3 (en) | 2010-01-14 | 2012-10-31 | Nintendo Co., Ltd. | Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method |
WO2011105048A1 (ja) * | 2010-02-23 | 2011-09-01 | Panasonic Corporation | Computer graphics video synthesizing device and method, and display device |
US9693039B2 (en) | 2010-05-27 | 2017-06-27 | Nintendo Co., Ltd. | Hand-held electronic device |
JP5872185B2 (ja) * | 2010-05-27 | 2016-03-01 | Nintendo Co., Ltd. | Portable electronic device |
CN102474650B (zh) | 2010-05-28 | 2014-12-17 | Panasonic Corporation | Playback device, integrated circuit, and playback method for stereoscopic viewing video |
US8982151B2 (en) * | 2010-06-14 | 2015-03-17 | Microsoft Technology Licensing, Llc | Independently processing planes of display data |
EP2395765B1 (en) | 2010-06-14 | 2016-08-24 | Nintendo Co., Ltd. | Storage medium having stored therein stereoscopic image display program, stereoscopic image display device, stereoscopic image display system, and stereoscopic image display method |
KR101731343B1 (ko) * | 2010-07-14 | 2017-04-28 | LG Electronics Inc. | Mobile terminal and control method thereof |
JP5577931B2 (ja) * | 2010-08-06 | 2014-08-27 | Sony Corporation | Image processing device, image processing method, and program |
KR20120015165A (ko) * | 2010-08-11 | 2012-02-21 | LG Electronics Inc. | Method for adjusting the depth of an image, and mobile terminal using the same |
WO2012048252A1 (en) | 2010-10-07 | 2012-04-12 | Aria Glassworks, Inc. | System and method for transitioning between interface modes in virtual and augmented reality applications |
US20120120051A1 (en) * | 2010-11-16 | 2012-05-17 | Shu-Ming Liu | Method and system for displaying stereoscopic images |
US9017163B2 (en) | 2010-11-24 | 2015-04-28 | Aria Glassworks, Inc. | System and method for acquiring virtual and augmented reality scenes by a user |
US9070219B2 (en) | 2010-11-24 | 2015-06-30 | Aria Glassworks, Inc. | System and method for presenting virtual and augmented reality scenes to a user |
US9041743B2 (en) * | 2010-11-24 | 2015-05-26 | Aria Glassworks, Inc. | System and method for presenting virtual and augmented reality scenes to a user |
EP2464126A1 (en) * | 2010-12-10 | 2012-06-13 | Advanced Digital Broadcast S.A. | A system and a method for transforming a 3D video sequence to a 2D video sequence |
US8953022B2 (en) | 2011-01-10 | 2015-02-10 | Aria Glassworks, Inc. | System and method for sharing virtual and augmented reality scenes between users and viewers |
JP2012174237A (ja) * | 2011-02-24 | 2012-09-10 | Nintendo Co Ltd | Display control program, display control device, display control system, and display control method |
US9118970B2 (en) | 2011-03-02 | 2015-08-25 | Aria Glassworks, Inc. | System and method for embedding and viewing media files within a virtual and augmented reality scene |
JP2012257105A (ja) * | 2011-06-09 | 2012-12-27 | Olympus Corp | Stereoscopic image acquisition device |
JP2013005238A (ja) * | 2011-06-16 | 2013-01-07 | Sony Corp | Three-dimensional image processing device, three-dimensional image processing method, display device, and computer program |
US8994796B2 (en) | 2011-07-06 | 2015-03-31 | Fujifilm Corporation | Stereo image display apparatus and stereo image display method |
JP5577305B2 (ja) * | 2011-08-05 | 2014-08-20 | Sony Computer Entertainment Inc. | Image processing device |
US9465226B2 (en) | 2011-08-09 | 2016-10-11 | Sony Computer Entertainment Inc. | Automatic shutdown of 3D based on glasses orientation |
EP2590417A1 (en) * | 2011-11-01 | 2013-05-08 | Acer Incorporated | Stereoscopic image display apparatus |
KR101872864B1 (ko) | 2011-12-19 | 2018-06-29 | LG Electronics Inc. | Electronic device and control method of the electronic device |
WO2013102790A2 (en) * | 2012-01-04 | 2013-07-11 | Thomson Licensing | Processing 3d image sequences cross reference to related applications |
US9626799B2 (en) | 2012-10-02 | 2017-04-18 | Aria Glassworks, Inc. | System and method for dynamically displaying multiple virtual and augmented reality scenes on a single display |
US10769852B2 (en) | 2013-03-14 | 2020-09-08 | Aria Glassworks, Inc. | Method for simulating natural perception in virtual and augmented reality scenes |
JP2014203017A (ja) * | 2013-04-09 | 2014-10-27 | Sony Corporation | Image processing device, image processing method, display device, and electronic apparatus |
KR102176474B1 (ko) | 2014-01-06 | 2020-11-09 | Samsung Electronics Co., Ltd. | Image display device, driving method of the image display device, and image display method |
US10977864B2 (en) | 2014-02-21 | 2021-04-13 | Dropbox, Inc. | Techniques for capturing and displaying partial motion in virtual or augmented reality scenes |
US10867445B1 (en) * | 2016-11-16 | 2020-12-15 | Amazon Technologies, Inc. | Content segmentation and navigation |
US11175802B2 (en) * | 2018-09-21 | 2021-11-16 | Sap Se | Configuration object deletion manager |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07212798A (ja) * | 1994-01-18 | 1995-08-11 | Sharp Corp | Binocular television device |
JPH09289638A (ja) * | 1996-04-23 | 1997-11-04 | Nec Corp | Three-dimensional image encoding/decoding system |
JPH11164328A (ja) * | 1997-11-27 | 1999-06-18 | Toshiba Corp | Stereoscopic video display device |
JPH11331700A (ja) * | 1998-05-15 | 1999-11-30 | Sony Corp | Image processing device and image processing method |
JP2002073003A (ja) * | 2000-08-28 | 2002-03-12 | Namco Ltd | Stereoscopic image generation device and information storage medium |
JP2002152590A (ja) * | 2000-11-10 | 2002-05-24 | Canon Inc | Image processing device, image processing system, image display method, and storage medium |
JP2003158750A (ja) * | 2001-11-21 | 2003-05-30 | Mitsubishi Electric Corp | Fader device |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4694404A (en) * | 1984-01-12 | 1987-09-15 | Key Bank N.A. | High-speed image generation of complex solid objects using octree encoding |
US5353391A (en) * | 1991-05-06 | 1994-10-04 | Apple Computer, Inc. | Method apparatus for transitioning between sequences of images |
JPH08322004A (ja) * | 1995-05-24 | 1996-12-03 | Olympus Optical Co Ltd | 立体視ディスプレイ装置 |
US5781229A (en) * | 1997-02-18 | 1998-07-14 | Mcdonnell Douglas Corporation | Multi-viewer three dimensional (3-D) virtual display system and operating method therefor |
US6177953B1 (en) * | 1997-06-26 | 2001-01-23 | Eastman Kodak Company | Integral images with a transition set of images |
US6300956B1 (en) * | 1998-03-17 | 2001-10-09 | Pixar Animation | Stochastic level of detail in computer animation |
US7102643B2 (en) * | 2001-11-09 | 2006-09-05 | Vibe Solutions Group, Inc. | Method and apparatus for controlling the visual presentation of data |
2004
- 2004-05-26 EP EP04745337A patent/EP1628490A1/en not_active Withdrawn
- 2004-05-26 WO PCT/JP2004/007185 patent/WO2004107764A1/ja active Application Filing
- 2004-05-26 US US10/557,804 patent/US20070236493A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
EP1628490A1 (en) | 2006-02-22 |
US20070236493A1 (en) | 2007-10-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004107764A1 (ja) | Image display device and program | |
JP4214976B2 (ja) | Pseudo-stereoscopic image creation device, pseudo-stereoscopic image creation method, and pseudo-stereoscopic image display system | |
JP5160741B2 (ja) | 3D graphics processing device and stereoscopic video display device using the same | |
JP3420504B2 (ja) | Information processing method | |
JP4982862B2 (ja) | Program, information storage medium, and image generation system | |
JP2006165795A (ja) | Image forming device and method | |
KR100967296B1 (ko) | Graphics interface and method for rasterizing graphics data for a stereoscopic display | |
US8866887B2 (en) | Computer graphics video synthesizing device and method, and display device | |
WO2009155688A1 (en) | Method for seeing ordinary video in 3d on handheld media players without 3d glasses or lenticular optics | |
KR101713875B1 (ko) | Method and system for implementing a virtual space that takes the user's viewpoint into account in a projector projection environment | |
US11589027B2 (en) | Methods, systems, and media for generating and rendering immersive video content | |
JP2005109568A (ja) | Video display device and program | |
KR102059732B1 (ko) | Digital video rendering | |
JP2006178900A (ja) | Stereoscopic image generation device | |
US20180249145A1 (en) | Reducing View Transitions Artifacts In Automultiscopic Displays | |
KR100436904B1 (ko) | Method for generating stereoscopic images from a two-dimensional image | |
KR100381817B1 (ko) | Method for generating stereoscopic images using a Z-buffer, and recording medium | |
US20120280985A1 (en) | Image producing apparatus, image producing system, storage medium having stored thereon image producing program and image producing method | |
JP2004356789A (ja) | Stereoscopic video display device and program | |
JP5328852B2 (ja) | Image processing device, image processing method, program, and information storage medium | |
WO2015120032A1 (en) | Reducing view transition artifacts in automultiscopic displays | |
TW202240547A (zh) | Multiview display system and method employing a tilted multiview image convergence plane | |
JP2009064355A (ja) | Program, information storage medium, and image generation system | |
JPH11184453A (ja) | Display device, control method therefor, and computer-readable memory | |
JP2005004341A (ja) | Image display device and program for providing a computer with an image display function | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004745337 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20048143387 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2004745337 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10557804 Country of ref document: US Ref document number: 2007236493 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 10557804 Country of ref document: US |