EP0969418A2 - Appareil de traitement d'images pour afficher des images tridimensionnelles - Google Patents

Appareil de traitement d'images pour afficher des images tridimensionnelles

Info

Publication number
EP0969418A2
EP0969418A2 (application EP99301826A)
Authority
EP
European Patent Office
Prior art keywords
stereoscopic viewing
parameter
dimensional space
frame buffer
space data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP99301826A
Other languages
German (de)
English (en)
Other versions
EP0969418A3 (fr)
Inventor
Shinji Uchiyama (Mixed Reality Systems Laboratory Inc.)
Hiroyuki Yamamoto (Mixed Reality Systems Laboratory Inc.)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Mixed Reality Systems Laboratory Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mixed Reality Systems Laboratory Inc filed Critical Mixed Reality Systems Laboratory Inc
Publication of EP0969418A2 publication Critical patent/EP0969418A2/fr
Publication of EP0969418A3 publication Critical patent/EP0969418A3/fr
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194 Transmission of image signals
    • H04N13/20 Image signal generators
    • H04N13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279 Image signal generators from 3D object models, the virtual viewpoint locations being selected by the viewers or determined by tracking
    • H04N13/286 Image signal generators having separate monoscopic and stereoscopic modes
    • H04N13/289 Switching between monoscopic and stereoscopic modes
    • H04N13/296 Synchronisation thereof; Control thereof
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305 Image reproducers using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H04N13/307 Image reproducers using fly-eye lenses, e.g. arrangements of circular lenses
    • H04N13/31 Image reproducers using parallax barriers

Definitions

  • The present invention relates to an image processing apparatus for converting space data, described in a language or format for handling a three-dimensional virtual space such as VRML (Virtual Reality Modeling Language), into image data that can be stereoscopically observed, and to a viewer apparatus for displaying such image data.
  • the present invention also relates to a user interface apparatus for changing stereoscopic viewing parameters, a stereoscopic display buffer control method and apparatus for obtaining stereoscopic viewing, and a program storage medium for storing their programs.
  • VRML 2.0, the latest currently available version, is used for building virtual malls for electronic commerce, managing three-dimensional data of, e.g., CAD in an intranet, and so forth.
  • The assignee of the present invention has developed a rear-cross lenticular 3D display (to be referred to as a "stereoscopic display apparatus" hereinafter, to distinguish it from a two-dimensional display apparatus) as a stereoscopic display apparatus that allows the observer to directly perceive depth.
  • many stereoscopic display apparatuses have been proposed.
  • In order to display a three-dimensional space described in VRML 2.0, a VRML viewer apparatus is required. However, a normally prevalent VRML viewer apparatus can only produce two-dimensional display (such an apparatus will be referred to as a "two-dimensional display apparatus" hereinafter, in contrast to the "stereoscopic display apparatus") as final display using a normal monitor, even though a three-dimensional space is described.
  • In step S52, an image to be seen when the three-dimensional space is viewed by the left eye is generated; three-dimensional space data is rendered on the basis of the viewpoint position and gaze direction of the left eye.
  • A left eye image as a rendering result is stored in frame buffer B.
  • In step S54, frame buffer C is prepared, and the data required for right eye display is read out from the contents of frame buffer A and written in frame buffer C.
  • In step S56, the data required for left eye display is read out from the contents of frame buffer B and written in frame buffer C.
  • the buffer control method shown in Fig. 18 requires three buffers.
  • Fig. 19 shows a buffer write control sequence according to another prior art.
  • the buffer control method shown in Fig. 19 requires two buffers.
  • Inter-buffer data transfer, which requires reading data out of a frame buffer in which it has been written and writing it into another frame buffer, is time-consuming and lowers the processing speed.
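The prior-art three-buffer sequence (Fig. 18) can be sketched with plain Python lists of scanlines; function and variable names here are illustrative, not from the patent. The point is that every displayed line must be read back out of buffer A or B and copied into C — the costly inter-buffer transfer described above:

```python
def interleave_three_buffer(buf_a, buf_b):
    """Prior-art interleave: buf_a holds the right eye image, buf_b the
    left eye image, each as a list of scanlines.  A third buffer C is
    filled with alternating stripes, which forces a read-back of every
    line from A or B."""
    buf_c = []
    for y in range(len(buf_a)):
        # even scanlines take the right eye stripe, odd ones the left
        buf_c.append(buf_a[y] if y % 2 == 0 else buf_b[y])
    return buf_c
```

The stencil-buffer method described later avoids exactly this read-back step.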
  • the present invention has been made in consideration of the conventional problems, and has as its object to provide a stereoscopic image processing apparatus, user interface apparatus, and stereoscopic image display apparatus, which can assign parameters to three-dimensional space data in which stereoscopic viewing parameters are not defined.
  • a stereoscopic image processing apparatus comprises:
  • the parameter defines a base line length of a user. Also, according to a preferred aspect of the present invention, the parameter defines a convergence angle of a user.
  • the base line length and convergence angle are mandatory parameters required to realize stereoscopic viewing.
  • The apparatus further comprises means for determining the necessity of generation of the parameter on the basis of a file extension of the received three-dimensional space data or attribute information appended to the data file.
  • a user interface apparatus used upon displaying a stereoscopic image comprises:
  • the parameters can be freely changed to match image data with the purpose of the user's application.
  • the user interface is preferably displayed on the display screen.
  • the apparatus further comprises display means, and the user setting means displays a GUI on the display means.
  • the GUI is implemented by a slide bar, jog dial or wheel.
  • a buffer control method for effectively using a buffer according to the present invention is directed to a buffer control method upon obtaining stereoscopic viewing by displaying right and left eye images written in one frame buffer, comprising the steps of:
  • a buffer control apparatus which has one frame buffer and obtains stereoscopic viewing by displaying right and left eye images written in the frame buffer, comprising:
  • the mask memory comprises a stencil buffer.
  • the mask function is determined based on an order of stripes on a display apparatus, and a write order of images in the frame buffer.
  • the mask information is written in a stencil buffer at least before the other image is written in the frame memory.
  • An embodiment is described below of a stereoscopic display apparatus for receiving, using a WWW browser, and stereoscopically displaying three-dimensional space data described in VRML 2.0.
  • This stereoscopic display apparatus uses as a display a rear-cross lenticular 3D display developed by the present applicant so as to give a stereoscopic sense to the user.
  • The present invention is not limited to virtual space data in VRML 2.0, to virtual space data received by a WWW browser, or to the rear-cross lenticular 3D display.
  • Fig. 1 shows a WWW network to which a plurality of VRML viewer apparatuses that adopt the method of the present invention are connected.
  • the features of the VRML viewer apparatus of this embodiment are:
  • Fig. 2 shows the hardware arrangement of a viewer apparatus 1000 of this embodiment.
  • The viewer apparatus 1000 has a display 10, a frame buffer 11 for temporarily storing data to be displayed on the display 10, a stencil buffer 12 for storing mask information, a CPU 13, a keyboard 14, a mouse 15, and a storage device 16, as in a normal workstation or personal computer.
  • The display 10 is a rear-cross lenticular eye-glass-free (i.e., naked-eye) 3D display proposed by the present applicant to attain stereoscopic viewing.
  • the rear-cross lenticular eye-glass-free 3D display 10 is characterized by inserting two lenticular lenses (an H lenticular lens 21 and V lenticular lens 22) and a checkerboard-like mask panel 23 between a TFT liquid crystal (LC) panel 20 and backlight panel 24.
  • the V lenticular lens 22 has a role of splitting illumination light coming from the backlight panel 24 in the right and left eye directions by changing its directionality in units of scanning lines.
  • the H lenticular lens 21 prevents crosstalk by focusing light coming from an aperture of one horizontal line of the checkerboard-like mask panel 23 onto one scanning line of the LC panel 20, and broadens the stereoscopic viewing range in the vertical direction by converting light coming from the aperture portion into divergent light toward the observer.
  • Fig. 4 is an enlarged view of a partial region A of the LC panel 20 shown in Fig. 3.
  • an image obtained by synthesizing pairs of right and left stereo images for stereoscopic viewing, which are alternately arranged in horizontal stripe patterns, is displayed on the LC panel 20.
  • The display 10 is also compatible with a horizontal stripe synthesized image of the time-division field sequential scheme normally used with liquid crystal shutter spectacles and the like.
  • Images of a virtual space constructed using three-dimensional geometric data such as VRML data or the like viewed from two viewpoints corresponding to the right and left eyes can be rendered using a conventional CG technique while considering these two viewpoints as virtual cameras, as shown in Fig. 5.
  • Data that pertain to the viewpoint positions (R, L), gaze direction (vector e), base line length b, and convergence angle θ of the right and left eyes of the user are required.
  • Fig. 6 shows the relationship among the viewpoint positions (R, L), gaze direction (line of sight vector e), base line length b, and convergence angle θ when a description method that defines the right eye of a dextrocular user as the base line position is used.
  • Fig. 7 explains the description method of the base line length b and convergence angle θ when the center of the base line is set at the center between the right and left eyes.
  • the coordinate origin is set at the center between the right and left eyes
  • the x-axis agrees with the direction from the left eye to the right eye
  • the y-axis agrees with the vertical direction
  • the z-axis agrees with the depth direction.
  • the right eye is located at a position offset by b/2 to the right
  • the left eye is located at a position offset by b/2 to the left.
  • θ represents the convergence angle.
  • The right line of sight agrees with the direction rotated θ/2 counterclockwise from the line of sight vector e.
  • The left line of sight agrees with the direction rotated θ/2 clockwise from the line of sight vector e.
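The Fig. 7 geometry above can be sketched as follows, assuming a 2-D top view with the x-axis pointing from the left eye to the right eye and z the depth direction; the function and the rotation helper are illustrative names, not from the patent:

```python
import math

def stereo_eyes(b, theta):
    """Return ((position, gaze) for right eye, (position, gaze) for left
    eye) given base line length b and convergence angle theta, with the
    base line center at the origin and gaze vector e = (0, 1) along +z.
    The eyes sit b/2 to either side; the lines of sight rotate inward by
    theta/2 each."""
    def rotate(v, a):
        # counterclockwise rotation by angle a in the x-z plane
        x, z = v
        return (x * math.cos(a) - z * math.sin(a),
                x * math.sin(a) + z * math.cos(a))

    e = (0.0, 1.0)
    right = ((b / 2, 0.0), rotate(e, theta / 2))    # rotated counterclockwise
    left = ((-b / 2, 0.0), rotate(e, -theta / 2))   # rotated clockwise
    return right, left
```

With theta = 0 both gazes stay parallel to e; a positive theta turns each eye inward, toward the other.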
  • A projection M of a virtual camera placed at the position of an eye (right or left) onto a two-dimensional imaging plane (x, y) is obtained. If (X_C, Y_C, Z_C) represents the position of the virtual camera, the projection M is expressed by a matrix determined by the viewpoint position (X_C, Y_C, Z_C), the line of sight direction e, and the field angle; perspective transformation can normally be used.
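A minimal sketch of the perspective transformation mentioned here, under the simplifying assumption that the camera looks straight down the +z axis with y up (the general case first rotates world coordinates by a matrix built from the line of sight vector e):

```python
import math

def project(point, cam_pos, half_fov):
    """Perspective-project a 3-D point for a camera at cam_pos looking
    along +z.  half_fov is half the field angle; the focal length is
    derived from it.  Returns normalized image-plane coordinates."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z <= 0:
        raise ValueError("point is behind the camera")
    f = 1.0 / math.tan(half_fov)   # focal length from the field angle
    return (f * x / z, f * y / z)
```

Rendering the same scene once per eye position obtained from the Fig. 7 geometry yields the stereo pair.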
  • the "base line length" and "convergence angle" of the user can be arbitrary to a certain extent.
  • the size of a space expressed by VRML data need not be expressed with reference to the size of a human, but can be freely set according to the will of the data creator.
  • the size of a molecule is very different from the base line length, and it is essentially nonsense to define the "base line length" in accordance with the size of a human.
  • This latitude in the "base line length" and "convergence angle" does not, however, extend to the "viewpoint position" and "gaze direction". That is, the "viewpoint position" and "gaze direction" of the user are mandatory whenever VRML data is rendered, whether as a two-dimensional image or as a stereoscopic viewing image.
  • The "base line length" and "convergence angle" are not required when only one two-dimensional image is generated.
  • When a stereoscopic viewing image is to be rendered, however, the "base line length" and "convergence angle" are required.
  • the viewer apparatus of this embodiment automatically assigns a base line length and convergence angle to externally received VRML data without the intervention of the user, in consideration of the fact that the VRML data is not assigned any base line length and convergence angle.
  • Fig. 11 explains the data flow when the viewer apparatus shown in Fig. 2 acquires VRML data from a WWW site via a WWW browser, assigns a base line length and convergence angle to that VRML data, generates right and left eye images 105 and 104 at an arbitrary viewpoint position, and displays the generated images on the stereoscopic display 10.
  • a viewer application program 100 is a helper application program of a WWW browser 101 that runs under the control of an operating system (OS) 102.
  • the viewer apparatus 1000 of this embodiment has the storage device 16 for storing the processing sequence (Fig. 12) of the viewer application program 100, virtual space data (three-dimensional space data), and processing contents (space operation parameters and stereoscopic viewing parameters), as shown in Fig. 2.
  • the virtual space data (i.e., three-dimensional space data) shown in Fig. 2 corresponds to VRML data shown in Fig. 11.
  • The space operation parameters shown in Fig. 2 are data which are required in practice for stereoscopic viewing display and are input from a predetermined user interface to express the forward speed (or backward speed), field angle, and the like; they are denoted by 110 in Fig. 11.
  • the stereoscopic viewing parameters in Fig. 2 are those generated by the viewer application program 100 such as the base line length, convergence angle, and the like, and are denoted by 111 in Fig. 11.
  • In Fig. 11, the user starts the WWW browser 101, which acquires VRML data from a WWW site 200.
  • Three-dimensional data described in VRML carries a predetermined file extension.
  • the viewer application program 100 is registered in advance as a helper application program of the WWW browser 101. More specifically, the viewer application program 100 assigns an initial value b 0 of the base line length and an initial value ⁇ 0 of the convergence angle to three-dimensional space data with the predetermined file extension passed from the WWW browser 101, and provides a user interface environment for changing space operation data including the base line length b, convergence angle ⁇ , and the like.
  • Fig. 12 shows the control sequence of the viewer application program 100.
  • When the WWW browser 101 detects that data received from the WWW site 200 has the predetermined file extension, it passes that VRML data to the viewer application program 100.
  • the viewer application program 100 receives the VRML data in step S2.
  • In step S4, initial values (b_0 and θ_0) of the stereoscopic viewing parameters (base line length b and convergence angle θ) are determined from the received VRML data. This determination scheme will be explained later.
  • the determined initial values are stored in a stereoscopic viewing parameter storage area 111.
  • In step S6, the right and left eye images 105 and 104 are generated using equation (1) above in accordance with the determined initial values (b_0 and θ_0) of the base line length and convergence angle, and are passed to the WWW browser 101.
  • the WWW browser 101 or operating system 102 synthesizes these images and displays the synthesized image on the display 10 (step S8).
  • This synthesis scheme uses the stencil buffer 12, and is one of the characteristic features of the viewer apparatus 1000 of this embodiment, as will be described in detail later.
  • the user who observes the display screen may change the base line length, field angle, or convergence angle, or may make walkthrough operation such as forward or backward movement, rotation, or the like.
  • the operation for changing the base line length, or convergence angle is called stereoscopic viewing parameter changing operation in this embodiment.
  • the walkthrough operation such as changing in field angle, forward or backward movement, rotation, or the like is called interaction operation.
  • If it is determined that the data input in step S10 indicates end of operation (step S12), the viewer application program 100 ends.
  • In step S22, processing corresponding to that interaction operation is executed.
  • a stereoscopic viewing parameter associated with the operation is changed, i.e., the contents in the storage area 111 are changed in step S20.
  • The initial value b_0 of the base line length is determined as follows.
  • The viewer application program estimates the initial value b_0 of the base line length from the target space data alone. Note that obtaining stereoscopic viewing parameters from space data to which no stereoscopic viewing parameters are assigned has not been done conventionally.
  • The size of the entire space is estimated, and the product of this estimated size and an appropriate constant k is calculated to obtain the base line length initial value b_0.
  • This determination process means that "a size obtained by multiplying the size of the entire space by a given constant larger than 0 is defined as the size of a human" (the base line length of a standard person is 65 mm).
  • Let (X_i, Y_i, Z_i) be the point group of the VRML data.
  • The maximum values (X_max, Y_max, Z_max) and minimum values (X_min, Y_min, Z_min) of X_i, Y_i, and Z_i are obtained.
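The estimation described above might be sketched as follows. Using the bounding-box diagonal as the "size of the entire space" and the default value of k are assumptions for illustration; the patent only specifies the product of an estimated size and an appropriate constant k:

```python
def estimate_base_line(points, k=0.01):
    """Estimate the base line length initial value b_0 from the VRML
    point group alone: find the axis-aligned bounding box of all points,
    take its diagonal as the size of the entire space, and scale by k."""
    xs, ys, zs = zip(*points)
    dx = max(xs) - min(xs)
    dy = max(ys) - min(ys)
    dz = max(zs) - min(zs)
    size = (dx * dx + dy * dy + dz * dz) ** 0.5   # bounding-box diagonal
    return k * size
```

Because b_0 scales with the space itself, the user perceives the target space at a consistent apparent size whether it describes a molecule or a street.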
  • the user can perceive the target space expressed by the VRML data with a given size independently of the size expressed by the target space.
  • the size of the entire space does not always serve as a reference for the size of a human
  • desired stereoscopic sense may not always be obtained by the base line length determined as described above.
  • the above-mentioned method presents it as an object having a size shown in Fig. 15 to the user when the size of a human is fixed at a given value, and the presented size does not always match the user's requirement.
  • the base line length obtained by the above-mentioned method gives a field of view in which the entire target space can be seen.
  • It is possible for the present invention to set the base line length initial value b_0 at the size of a human eye, but such a setup is often not preferable.
  • Nevertheless, the scheme of this embodiment based on equation (3) allows the base line length initial value b_0 to be set at the size of a human eye.
  • the initial value ⁇ 0 of the convergence angle is determined as follows.
  • the convergence angle ⁇ is produced when both eyes point inwardly upon observing a close object.
  • The initial value θ_0 of the convergence angle does not always provide the desired stereoscopic sense. However, a stereoscopic viewing parameter that can at least give stereoscopic sense can be set.
  • The initial values of the stereoscopic viewing parameters are not optimal in every case, as described above. Even when the initial values are optimal, it may be desirable to change these parameters as a result of walkthrough. For example, assume that the observer walks toward a given building in the virtual space of a street and approaches the key hole in the door knob of one door of that building (then enters that key hole). In such a case, since the right and left images cease to merge from a given point (the deviation between the right and left images becomes too large), stereoscopic sense is lost. For this reason, the stereoscopic viewing parameters must be dynamically changeable in correspondence with the observer's operation. In the aforementioned example, the base line length must be reduced (to reduce the size of a human so that the observer can enter the key hole).
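One possible way to make the base line length track such walkthrough operation is to shrink it with the distance to the object of interest, so the right and left images keep fusing as the viewer closes in on a small feature like the key hole. The linear rule and the names below are purely assumptions; the patent only requires that the parameter be dynamically changeable:

```python
def adapt_base_line(b, distance_to_target, threshold):
    """Return an adjusted base line length: unchanged while the viewer is
    farther than `threshold` from the target, then scaled down linearly
    with distance so the stereo pair continues to merge up close."""
    if distance_to_target >= threshold:
        return b
    return b * distance_to_target / threshold
```

Halving the effective "size of a human" this way halves the disparity between the two rendered images at a given depth.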
  • the viewer application program 100 also provides a user interface for changing the parameters and inputting interactions.
  • Fig. 16 shows a user interface (GUI) displayed on the screen of the display 10.
  • reference numerals 200 and 201 denote user interface tools (BASE LINE LENGTH) for changing the base line length b, both of which are implemented by slide bars.
  • the slide bar 200 is used for increasing the base line length
  • the slide bar 201 is used for decreasing the base line length.
  • Assume the base line length initial value b_0 is set to obtain the virtual space shown in Fig. 14.
  • A display like that shown in Fig. 15 is then obtained by increasing the base line length using the slide bar 200.
  • slide bars 202, 203 are used for changing the forward speed and backward speed (SPEED), respectively.
  • the slide bar 202 is used for effecting a large speed change, while the bar 203 is for effecting a small change.
  • a slide bar 204 is used for changing the convergence angle (View Convergence). Furthermore, a slide bar 205 is used for changing the field angle (Field of View).
  • The user interfaces shown in Fig. 16 are operated by the user using the mouse 15.
  • tools of the user interface are not limited to slide bars.
  • Tools that can continuously change parameters, e.g., graphical jog dials and graphical wheels, may be used.
  • The right and left eye images to be displayed on the display 10 to obtain stereoscopic viewing must be stored in a frame buffer. Since the stereoscopic display 10 used in this embodiment is a rear-cross lenticular spectacle-less 3D display, pairs of right and left stereo images for stereoscopic viewing are alternately displayed in horizontal stripe patterns, as shown in Fig. 4. For this purpose, one frame buffer must alternately store one stripe of the left eye image and one stripe of the right eye image, as shown in Fig. 17.
  • the buffer control of this embodiment requires only one frame buffer, and can obviate the need for reading out data from the frame buffer.
  • the viewer apparatus 1000 of this embodiment comprises the frame buffer 11 and stencil buffer 12.
  • the stencil buffer 12 is a buffer memory which is equipped as an option in some image processing apparatuses.
  • The stencil buffer 12, which is a part of the frame buffer system, is a memory having a 1-bit depth and the same numbers of bits as the display 10 in the horizontal and vertical directions.
  • the stencil buffer 12 is used as a mask, as shown in Fig. 20. More specifically, bits "1" serve as a mask, and bits "0" do not serve as a mask. Thus, the stencil buffer functions as a mask that masks a horizontal line every two lines.
  • Fig. 21 is a flow chart showing the data write sequence in the frame buffer.
  • In step S70, an image obtained when the three-dimensional space is viewed by the right eye is generated: three-dimensional space data is rendered on the basis of the viewpoint position and direction of the right eye. The right eye image as a rendering result is stored in the frame buffer 11; at this time, data of the right eye image is written at all the addresses of the frame buffer 11.
  • In step S72, a mask pattern shown in Fig. 20 is prepared and written in the stencil buffer 12.
  • Alternatively, lines may be drawn into the stencil buffer so that they form a masking pattern as shown in Fig. 20.
  • In step S74, a left eye image is generated and is overwritten into the frame buffer.
  • the stencil buffer 12 functions so that the right eye image remains stored at those pixel positions in the frame buffer 11 which correspond to "0" positions in the stencil buffer 12, and the left eye image is written at those pixel positions in the frame buffer 11 which correspond to "1" positions in the stencil buffer 12. With this operation, stripes shown in Fig. 17 are obtained.
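The Fig. 21 write sequence can be simulated with plain Python lists (no real stencil hardware; names are illustrative). The mask semantics follow the description above: bit "1" lets the left eye image through, bit "0" preserves the right eye image already in the frame buffer, and no data is ever read back out of it:

```python
def write_stereo_with_stencil(frame, left_image, stencil):
    """frame: list of scanlines already holding the right eye image
    (step S70).  left_image: the left eye rendering (step S74).
    stencil: one mask bit per scanline (step S72).  The left eye image
    overwrites only masked-in lines; everything else is left untouched,
    so only writes to the frame buffer occur."""
    for y, bit in enumerate(stencil):
        if bit:                      # stencil bit "1": write left eye stripe
            frame[y] = left_image[y]
    return frame
```

The result is exactly the interleaved stripe layout of Fig. 17, produced with a single frame buffer.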
  • the viewer application program of the present invention is not limited to the helper application program of the WWW browsers. It may be implemented as a plug-in program in the browsers, and it can also serve as a standalone program as long as a stereoscopic image is to be displayed.
  • the initial values of the stereoscopic viewing parameters can also be determined by schemes other than the aforementioned schemes. Whether or not the determined initial values of the parameters make the user feel incoherent depends on applications. Hence, various constant values k for determining initial values may be prepared in advance in the viewer application program in units of application programs, and different constant values may be used in correspondence with application programs.
  • the user interface of the present invention need not always be displayed on the display, and any other user interfaces may be used as long as the user can change parameters and the like continuously or discretely.
  • Bits "0" and "1" of the mask information may be reversed. Also, bits "0" and "1" may be stored in correspondence with the layout order of the image format on the LCD 20 of the stereoscopic display 10.
  • Alternatively, the left eye image may be written first. Whether the mask functions assigned to bits "0" and "1" of the mask information are reversed may be determined in correspondence with the layout of the LCD 20 of the stereoscopic display 10.
  • parameters can be assigned to three-dimensional space data in which no stereoscopic viewing parameters are defined.
  • the user interface apparatus that can freely change stereoscopic viewing parameters can be provided.
  • buffer control for efficient stereoscopic viewing display can be realized using only one frame buffer.
  • The present invention can be implemented by a computer program operating on a standard desktop computer.
  • An aspect of the present invention thus provides a storage medium storing processor implementable instructions for controlling a processor to carry out the method as hereinabove described.
  • the computer program can be obtained in electronic form for example by downloading the code over a network such as the internet.
  • A further aspect provides an electrical signal carrying processor implementable instructions for controlling a processor to carry out the method as hereinbefore described.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
EP99301826A 1998-06-30 1999-03-11 Appareil de traitement d'images pour afficher des images tridimensionnelles Withdrawn EP0969418A3 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP18513598 1998-06-30
JP18513598A JP3420504B2 (ja) 1998-06-30 1998-06-30 Information processing method

Publications (2)

Publication Number Publication Date
EP0969418A2 true EP0969418A2 (fr) 2000-01-05
EP0969418A3 EP0969418A3 (fr) 2001-05-02

Family

ID=16165494

Family Applications (1)

Application Number Title Priority Date Filing Date
EP99301826A EP0969418A3 (fr) 1998-06-30 1999-03-11 Appareil de traitement d'images pour afficher des images tridimensionnelles

Country Status (3)

Country Link
US (1) US6760020B1 (fr)
EP (1) EP0969418A3 (fr)
JP (1) JP3420504B2 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001084852A1 (fr) * 2000-05-03 2001-11-08 Koninklijke Philips Electronics N.V. Autostereoscopic display driver circuit
WO2002073981A1 (fr) 2001-03-09 2002-09-19 Koninklijke Philips Electronics N.V. User-controllable autostereoscopic image display system
EP1501317A1 (fr) * 2002-04-25 2005-01-26 Sharp Kabushiki Kaisha Image data creation device, image data reproduction device, and image data recording medium
WO2008011888A1 (fr) * 2006-07-24 2008-01-31 Seefront Gmbh Autostereoscopic system
DE102008025103A1 (de) * 2008-05-26 2009-12-10 Technische Universität Berlin Method for producing an autostereoscopic presentation and arrangement for an autostereoscopic presentation
EP2448275A3 (fr) * 2009-11-23 2012-06-06 Samsung Electronics Co., Ltd. GUI providing method, display apparatus and 3D image providing system using the same
EP2658268A3 (fr) * 2012-04-25 2014-03-19 Samsung Electronics Co., Ltd Apparatus and method for displaying a stereoscopic image in an electronic device
CN105373374A (zh) * 2014-08-30 2016-03-02 上海爱护网贸易有限公司 Method and system for displaying a 3D venue on the web
CN105373375A (zh) * 2014-08-30 2016-03-02 上海爱护网贸易有限公司 Web3D display method and system based on asynchronous loading

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
KR20010104493A (ko) * 2000-05-02 2001-11-26 성필문 Stereoscopic image providing method using a data communication network and processing apparatus therefor
JP2002092657A (ja) * 2000-09-12 2002-03-29 Canon Inc Stereoscopic display control apparatus, method, and storage medium
CA2328795A1 (fr) 2000-12-19 2002-06-19 Advanced Numerical Methods Ltd. Improvements to the applications and performance of detail-in-context viewing technology
CA2345803A1 (fr) 2001-05-03 2002-11-03 Idelix Software Inc. User interface elements for flexible display technology applications
US8416266B2 (en) 2001-05-03 2013-04-09 Noregin Assetts N.V., L.L.C. Interacting with detail-in-context presentations
US9760235B2 (en) 2001-06-12 2017-09-12 Callahan Cellular L.L.C. Lens-defined adjustment of displays
US7213214B2 (en) 2001-06-12 2007-05-01 Idelix Software Inc. Graphical user interface with zoom for detail-in-context presentations
US7084886B2 (en) 2002-07-16 2006-08-01 Idelix Software Inc. Using detail-in-context lenses for accurate digital image cropping and measurement
JP3594915B2 (ja) * 2001-08-03 2004-12-02 Namco Ltd Program, information storage medium, and game device
CA2361341A1 (fr) 2001-11-07 2003-05-07 Idelix Software Inc. Using detail-in-context presentation on stereoscopic images
CA2370752A1 (fr) 2002-02-05 2003-08-05 Idelix Software Inc. Fast rendering of raster images distorted by a pyramid lens
US8120624B2 (en) 2002-07-16 2012-02-21 Noregin Assets N.V. L.L.C. Detail-in-context lenses for digital image cropping, measurement and online maps
CA2393887A1 (fr) 2002-07-17 2004-01-17 Idelix Software Inc. User interface improvements for detail-in-context data presentation
CA2406131A1 (fr) 2002-09-30 2004-03-30 Idelix Software Inc. Graphical user interface employing detail-in-context folding
CA2449888A1 (fr) 2003-11-17 2005-05-17 Idelix Software Inc. Navigating large images using ultra-wide-angle detail-in-context rendering techniques
CA2411898A1 (fr) 2002-11-15 2004-05-15 Idelix Software Inc. Method and system for controlling access to detail-in-context presentations
EP1570683A1 (fr) 2002-11-21 2005-09-07 Critical alignment of parallax images for autostereoscopic display
JP2004199496A (ja) 2002-12-19 2004-07-15 Sony Corp Information processing apparatus and method, and program
KR101101570B1 (ko) * 2003-12-19 2012-01-02 TDVision Corp S.A. de C.V. Three-dimensional video game system
JP2007525907A (ja) 2004-02-27 2007-09-06 TDVision Corp S.A. de C.V. System and method for digital decoding of stereoscopic 3D video images
US7486302B2 (en) 2004-04-14 2009-02-03 Noregin Assets N.V., L.L.C. Fisheye lens graphical user interfaces
US8106927B2 (en) 2004-05-28 2012-01-31 Noregin Assets N.V., L.L.C. Graphical user interfaces and occlusion prevention for fisheye lenses with line segment foci
US9317945B2 (en) 2004-06-23 2016-04-19 Callahan Cellular L.L.C. Detail-in-context lenses for navigation
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7714859B2 (en) 2004-09-03 2010-05-11 Shoemaker Garth B D Occlusion reduction and magnification for multidimensional data presentations
US7542034B2 (en) 2004-09-23 2009-06-02 Conversion Works, Inc. System and method for processing video images
US7995078B2 (en) 2004-09-29 2011-08-09 Noregin Assets, N.V., L.L.C. Compound lenses for multi-source data presentation
US7580036B2 (en) 2005-04-13 2009-08-25 Catherine Montagnese Detail-in-context terrain displacement algorithm with optimizations
US8031206B2 (en) 2005-10-12 2011-10-04 Noregin Assets N.V., L.L.C. Method and system for generating pyramid fisheye lens detail-in-context presentations
US7983473B2 (en) 2006-04-11 2011-07-19 Noregin Assets, N.V., L.L.C. Transparency adjustment of a presentation
JP4662071B2 (ja) * 2006-12-27 2011-03-30 Fujifilm Corp Image reproduction method
US8655052B2 (en) * 2007-01-26 2014-02-18 Intellectual Discovery Co., Ltd. Methodology for 3D scene reconstruction from 2D image sequences
US8274530B2 (en) 2007-03-12 2012-09-25 Conversion Works, Inc. Systems and methods for filling occluded information for 2-D to 3-D conversion
US20080225042A1 (en) * 2007-03-12 2008-09-18 Conversion Works, Inc. Systems and methods for allowing a user to dynamically manipulate stereoscopic parameters
US9026938B2 (en) 2007-07-26 2015-05-05 Noregin Assets N.V., L.L.C. Dynamic detail-in-context user interface for application access and content access on electronic displays
JP5596914B2 (ja) * 2008-09-16 2014-09-24 Fujitsu Ltd Terminal device, display control method, and program having a display function
JP5409107B2 (ja) * 2009-05-13 2014-02-05 Nintendo Co Ltd Display control program, information processing apparatus, display control method, and information processing system
US20100328428A1 (en) * 2009-06-26 2010-12-30 Booth Jr Lawrence A Optimized stereoscopic visualization
JP4482657B1 (ja) * 2009-09-25 2010-06-16 Bunkyo University Stereo viewer for automatically converting three-dimensional content into stereoscopic content
JP5405264B2 (ja) 2009-10-20 2014-02-05 Nintendo Co Ltd Display control program, library program, information processing system, and display control method
JP4754031B2 (ja) 2009-11-04 2011-08-24 Nintendo Co Ltd Display control program, information processing system, and program used for controlling stereoscopic display
EP2355526A3 (fr) 2010-01-14 2012-10-31 Nintendo Co., Ltd. Computer-readable storage medium having a display control program stored thereon, display control apparatus, display control system, and display control method
US9693039B2 (en) 2010-05-27 2017-06-27 Nintendo Co., Ltd. Hand-held electronic device
JP2012217591A (ja) * 2011-04-07 2012-11-12 Toshiba Corp Image processing system, apparatus, method, and program
KR101824501B1 (ko) * 2011-05-19 2018-02-01 Samsung Electronics Co., Ltd. Apparatus and method for controlling image display of a head-mounted display device
JP5236051B2 (ja) * 2011-08-02 2013-07-17 TDVision Corp S.A. de C.V. 3D video game system
TW201326902A (zh) * 2011-12-29 2013-07-01 Ind Tech Res Inst Stereoscopic display system and image display method thereof
KR20140063272A (ko) * 2012-11-16 2014-05-27 LG Electronics Inc Image display apparatus and operation method thereof
US10990996B1 (en) * 2017-08-03 2021-04-27 Intuit, Inc. Predicting application conversion using eye tracking

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5020878A (en) 1989-03-20 1991-06-04 Tektronix, Inc. Method and apparatus for generating a binocular viewing model automatically adapted to a selected image
US5675746A (en) * 1992-09-30 1997-10-07 Marshall; Paul S. Virtual reality generator for use with financial information
US5590062A (en) * 1993-07-02 1996-12-31 Matsushita Electric Industrial Co., Ltd. Simulator for producing various living environments mainly for visual perception
US5581271A (en) * 1994-12-05 1996-12-03 Hughes Aircraft Company Head mounted visual display
US5805868A (en) * 1995-03-24 1998-09-08 3Dlabs Inc. Ltd. Graphics subsystem with fast clear capability
US5841409A (en) * 1995-04-18 1998-11-24 Minolta Co., Ltd. Image display apparatus
US5910817A (en) * 1995-05-18 1999-06-08 Omron Corporation Object observing method and device
CN100364009C (zh) * 1995-08-21 2008-01-23 Matsushita Electric Industrial Co., Ltd. Reproduction apparatus and recording method
JP3575902B2 (ja) 1996-01-24 2004-10-13 Taito Corp Device for adjusting the binocular-image incidence positions and convergence point of a game machine displaying twin-lens stereoscopic video
US6072443A (en) * 1996-03-29 2000-06-06 Texas Instruments Incorporated Adaptive ocular projection display
US5808588A (en) * 1996-04-25 1998-09-15 Artificial Parallax Electronics Corp. Shutter synchronization circuit for stereoscopic systems
EP0817123B1 (fr) * 1996-06-27 2001-09-12 Kabushiki Kaisha Toshiba Système et procédé d'affichage stéréoscopique
US6154600A (en) * 1996-08-06 2000-11-28 Applied Magic, Inc. Media editor for non-linear editing system
JPH10154059A (ja) 1996-09-30 1998-06-09 Sony Corp Image display processing apparatus, image display processing method, and information providing medium
US5796373A (en) * 1996-10-10 1998-08-18 Artificial Parallax Electronics Corp. Computerized stereoscopic image system and method of using two-dimensional image for providing a view having visual depth
US6343987B2 (en) * 1996-11-07 2002-02-05 Kabushiki Kaisha Sega Enterprises Image processing device, image processing method and recording medium
US5914719A (en) * 1996-12-03 1999-06-22 S3 Incorporated Index and storage system for data provided in the vertical blanking interval
JP4251673B2 (ja) * 1997-06-24 2009-04-08 Fujitsu Ltd Image presentation apparatus
US6057840A (en) * 1998-03-27 2000-05-02 Sony Corporation Of Japan Computer-implemented user interface having semi-transparent scroll bar tool for increased display screen usage
US6160909A (en) * 1998-04-01 2000-12-12 Canon Kabushiki Kaisha Depth control for stereoscopic images
US6215485B1 (en) * 1998-04-03 2001-04-10 Avid Technology, Inc. Storing effects descriptions from a nonlinear editor using field chart and/or pixel coordinate data for use by a compositor
US6188406B1 (en) * 1998-08-12 2001-02-13 Sony Corporation Single-item text window for scrolling lists

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2176671A (en) * 1985-05-31 1986-12-31 Electrocraft Consultants Limit Separation overlay
WO1996022660A1 (fr) * 1995-01-20 1996-07-25 Reveo, Inc. Intelligent system and associated method for creating and presenting multiplexed stereo images in virtual-reality environments
EP0817125A2 (fr) * 1996-06-26 1998-01-07 Matsushita Electric Industrial Co., Ltd. Apparatus for generating moving stereoscopic images
CA2214238A1 (fr) * 1996-08-30 1998-02-28 Arlen Anderson System and method for executing and managing operations at a feedlot
EP0827350A2 (fr) * 1996-09-02 1998-03-04 Canon Kabushiki Kaisha Stereoscopic image display apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HIBBARD E. ET AL: 'On the Theory and Application of Stereographics in Scientific Visualization' EUROGRAPHICS 91 TECHNICAL REPORT SERIES - STATE OF THE ART REPORTS vol. EG-91-STAR, 1991, pages 1 - 21, XP000243006 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001084852A1 (fr) * 2000-05-03 2001-11-08 Koninklijke Philips Electronics N.V. Drive circuit for an autostereoscopic display
WO2002073981A1 (fr) 2001-03-09 2002-09-19 Koninklijke Philips Electronics N.V. User-controllable autostereoscopic image display system
EP1501317A1 (fr) * 2002-04-25 2005-01-26 Sharp Kabushiki Kaisha Image data creation device, image data reproduction device, and image data recording medium
EP1501317A4 (fr) * 2002-04-25 2006-06-21 Sharp Kk Image data creation device, image data reproduction device, and image data recording medium
US7679616B2 (en) 2002-04-25 2010-03-16 Sharp Kabushiki Kaisha Image data generation apparatus for adding attribute information regarding image pickup conditions to image data, image data reproduction apparatus for reproducing image data according to added attribute information, and image data recording medium related thereto
JP2009544992A (ja) * 2006-07-24 2009-12-17 Seefront GmbH Autostereoscopic system
WO2008011888A1 (fr) * 2006-07-24 2008-01-31 Seefront Gmbh Autostereoscopic system
US8077195B2 (en) 2006-07-24 2011-12-13 Seefront Gmbh Autostereoscopic system
DE102008025103A1 (de) * 2008-05-26 2009-12-10 Technische Universität Berlin Method for producing an autostereoscopic presentation and arrangement for an autostereoscopic presentation
EP2448275A3 (fr) * 2009-11-23 2012-06-06 Samsung Electronics Co., Ltd. GUI providing method, and display apparatus and 3D image providing system using the same
US9307224B2 (en) 2009-11-23 2016-04-05 Samsung Electronics Co., Ltd. GUI providing method, and display apparatus and 3D image providing system using the same
EP2658268A3 (fr) * 2012-04-25 2014-03-19 Samsung Electronics Co., Ltd Apparatus and method for displaying a stereoscopic image in an electronic device
CN105373374A (zh) * 2014-08-30 2016-03-02 上海爱护网贸易有限公司 Method and system for displaying 3D venues on the web
CN105373375A (zh) * 2014-08-30 2016-03-02 上海爱护网贸易有限公司 Web3D display method and system based on asynchronous loading

Also Published As

Publication number Publication date
JP3420504B2 (ja) 2003-06-23
EP0969418A3 (fr) 2001-05-02
JP2000020757A (ja) 2000-01-21
US6760020B1 (en) 2004-07-06

Similar Documents

Publication Publication Date Title
US6760020B1 (en) Image processing apparatus for displaying three-dimensional image
US10096157B2 (en) Generation of three-dimensional imagery from a two-dimensional image using a depth map
US6677939B2 (en) Stereoscopic image processing apparatus and method, stereoscopic vision parameter setting apparatus and method and computer program storage medium information processing method and apparatus
JP4228646B2 (ja) Stereoscopic image generation method and stereoscopic image generation apparatus
US7907167B2 (en) Three dimensional horizontal perspective workstation
US20070291035A1 (en) Horizontal Perspective Representation
US9460555B2 (en) System and method for three-dimensional visualization of geographical data
US20050219240A1 (en) Horizontal perspective hands-on simulator
US20050264559A1 (en) Multi-plane horizontal perspective hands-on simulator
US20060126927A1 (en) Horizontal perspective representation
WO2015048906A1 (fr) Augmented reality system and method for positioning and mapping
EP1740998A2 (fr) Horizontal perspective hands-on simulator
US20050248566A1 (en) Horizontal perspective hands-on simulator
JP4406824B2 (ja) Image display device, pixel-data acquisition method, and program for executing the method
JP2003067784A (ja) Information processing apparatus
WO2017062730A1 (fr) Presentation of a virtual reality scene from a series of images
JP2003209769A (ja) Image generation apparatus and method
KR101212223B1 (ko) Imaging device and method for generating an image including depth information
KR20010047046A (ko) Method for generating stereoscopic images using a Z-buffer
JP2008287588A (ja) Image processing apparatus and method
Shimamura et al. Construction and presentation of a virtual environment using panoramic stereo images of a real scene and computer graphics models

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

17P Request for examination filed

Effective date: 20000605

RIC1 Information provided on ipc code assigned before grant

Free format text: 7G 06T 17/40 A, 7G 06T 15/20 B

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

AKX Designation fees paid

Free format text: AT BE CH DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: CANON KABUSHIKI KAISHA

17Q First examination report despatched

Effective date: 20030721

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20050909