WO2008059086A1 - System and method for viewing an enlarged image by applying augmented reality techniques - Google Patents

System and method for viewing an enlarged image by applying augmented reality techniques

Info

Publication number
WO2008059086A1
WO2008059086A1 (PCT/ES2007/000645)
Authority
WO
WIPO (PCT)
Prior art keywords
real
image
user
dimensional
zoom
Prior art date
Application number
PCT/ES2007/000645
Other languages
English (en)
Spanish (es)
Inventor
Jose Ignacio Torres Sancho
Maria Teresa LINAZA Saldaña
Original Assignee
The Movie Virtual, S.L.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Movie Virtual, S.L. filed Critical The Movie Virtual, S.L.
Publication of WO2008059086A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/40Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0127Head-up displays characterised by optical features comprising devices increasing the depth of field
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • This invention relates to a method and a system for viewing an enlarged scene, which make it possible to augment a real scene or image within the user's field of view with additional content.
  • The invention is mainly aimed at apparatuses for the tourism and leisure sector of the coin-operated binoculars type used to observe a sight, although it is also designed for other environments such as cultural institutions, fairs or nature tourism.
  • Binoculars placed on a hill let tourists observe the surrounding area for a few minutes after inserting a coin, offering a panoramic view of the buildings and streets of an urban area and of its natural and cultural resources. In some cases they also allow the user to zoom in on certain tourist attractions to see them more closely. This view can arouse tourists' interest in visiting those attractions later, helping them identify new points of interest for a subsequent visit. However, it is often difficult to find anything in the binoculars' field of vision other than nearby forests or the sky itself. Even when a potentially interesting tourist attraction has been found, that interest is often lost for lack of images and information about it. People are used to receiving information in a simple and entertaining way through channels such as the Internet or television, using hyperlinks to obtain multimedia content that responds to their requests.
  • The present invention satisfies the aforementioned needs by means of a system and a method for displaying an enlarged image with additional multimedia content.
  • The invention detects the position of a display device comprised in the system, augments the user's view with graphic representations of graphic objects retrieved from a database, and allows the user to interact with the multimedia information provided.
  • Another aspect of the invention provides a system and method for zooming in on the enlarged image, detecting the position of the display device and fixing the relative position of the graphic objects when zooming on the view.
  • Figure 1 shows a system according to an embodiment of the invention.
  • Figure 2 shows the functional scheme of the invention.
  • Figure 3 shows the operation of the system through an example of use of the invention.
  • Figure 4 shows the method of the invention whereby the three-dimensional graphic objects are placed in the virtual world.
  • Figure 5 shows an example of the method of the invention for zooming.
  • RA Augmented Reality
  • VA Augmented Virtuality
  • The invention is based on the application of Augmented Reality (RA) technologies to the traditional concept of binoculars in tourist environments; that is, the invention relates to a system that allows users to see a scene (for example, the view from a nearby hill) augmented by superimposing, using RA technologies, virtual information relative to the scene the user is observing.
  • Augmenting the scene enhances the user's entertainment experience, since the virtual information provides additional content to the scene (for example, arrows that indicate buildings and show their names).
  • The system also allows the user to interact with the virtual information and obtain additional content (for example, access to text files, images or videos related to a specific tourist resource).
  • The system allows the contents to be customized on the basis of different user profiles.
  • Figure 1 shows a typical system (1) according to an embodiment of the invention.
  • The system (1) basically comprises a real camera (2) that records a real image (6), or real-time image, from the point of view of the user (13), and sends this real image (6) to a processing unit (3).
  • The processing unit (3) includes a database (4) that stores three-dimensional graphic objects (5) - not shown - that will be superimposed on the real image (6) using RA methods.
  • The RA methods convert the three-dimensional graphic objects (5) into two-dimensional virtual representations (5'), which are then superimposed on the real image (6), forming the enlarged image (6').
  • The three-dimensional graphic objects (5) may include additional multimedia information (7) - not shown - capable of providing further and more complex information about them.
  • The system (1) also includes a display device (8) through which the user (13) can observe the enlarged image (6'), that is, the real image (6) to which the two-dimensional virtual representations (5') of the three-dimensional graphic objects (5) have been added.
  • The system (1) includes a tracking system (9) to detect the current position of the display device (8), a fundamental element in the augmentation process. As shown in Figure 1, the display device (8) and the real camera (2) are preferably mounted on the same mechanical axis so that they move together. In this way, the tracking system (9) calculates the position of the display device (8) and, therefore, the position of the real camera (2).
  • The system (1) also includes interaction means (11) that fulfill the user-interface function for the system (1).
  • The system (1) may include a payment system (12) based on the insertion of coins or other means of payment.
  • The real camera (2) can be any type of camera with "autofocus" lenses and optical zoom, so that the user (13) can increase or decrease the zoom to observe small or distant objects more clearly.
  • The real camera (2) must be controllable from an external control device capable of accessing and modifying the zoom and other settings or control parameters of the camera, for example through an RS-232C serial cable connection. Many cameras of this type are commercially available.
  • The tracking system (9) is preferably based on inertial sensors, although a large number of alternative positioning systems can be used effectively in other embodiments. Since the system (1) is expected to be moved mainly in two directions (right/left, up/down), the simplest tracking techniques provide sufficient accuracy. However, the invention can use any tracking system able to determine the position of the display device (8) robustly and send the coordinates to the processing unit (3).
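Since the tracker reports motion mainly as two angles (right/left and up/down), the position information sent to the processing unit can be reduced to a yaw/pitch pair converted into a view vector. A minimal sketch, assuming a simple yaw/pitch axis convention (the function name and convention are illustrative, not from the patent):

```python
import math

def orientation_vector(yaw_deg, pitch_deg):
    """Convert the two tracked angles (right/left = yaw, up/down = pitch)
    into a unit view vector for the graphics adaptation process."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),   # x: right/left component
            math.sin(pitch),                   # y: up/down component
            math.cos(pitch) * math.cos(yaw))   # z: forward component
```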
  • The display device (8), through which the user (13) perceives the enlarged image (6') that combines the real image (6), the two-dimensional virtual representations (5') of the three-dimensional graphic objects (5) and the additional multimedia content (7), is preferably a metaphor for conventional binoculars.
  • The display device (8) preferably comprises a display that is basically a non-transparent video display system, which may be non-stereoscopic, stereoscopic or autostereoscopic.
  • The display device (8) can be one of the virtual binoculars or telescopes with stereoscopic capability available on the market.
  • Semi-transparent and autostereoscopic devices may also be used to visualize the enlarged image (6').
  • The preferred embodiment of the invention includes seven buttons for interacting with the system (1) in a simple and ergonomic manner.
  • The buttons (11) are distributed between the left side (19) and the right side (20) of the system (1).
  • On the left side (19) of the system (1) there are two buttons for interacting with the enlarged image (6').
  • The user (13) can zoom in to enlarge the image with one of the buttons and zoom out to reduce it with the other.
  • On the right side (20) of the system (1) there are five buttons: a central button (preferably of one color) and four buttons around it (preferably of a color different from that of the central button).
  • The central button is the "enter" button, used to click on the two-dimensional virtual representations (5') of the three-dimensional graphic objects (5) and also to choose between the menus of additional multimedia content (7).
  • The surrounding buttons serve to move through the menus that allow the selection of additional multimedia content (7).
  • The system (1) preferably includes two different databases (4): a database for augmentation and a database of self-managed content.
  • The augmentation database includes the three-dimensional graphic objects (5) associated with the points of interest for the user (13), including the names of the main tourist attractions and other graphic objects.
  • The self-managed content database stores the additional multimedia content (7), including videos, movie clips, interactive 3D panoramas or even three-dimensional models of existing and non-existent tourist attractions.
  • The preferred embodiment of the invention also includes an authoring tool to simplify the manipulation of the three-dimensional graphic objects (5), their position in the enlarged image (6') and the additional multimedia content (7) they can display.
  • The program can work with XML files, modifying them or creating new ones suitable for the main program.
  • The preferred embodiment of the invention includes up to ten points of interest, each with a maximum of five sub-objects to choose from within the menu.
  • These three-dimensional graphic objects (5) include the points of tourist interest and their corresponding markers in the enlarged image (6'), while the sub-objects represent the different types of additional multimedia content (7) that are shown when the two-dimensional virtual representations (5') of the three-dimensional graphic objects (5) are chosen.
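The XML files handled by the authoring tool could describe each point of interest and its sub-objects. The patent does not disclose the actual schema, so the tag and attribute names below are purely illustrative assumptions; the sketch only shows how a main program might load such a file:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML for the augmentation database; the schema is an
# assumption, not the patent's actual file format.
POI_XML = """
<augmentation>
  <poi id="obj1" name="Cathedral" yaw="12.5" pitch="3.0">
    <content type="video" src="cathedral_history.mpg"/>
    <content type="text" src="cathedral_info.txt"/>
  </poi>
</augmentation>
"""

def load_pois(xml_text):
    """Parse points of interest (up to ten, each with up to five
    sub-objects) into plain dictionaries for the main program."""
    root = ET.fromstring(xml_text)
    return [{
        "id": poi.get("id"),
        "name": poi.get("name"),
        "yaw": float(poi.get("yaw")),
        "pitch": float(poi.get("pitch")),
        "contents": [c.get("src") for c in poi.iter("content")],
    } for poi in root.iter("poi")]
```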
  • Figure 2 shows a functional scheme of the system (1) of Figure 1.
  • The real camera (2) captures the point of view of the user (13), that is, it records a real-time video image called the real image (6).
  • The tracking system (9) calculates and sends position information (14) to the processing unit (3), reporting the current location and orientation of the display device (8).
  • The processing unit (3) then performs a graphics adaptation process (15), which converts the stored three-dimensional graphic objects (5) into the two-dimensional virtual representations (5') on the basis of an orientation vector obtained from the position information (14).
  • The processing unit (3) adapts a view of the three-dimensional graphic objects (5) to obtain a virtual scene, controlling a "virtual camera" that uses the same orientation vector as the real camera (2); that is, the virtual camera is oriented exactly like the real camera (2). The real scene (6) and the two-dimensional virtual representations (5') of the three-dimensional graphic objects (5) are therefore synchronized with the real world. This synchronization makes it possible to compose the real and virtual scenes of the enlarged image (6').
  • The processing unit (3) performs an augmentation process (16) in which the real scene (6) is augmented with the two-dimensional virtual representations (5') of the three-dimensional graphic objects (5) to obtain the augmented image (6'), which is sent to the display device (8).
  • When the user (13) looks through the display device (8), he can see the enlarged image (6'). If the user (13) rotates the system (1) or makes any movement that changes the position of the display device (8), the tracking system (9) reports the change to the processing unit (3).
  • The graphics adaptation process (15) updates the two-dimensional virtual representations (5') of the three-dimensional graphic objects (5) so that they coincide with the new real image (6). Therefore, when the user (13) turns the system (1) or changes its position, the enlarged image (6') changes in its entirety; that is, both the real image (6) and the two-dimensional virtual representations (5') adapt to that change.
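The cycle of Figure 2 can be sketched as a single update step: the virtual camera adopts the tracked pose, the three-dimensional objects are re-projected, and the overlays are composed with the current frame. The data structures and callables below are stand-ins (assumptions), not the patent's implementation:

```python
def augmentation_step(tracker_pose, grab_frame, objects, project):
    """One iteration of the augmentation cycle of Figure 2."""
    # 1. The virtual camera adopts the pose reported by the tracking
    #    system (9), so it shares the real camera's orientation.
    virtual_pose = tracker_pose
    # 2. Capture the real image (6).
    frame = grab_frame()
    # 3. Graphics adaptation (15): 3-D objects (5) -> 2-D representations (5').
    overlays = [project(obj, virtual_pose) for obj in objects]
    # 4. Augmentation (16): frame plus overlays form the enlarged image (6').
    return {"frame": frame, "overlays": overlays}
```

Whenever the tracking system reports a new pose, calling the step again re-renders the whole enlarged image, matching the update behavior described above.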
  • Figure 3 shows the operation of the invention using an example of use of the system (1).
  • The graphic at the top shows an example of an enlarged image (6') seen by the user (13).
  • This enlarged image (6') is composed of the real image (6) of a town (17) and some mountains (18), in which one of the buildings and one of the mountains are indicated by the corresponding two-dimensional virtual representations (5').
  • When the user (13) discovers an interesting object about which he wishes to gather more information - for example, the high-rise building in the center of the screen marked with the text "obj1" - he acts on the interaction means (11) to select the two-dimensional virtual representation (5') of said object.
  • The system (1) then displays the additional multimedia content (7) related to said two-dimensional virtual representation (5').
  • The system (1) can show a screen displaying some interesting text and a video of a person explaining certain characteristics of the building, and/or offer new navigation options.
  • The system (1) can also offer personalized information, so that different contents can be shown to different user profiles, covering aspects such as multilingualism. For example, an English-speaking tourist with a cultural profile can be one type of user profile; this user will receive more complete additional information, in English, about the history of the selected building than that received by other profiles.
  • Figure 4 shows the method according to the invention whereby the three-dimensional graphic objects (5) are placed in the virtual world using a spherical environment.
  • The registration information (24), which determines the placement of the three-dimensional graphic objects (5) in the enlarged image (6'), is defined using the authoring tool.
  • The processing unit (3) receives the registration information (24) of the different three-dimensional graphic objects (5) and the position information (14) of the display device (8) in order to carry out the graphics adaptation process (15).
  • A virtual world (21) is constructed, that is, a three-dimensional model in which the three-dimensional graphic objects (5) are placed in their previously defined spatial positions.
  • A virtual camera (22) is placed with respect to the three-dimensional graphic objects (5) on the basis of the position information (14) of the real camera (2).
  • The three-dimensional graphic objects (5) are placed at virtual points that simulate the distance of the real objects from the point of view of the user (13).
  • The graphics adaptation process (15), which turns the three-dimensional graphic objects (5) into the two-dimensional virtual representations (5'), is based on the angle by which the three-dimensional graphic objects (5) deviate from a zero-reference view vector, which represents the center of a coordinate system.
  • The graphics adaptation process (15) converts the three-dimensional graphic objects (5) into their respective two-dimensional virtual representations (5').
  • The result is the placement and adaptation of the two-dimensional virtual representations (5') so that they give the user (13) a feeling of depth.
  • The magnification process (16) combines the two-dimensional virtual representations (5') with the real image (6) recorded by the real camera (2) to obtain the enlarged image (6').
  • The user (13) can thus enjoy the landscape and the information provided by the two-dimensional virtual representations (5').
  • The process of transforming the spatial coordinates of the three-dimensional graphic objects (5) in a real environment into the two-dimensional coordinates that define the pixels displayed on the display device (8) is a key element of the present invention.
  • This process is carried out by applying an innovative approach: building a "virtual world" and capturing an image of said "virtual world" in the same way that the real camera (2) records the real image (6).
  • The augmented image (6') superimposes the real world and the virtual world to achieve the effect of magnification, bringing both worlds into a new "augmented world".
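The transformation from angular offsets to pixel coordinates can be illustrated with a standard pinhole projection. This is a generic sketch consistent with the description above, not the patent's disclosed algorithm; the 48° field of view is the value the document later gives for the preferred embodiment:

```python
import math

def project_to_pixels(d_yaw, d_pitch, fov_deg=48.0, width=640, height=480):
    """Map an object's angular offset (degrees) from the camera's view
    vector to pixel coordinates. Offsets beyond the field of view simply
    fall outside the frame."""
    # focal length in pixels for the given horizontal field of view
    f = (width / 2) / math.tan(math.radians(fov_deg / 2))
    x = width / 2 + f * math.tan(math.radians(d_yaw))    # right of center
    y = height / 2 - f * math.tan(math.radians(d_pitch)) # above center
    return round(x), round(y)
```

Note that zooming in (a smaller `fov_deg`) increases the focal length, so the same angular offset maps farther from the center; this is why the virtual camera's view angle must track the real camera's zoom.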
  • The invention allows zooming in on the enlarged image (6').
  • When zooming, the two-dimensional virtual representations (5') are not only rescaled but also repositioned, so that they remain placed on the augmented objects. This means that, when zooming in, the invention guarantees that the views of the real camera (2) and the virtual camera (22) are aligned at all times.
  • The user (13) activates the zoom process by interacting with the two buttons on the left side (19) of the system (1).
  • The buttons are connected to the processing unit (3), so that when the user (13) presses them, the processing unit (3) modifies the zoom of the virtual camera (22) and sends a command to the real camera (2) to change its zoom control parameters synchronously.
  • The adjustment of the control parameters of the real camera (2) and the virtual camera (22) must be done dynamically, in real time, while the system (1) is operating, in order to align the real image (6) and the two-dimensional virtual representations (5') correctly. Because of the mechanical zoom system of the real camera (2), there is a delay between the adjustment of the zoom value of the virtual camera (22) and the mechanical update of the zoom of the real camera (2); this delay between two zoom positions is perceived by the user (13). Another mechanical limitation of the real camera (2) is that its zoom speed is not linear, owing to the acceleration and deceleration at the beginning and the end of the movement.
  • The invention proposes a method for updating and adjusting the zoom values of the real (2) and virtual (22) cameras, an example of which is shown in Figure 5, so that both values coincide at all times despite the mechanical limitations and delays of the real camera (2).
  • The initial viewing angle of both the real (2) and the virtual (22) camera is set at 48°.
  • When the user (13) presses one of the buttons on the left side (19) of the system (1) to zoom in, enlarging the image, the virtual camera (22) and the real camera (2) begin to zoom in.
  • There are some delays in the real camera (2) in following the zoom process; that is, the real camera (2) zooms more slowly than the virtual camera (22).
  • The processing unit (3) receives an adjustment command (25) from the virtual camera (22) indicating its instantaneous zoom value.
  • The processing unit (3) sets the zoom of the real camera (2) to the zoom value of the virtual camera (22).
  • The zoom value of the virtual camera (22) continues to change, so this process is repeated until the user (13) stops the zoom (until a time t, according to Figure 5, elapses).
  • The adjustment between the zooms of the real camera (2) and the virtual camera (22) is thus performed in a discrete, iterative manner, so that the relatively slower real camera (2) follows the instantaneous zoom of the virtual camera (22).
  • The zoom of the real camera (2) can usually adopt only a set of discrete values, owing to its mechanical configuration.
  • The processing unit (3) receives information on the specific zoom value of the virtual camera (22), for example 45°.
  • The processing unit (3) then tries to fix the zoom of the real camera (2) at the value of 45°.
  • The zoom of the real camera (2) reaches this specific value with an error "e" (positive or negative). Because of this error, at the end of the process the processing unit (3) must read the final zoom value of the real camera (2), which will be slightly different from the final zoom value of the virtual camera (22).
  • The processing unit (3) sends a tuning command (26) to the virtual camera (22), so that the final value of the virtual camera (22) is updated with the final zoom value of the real camera (2).
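The discrete, iterative adjustment and the final tuning step can be sketched as follows. The `QuantizedCamera` stub stands in for the mechanically limited real camera (2); its 0.5° step size is an arbitrary assumption used only to reproduce the error "e":

```python
class QuantizedCamera:
    """Stand-in real camera (2): its zoom only takes discrete 0.5-degree
    steps, which is what produces the error "e" described above."""
    def __init__(self, angle=48.0):
        self.angle = angle
    def set_view_angle(self, target):
        self.angle = round(target * 2) / 2  # mechanical quantization
    def get_view_angle(self):
        return self.angle

def synchronize_zoom(virtual_zoom_steps, real_camera):
    """Drive the real camera through the virtual camera's instantaneous
    zoom values (adjustment commands (25)), then read back the real
    camera's final angle as the tuning command (26) for the virtual camera."""
    for target in virtual_zoom_steps:
        real_camera.set_view_angle(target)
    return real_camera.get_view_angle()
```

For example, if the virtual camera stops at 45.2°, this quantized real camera lands on 45.0°, and the virtual camera is then tuned to 45.0° so that both views stay aligned.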
  • The zoom control parameter of some real cameras (2) used in practice is encoded in hexadecimal format. Therefore, when the processing unit (3) receives an adjustment command (25) from the virtual camera (22) indicating the current zoom value of the virtual camera (22), the processing unit (3) transforms this value into a hexadecimal code before sending it to the real camera (2), and vice versa.
  • The specifications of the real cameras (2) on the market usually include a table relating a set of view angles to the corresponding hexadecimal values of the control parameters. For example, in the preferred embodiment, the hexadecimal values of the zoom control parameters range from 0 to 4000, representing viewing angles between 4.8° (maximum zoom) and 48° (no zoom).
  • The invention proposes the use of interpolation algorithms based on the specifications table of the real camera (2). For example, for the real camera (2) of the preferred embodiment of the invention, the following interpolation equation is proposed:
  • y = -0.2274x³ + 22.593x² - 949.94x + 18690, where y is the value of the zoom control parameter in hexadecimal and x is the current angle of view of the real camera (2).
  • The processing unit (3) uses this equation to calculate the corresponding hexadecimal code that is sent to the real camera (2).
  • In the inverse interpolation equation, x is the angle of view of the real camera (2) and y is the hexadecimal value of the zoom control parameter.
  • The processing unit (3) uses this inverse equation to calculate the corresponding view angle that is sent to the virtual camera (22) as the tuning command (26).
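The forward conversion can be sketched directly from the polynomial given above (the clamping to the documented 0-4000 hexadecimal range is an added safeguard, not part of the patent):

```python
def zoom_to_hex(angle_deg):
    """Evaluate the interpolation polynomial above for a view angle x
    (4.8 deg = maximum zoom, 48 deg = no zoom) and return the zoom
    control parameter as a hexadecimal string."""
    x = angle_deg
    y = -0.2274 * x**3 + 22.593 * x**2 - 949.94 * x + 18690
    y = min(max(round(y), 0x0000), 0x4000)  # clamp to the documented range
    return format(y, "04X")
```

At 48° (no zoom) the polynomial evaluates to roughly zero, and at 4.8° (maximum zoom) it gives a value near the top of the documented range, consistent with the specifications table described above.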
  • The invention applies to the cultural tourism sector, one of the key future areas for the creation and strengthening of cultural industries.
  • The Augmented Reality techniques proposed for accessing and understanding tourist and cultural content are highly visual and interactive forms of presentation. These technological approaches are therefore an added value, allowing visitors to experience the history associated with a real environment in a personalized way.
  • The reader can imagine the view of a city from a nearby hill.
  • The invention, placed at the top of the hill, allows a "visit" to some of the tourist attractions and elements of the environment, and provides information about each of these attractions. For example, the user can choose a cathedral or an island in the middle of the bay and receive additional multimedia information.
  • The embodiments of the invention are not limited to tourist environments, but can be extended to a wide variety of recreational and cultural experiences. Similar scenarios can be described in situations such as cultural institutions, exhibition halls, nature tourism, fairs or any other setting containing objects and resources that can be augmented with computer graphics or additional multimedia content.
  • A third example may be the augmentation of objects shown at trade fairs.
  • Machine tools, for instance, are usually shown at large fairs.
  • One of the main problems for manufacturers is that they cannot show all the functionalities of the machines at the fair.
  • The invention can provide more information about the components and the actual operation of the machines.
  • Finally, the reader can consider the case of hikers who want to know more about the landscapes, flora and fauna while walking through the countryside.
  • The invention can provide additional information at the top of the mountains, such as the names of the different peaks in the surroundings, the access routes or the diversity of fauna and flora.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an interactive display system (1) based on augmented reality technologies for tourism and entertainment applications, which allows a user (13) to view a real image (6) enlarged with certain relevant information. The system (1) comprises a real camera (2), a processing unit (3), a database (4), a display device (8) and a tracking system (9). The processing unit (3) converts three-dimensional graphic objects (5) stored in the database (4) into two-dimensional virtual representations (5') according to the position of the real camera (2) calculated by the tracking system (9), and then aligns the two-dimensional virtual representations (5') with the real image (6), thus generating the enlarged image (6'). The system (1) makes it possible to zoom in on the enlarged image (6'), performing a simultaneous and precise zoom on the real image (6) and the two-dimensional virtual representations (5').
PCT/ES2007/000645 2006-11-16 2007-11-13 Système et procédé de visualisation d'une image agrandie par application de techniques de réalité augmentée WO2008059086A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ES200602922A ES2300204B1 (es) 2006-11-16 2006-11-16 Sistema y metodo para la visualizacion de una imagen aumentada aplicando tecnicas de realidad aumentada.
ES2006002922 2006-11-16

Publications (1)

Publication Number Publication Date
WO2008059086A1 true WO2008059086A1 (fr) 2008-05-22

Family

ID=39015743

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/ES2007/000645 WO2008059086A1 (fr) 2006-11-16 2007-11-13 Système et procédé de visualisation d'une image agrandie par application de techniques de réalité augmentée

Country Status (2)

Country Link
ES (1) ES2300204B1 (fr)
WO (1) WO2008059086A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2998680A1 (fr) * 2012-11-26 2014-05-30 Laurent Desombre Procede de navigation dans un environnement associe a un periscope interactif a realite virtuelle
EP3055833A4 (fr) * 2013-10-10 2017-06-14 Selverston, Aaron Appareil de visualisation tridimensionnelle (3d) interactive, pour l'extérieur
US20180250589A1 (en) * 2017-03-06 2018-09-06 Universal City Studios Llc Mixed reality viewer system and method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3104290B1 (fr) * 2019-12-05 2022-01-07 Airbus Defence & Space Sas Jumelles de visee de simulation, et systeme et procedes de simulation

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020082498A1 (en) * 2000-10-05 2002-06-27 Siemens Corporate Research, Inc. Intra-operative image-guided neurosurgery with augmented reality visualization
US20020163521A1 (en) * 1993-09-10 2002-11-07 John Ellenby Electro-optic vision systems
EP1435737A1 (fr) * 2002-12-30 2004-07-07 Abb Research Ltd. Augmented reality system and method
DE102004044718A1 (de) * 2004-09-10 2006-03-16 Volkswagen Ag Augmented reality authoring system
DE102004046144A1 (de) * 2004-09-23 2006-03-30 Volkswagen Ag Method and system for planning a production environment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6064398A (en) * 1993-09-10 2000-05-16 Geovector Corporation Electro-optic vision systems
US7050078B2 (en) * 2002-12-19 2006-05-23 Accenture Global Services Gmbh Arbitrary object tracking augmented reality applications
US7403220B2 (en) * 2004-08-23 2008-07-22 Gamecaster, Inc. Apparatus, methods, and systems for viewing and manipulating a virtual environment

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2998680A1 (fr) * 2012-11-26 2014-05-30 Laurent Desombre Method for navigating an environment associated with an interactive virtual reality periscope
EP3055833A4 (fr) * 2013-10-10 2017-06-14 Selverston, Aaron Outdoor, interactive 3D viewing apparatus
US20180250589A1 (en) * 2017-03-06 2018-09-06 Universal City Studios Llc Mixed reality viewer system and method
WO2018165041A1 (fr) * 2017-03-06 2018-09-13 Universal City Studios Llc Mixed reality viewer system and method
US10289194B2 (en) 2017-03-06 2019-05-14 Universal City Studios Llc Gameplay ride vehicle systems and methods
CN110382066A (zh) * 2017-03-06 2019-10-25 Universal City Studios Llc Mixed reality viewer system and method
KR20190124766A (ko) * 2017-03-06 2019-11-05 Universal City Studios Llc Mixed reality viewer system and method
US10528123B2 (en) 2017-03-06 2020-01-07 Universal City Studios Llc Augmented ride system and method
US10572000B2 (en) 2017-03-06 2020-02-25 Universal City Studios Llc Mixed reality viewer system and method
KR102145140B1 (ko) 2017-03-06 2020-08-18 Universal City Studios Llc Mixed reality viewer system and method
CN110382066B (zh) * 2017-03-06 2023-10-13 Universal City Studios Llc Mixed reality viewer system and method

Also Published As

Publication number Publication date
ES2300204A1 (es) 2008-06-01
ES2300204B1 (es) 2009-05-01

Similar Documents

Publication Publication Date Title
Schmalstieg et al. Augmented reality: principles and practice
US11854149B2 (en) Techniques for capturing and displaying partial motion in virtual or augmented reality scenes
US11024088B2 (en) Augmented and virtual reality
Fritz et al. Enhancing cultural tourism experiences with augmented reality technologies
US10225545B2 (en) Automated 3D photo booth
ES2688643T3 (es) Augmented reality apparatus and method
CN109584295A (zh) Method, apparatus and system for automatically annotating a target object in an image
US20080246759A1 (en) Automatic Scene Modeling for the 3D Camera and 3D Video
JP2008520052A5 (fr)
US20060114251A1 (en) Methods for simulating movement of a computer user through a remote environment
KR20140082610A (ko) Method and apparatus for playing augmented reality exhibition content using a portable terminal
CA2669409A1 (fr) Method for scripting transitions between scenes
JP2003264740A (ja) Observation telescope
US20130249792A1 (en) System and method for presenting images
US20210312887A1 (en) Systems, methods, and media for displaying interactive augmented reality presentations
Hoberman et al. Immersive training games for smartphone-based head mounted displays
US11532138B2 (en) Augmented reality (AR) imprinting methods and systems
ES2300204B1 (es) System and method for displaying an augmented image by applying augmented reality techniques.
US20030090487A1 (en) System and method for providing a virtual tour
Woletz Interfaces of immersive media
Cohen et al. A multiuser multiperspective stereographic QTVR browser complemented by java3D visualizer and emulator
Kim Lim et al. A low-cost method for generating panoramic views for a mobile virtual heritage application
DeHart Directing audience attention: cinematic composition in 360 natural history films
Tatzgern et al. Embedded virtual views for augmented reality navigation
Beckwith et al. Parallax: Dancing the Digital Space

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 07823050

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 07823050

Country of ref document: EP

Kind code of ref document: A1