EP1374019A2 - Browser system and method of using it - Google Patents

Browser system and method of using it

Info

Publication number
EP1374019A2
Authority
EP
European Patent Office
Prior art keywords
virtual
browser
browser apparatus
information
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP01929798A
Other languages
English (en)
French (fr)
Inventor
Lincoln Wallen
Richard Tonge
William Roger Osborn
Lawrence Christopher Angrave
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mathengine PLC
Original Assignee
Mathengine PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mathengine PLC filed Critical Mathengine PLC
Publication of EP1374019A2 publication Critical patent/EP1374019A2/de
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/32Operator till task planning
    • G05B2219/32014Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer

Definitions

  • the invention relates to a browser system for displaying spatially arranged information, and in particular to a hand held browser system.
  • the invention also relates to a method of using the browser system.
  • a wide variety of systems suitable for displaying information from a computer system are known.
  • the most widely available are the conventional cathode ray tube displays of a conventional computer monitor.
  • Smaller computer systems such as laptops and palmtops generally use liquid crystal displays and a number of other forms of display are known.
  • a particular form of display is a head-up display which is used in virtual reality systems.
  • the intention of a virtual reality system is to immerse the wearer of the system in a virtual world created in the computer.
  • the head-up display therefore includes a display in the form of goggles or a helmet fixed on the head of a user and displaying an image of a virtual world.
  • the virtual world is a virtual 3-dimensional space with a number of objects within it which can be viewed and frequently also manipulated by the virtual reality user.
  • Augmented reality combines an experience of reality with additional computer generated virtual reality.
  • Applications may include tele-medicine, architecture, entertainment, repair and construction.
  • Augmented reality differs from virtual reality; the former adds information to the real world whereas the latter is the simulation of a real or imagined environment which can be experienced visually in 3 dimensions and can provide an interactive experience.
  • augmented reality systems almost all employ either static displays on traditional computer monitors or projectors or alternatively head mounted virtual-reality type displays.
  • a camera is used together with complex image analysis software which matches the image captured on the camera from the real world with information contained in the virtual world.
  • this requires determining the position of one or more real reference objects in the real world in the field of view of the camera. Knowledge of the position of these reference objects with respect to the camera is then used to generate the virtual world.
  • one augmented reality system, due to Sony and known as the Navi-Cam system, reads bar-code information from the real world and displays information relating to the bar code as virtual information on a display.
  • barcodes can be provided on a library shelf keyed to information about the contents of that shelf.
  • the browser may be either a conventional virtual reality display together with a camera or a palmtop together with a camera. When the camera is directed at the bar codes in the real world, an image of the real world taken from the camera is combined with data referenced by the bar code.
  • augmented reality systems are known that combine a small display of information with information captured by a camera from the real world. See for example Feiner, S. and Shamash, A., "Hybrid user interface: Breeding virtually bigger interfaces for physically small computers", Proc. UIST '91 (ACM Symposium on User Interface Software and Technology), Hilton Head, November 11-13, 1991, pages 9-17.
  • Palmtops are widely known small computer systems. Another project includes tilt sensors in a palmtop. This can provide one-handed operation of the palmtop.
  • One application of such a tilt sensor is a game in which the object is to guide a ball through a maze. The maze and the ball are displayed on the screen of the palmtop and the tilt of the palmtop is used to generate the acceleration for the ball.
  • This game is known as "Maulg II", and is available for download from the Internet.
  • “City of News” is an immersive 3-dimensional web browser which represents a dynamically growing urban landscape of information.
  • the browser fetches and displays URLs so as to form virtual skyscrapers and alleys of text and images through which the user can 'fly'.
  • the control is by conventional computer controls, for example using a mouse.
  • a method of displaying to a user portions of a virtual information space having information arranged spatially in a virtual co-ordinate system comprising: providing a browser apparatus having a display and a position detector for determining the position of the browser apparatus, the browser apparatus being movable by the user to different real space positions including different positions relative to the user's eye; determining information characterising the position and orientation of the browser apparatus in real space and an inferred position of the user's eye; calculating a projected view, from the inferred position of the user's eye, of the spatially arranged information in the virtual information space projected onto the display, depending on the inferred position of the user's eye, the position and orientation of the display and a relationship between real space and virtual reality co-ordinate systems; and displaying on the display of the browser the calculated projected view of the virtual information space; whereby the displayed view of the virtual information space can be changed by moving the browser or changing the relative position or orientation of the browser and the inferred eye position.
  • the browser apparatus is moveable by the user to change the view of the virtual world; conveniently the browser apparatus may be hand-held.
  • the apparatus may be a computer such as a palmtop.
  • the browser apparatus may have mobile telephone connectivity and a display.
  • the method according to the invention alleviates a limitation of devices, especially low resolution devices, in viewing large amounts of graphical and textual information.
  • By moving the display further information may be viewed or selected.
  • movement of the display may be used to provide a scroll and zoom control of the information displayed.
  • the apparatus may have a network connection for connection to a network.
  • the network connection may be, for example, Bluetooth, infrared or a mobile telephone connection.
  • the apparatus may permit a networked device with a display screen to become a motion sensitive device for browsing virtual spaces in relation to a real physical world.
  • the data may include a plurality of 2-dimensional images.
  • the browser apparatus may be moved to change the displayed resolution of the 2-dimensional image.
  • the virtual information space may be a virtual 3-dimensional world including at least one, preferably many, 3-dimensional objects.
  • the method may further comprise the steps of controlling at least one navigation parameter by the position and/or orientation of the browser apparatus in space, and navigating through the virtual world by updating the position of the browser apparatus in the virtual world depending on the value of the said at least one navigation parameter.
  • the step of navigating through the virtual world may update the position of the browser apparatus by reading the velocity of the browser apparatus in the virtual world, updating the velocity depending on the value of the said at least one navigation parameter, updating the position of the browser apparatus using the updated velocity, and storing the updated velocity of the browser apparatus. In this way the appearance of inertia may be provided.
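  • an illustrative sketch of that velocity update follows (no code is given in the patent; the navigation parameter, gain and damping names are assumptions):

```python
# Hedged sketch of the inertial navigation update described above.
# `nav_param`, `gain` and `damping` are illustrative names, not from the patent.
import numpy as np

def navigate(position: np.ndarray, velocity: np.ndarray,
             nav_param: np.ndarray, dt: float,
             gain: float = 1.0, damping: float = 0.5):
    """Read the stored velocity, update it from the navigation parameter,
    integrate position using the updated velocity, and return both for storage."""
    velocity = velocity + gain * nav_param * dt          # control input accelerates
    velocity = velocity * max(0.0, 1.0 - damping * dt)   # damping gives the inertia feel
    position = position + velocity * dt                  # integrate the updated velocity
    return position, velocity
```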
  • the navigation may be abstract, for example for navigation through a virtual world of web pages.
  • the navigation parameter may be a direct simulation of a control.
  • for example, in a driving simulation, the orientation of the browser may determine the position of the steering wheel, and the position of the browser in the virtual world may then be updated using the position of the steering wheel and other parameters characterising the driving simulation.
  • the eye position may be calculated by making one of a number of assumptions, not limited to those presented here.
  • One possibility for calculating the eye position is to assume that its position is fixed regardless of the position or orientation of the screen.
  • the eye may be assumed to be in a fixed position in the frame of reference of the screen, i.e. a fixed position in front of the screen, taking into account the orientation of the screen.
  • the position may be taken to be a convenient measure of arm length in front of the screen, for example 0.3m to 1.0m away from the screen, and located on an axis perpendicular to the screen and passing through the centre of the screen.
  • the browser apparatus may be switchable between each of the above modes for inferring eye position.
  • the eye position of the user may be measured. This may be done, for example, by fixing a sensor to the head of the user and detecting the position of the sensor, or by including a camera in the browser apparatus and measuring the eye position by recording an image of the user's head, identifying the eyes in the image and calculating the user's eye position therefrom.
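  • a minimal sketch of the inferred-eye-position modes described above (the mode names, the default arm length and the NumPy representation are assumptions):

```python
from typing import Optional
import numpy as np

def inferred_eye_position(mode: str,
                          screen_centre: np.ndarray,
                          screen_normal: np.ndarray,
                          stored_world_eye: Optional[np.ndarray] = None,
                          arm_length: float = 0.5) -> np.ndarray:
    """Return an inferred eye position under one of the assumptions above."""
    if mode == "fixed_in_world":
        # eye assumed static in the real-world frame, regardless of screen pose
        if stored_world_eye is None:
            raise ValueError("fixed_in_world mode needs a stored eye position")
        return stored_world_eye
    if mode == "fixed_to_screen":
        # eye assumed a fixed arm's length in front of the screen centre,
        # on the axis perpendicular to the screen
        n = screen_normal / np.linalg.norm(screen_normal)
        return screen_centre + arm_length * n
    raise ValueError(f"unknown mode: {mode!r}")
```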
  • the browser apparatus may include a tilt sensor to determine the orientation.
  • the browser apparatus may include an accelerometer in order to use dead reckoning to calculate the fine movement of the browser apparatus. Movement on a large scale may be obtained using a number of systems, for example the Global Positioning System (GPS).
  • the method may also include the step of selecting at least one object in the virtual world, de-coupling the position of the object from the virtual world, and updating the position of the selected object not by using the virtual world model but as a function of the movement of the browser. This may be done by updating the position of the selected object or objects by keeping the selected object or objects fixed with respect to the browser apparatus. Alternatively, the object or objects may be fixed to be along the line of sight from the eye position through the browser apparatus.
  • the position along the line of sight may also be modified, for example by moving the browser along the line of sight or by using additional keys or controls on the browser.
  • the selected object or objects may be moved in virtual space by moving the browser apparatus in real space. This allows rearrangement of objects in the virtual 3-dimensional world by the user, as sketched in the example below.
  • the apparatus may also be used to author the virtual world, i.e. to create, delete and modify objects, in a similar way.
  • the object position can be calculated by the conventional rules determining object positions in the virtual world.
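  • as an illustration of the de-coupling just described, the "fixed with respect to the browser" variant might be sketched as follows (a hedged sketch; the pose-matrix representation and function names are assumptions, not the patent's):

```python
import numpy as np

def on_select(object_world: np.ndarray, browser_pose: np.ndarray) -> np.ndarray:
    """De-couple the object from the world model by recording its
    homogeneous position {x, y, z, 1} in the browser's frame of reference."""
    return np.linalg.inv(browser_pose) @ object_world

def while_selected(offset_in_browser: np.ndarray,
                   browser_pose: np.ndarray) -> np.ndarray:
    """Each frame, keep the object fixed with respect to the browser, so
    moving the browser in real space moves the object in virtual space."""
    return browser_pose @ offset_in_browser
```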
  • a method of displaying to a user portions of a virtual information space having objects arranged spatially in a virtual coordinate system using a browser apparatus movable by the user to different positions in real space, the browser apparatus having a display and a position detector for determining the position of the browser apparatus, the method comprising: displaying an image of the virtual world including at least one object on the browser apparatus; selecting an object in the virtual world; and moving the browser apparatus to move the selected object in the virtual world.
  • the system may be a palmtop, PDA or mobile telephone with a display.
  • the invention in another aspect, relates to a browser apparatus for displaying to a user portions of a virtual information space having information arranged spatially in a virtual co-ordinate system, wherein the browser apparatus is movable by the user to different positions in real space and different positions relative to the eye.
  • the browser apparatus comprises a display, a memory, and a position detector for determining the position of the browser apparatus.
  • the memory contains stored code for: determining the position of the browser apparatus in real space; determining the relationship between an inferred eye position in real space, the browser apparatus position in real space and the virtual co-ordinate system; calculating a projected view.
  • the browser apparatus may further comprise a transmitter and a receiver for remotely connecting the browser apparatus to a network.
  • the browser apparatus may additionally include a decompression unit for decompressing compressed data received from the network by the receiver, and/or a rendering engine for rendering image data so as to permit so-called thin-client rendering.
  • the invention in another aspect, relates to a network system, comprising a browser apparatus as described above with a transmitter and receiver for networking the browser apparatus, linked to a server network having a store containing data about the virtual world, and a transmitter and receiver for transmitting information between the browser apparatus and the server.
  • the server network may include a filter for selecting information relating to part of the virtual world and transmitting it to the browser.
  • Figure 2 shows a block diagram of the palmtop computer
  • Figure 3 is a flow chart illustrating use of the browser apparatus
  • Figure 4 illustrates the co-ordinate systems
  • Figure 5 is a flow chart of the projection method
  • Figure 6 is a block diagram illustrating the component processes carried out in the browser
  • Figure 7 illustrates a virtual world and a projection screen.
  • a browser apparatus in the form of a palmtop computer 1 has a body 6 supporting a liquid crystal display screen 3 on its upper surface.
  • a selection button 25 is provided, which can be used to select objects.
  • Figure 2 shows a block diagram of the palmtop and base station.
  • the palmtop also contains a camera 11 which can capture images of the surroundings of the palmtop and a transceiver 13 for communicating with a base station 15, in turn connected to a server 17.
  • the palmtop contains a CPU 19, a graphics processor 21 capable of carrying out 3-dimensional graphics calculations and a memory 23.
  • the palmtop is a conventional palmtop fitted with a transceiver for remotely connecting to a network; both such components are known and will not be further described.
  • the palmtop exists in the real, three-dimensional world.
  • the palmtop is used to display information about a virtual information space, namely a set of visually displayable data that is spatially arranged.
  • the data may relate to objects in the real world or may relate to a totally imaginary world, containing imaginary objects.
  • the properties of objects in the imaginary, virtual world are limited only by the need to calculate the properties of the objects.
  • the objects can be calculated to move in ways that correspond to the physical laws in the real world using physics modelling systems, or they may be more abstract.
  • Figure 3 is a block diagram illustrating the use of the browser apparatus.
  • the accelerometer is a small, commercially available device, for example an Analog Devices™ ADXL202.
  • the accelerometer outputs acceleration information which is twice integrated to obtain position information.
  • the calculated velocity is damped to zero with a time constant of several seconds to avoid errors in the velocity accumulating.
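  • a minimal sketch of this double integration with velocity damping, assuming simple Euler steps and an illustrative time constant:

```python
def dead_reckon_step(acc, vel, pos, dt, tau=3.0):
    """One dead-reckoning update: integrate acceleration to velocity and
    velocity to position, damping the velocity toward zero with time
    constant tau (a few seconds) so integration errors do not accumulate."""
    vel = (vel + acc * dt) * (1.0 - dt / tau)   # integrate, then damp exponentially
    pos = pos + vel * dt                        # second integration gives position
    return vel, pos
```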
  • the unit is switchable between three modes; it is determined in step 33 which mode is currently selected.
  • the modes relate to how the eye position is determined in the next step 35.
  • in a first mode, the camera 11 takes a picture (image) of the space in front of the palmtop.
  • the recorded image is then analysed to locate the user's eyes. Then, from the distance between the eyes and their position in the image, the location of the user's eyes is determined.
  • in a second mode, the position of the eyes is inferred.
  • the position is calculated at a fixed arm's-length distance directly in front of the centre of the screen, for example 1 metre, or another suitable value in the range 0.2 m to 1.2 m.
  • in a third mode, the eye is initially assumed to be in a fixed position in relation to the screen; thereafter, the eye is assumed to remain fixed in space.
  • the user may freely select between first, second and third modes as convenient. If the first mode becomes unavailable or fails to determine the eye position, the user may be presented with a choice between the second and third modes only.
  • in step 35 the virtual information space is projected onto the display screen.
  • the projection depends on the eye position, i.e. the browser display is used as a movable window onto the virtual world.
  • the projection of the virtual world onto the display uses as the projection point the virtual eye (camera) position. A straight line is drawn from each virtual world position to the eye position and those points that have such projection lines passing through the display are projected onto the position where the line passes through the display. Of course, such points will only be seen on the display if they are not obscured by objects closer to the display.
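  • geometrically, this can be sketched as a line-plane intersection (a hedged illustration; the vector names are assumptions):

```python
import numpy as np

def project_point(world_p, eye, screen_centre, sx, sy):
    """Intersect the line from the eye through a world point with the
    screen plane; return (u, v) in the screen's own axes, or None if
    the point cannot be projected onto the display."""
    world_p, eye = np.asarray(world_p), np.asarray(eye)
    n = np.cross(sx, sy)                         # screen normal
    d = world_p - eye                            # direction of the projection line
    denom = np.dot(d, n)
    if abs(denom) < 1e-12:
        return None                              # line parallel to the screen
    t = np.dot(np.asarray(screen_centre) - eye, n) / denom
    if t <= 0.0:
        return None                              # point behind the eye
    hit = eye + t * d                            # where the line crosses the plane
    rel = hit - np.asarray(screen_centre)
    return float(np.dot(rel, sx)), float(np.dot(rel, sy))
```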
  • Figure 4 illustrates the virtual camera, virtual screen and virtual origin together with the real world screen, eye and origin.
  • the transformations between virtual and real co-ordinate systems are homogeneous co-ordinate transforms.
  • the quantities involved are: e, the eye position; s, the screen position; s', the virtual screen position; v, the virtual origin with respect to the world; and c, the virtual camera position. All are, in general, functions of time.
  • the co-ordinate system in the virtual world is conventional in 3D graphics systems. Each object location or direction is given by a 4-vector {x,y,z,w}. The first three co-ordinates are the conventional position co-ordinates and w is either 0 or 1 (1 for positions, 0 for directions).
  • the position of objects in the virtual world may be defined by a plurality of 4-vectors {x,y,z,1} giving the positions of vertices or control points of an object.
  • Figure 5 illustrates the projection method of the projection step 37. The calculations may be performed either in real or virtual co-ordinates using the stored relationship therebetween to convert one to the other.
  • the next step is to calculate the eye position in the virtual co-ordinate system (the virtual camera in Fig. 4) and the screen position in the virtual co-ordinate system. This is determined from knowledge of the mapping between real and virtual worlds, which will depend on the application. Then the positions of objects in the virtual world are projected onto the screen by projecting individual 4-vectors of position to produce a "projective point" {x,y,z,w}, which is used in subsequent parts of the graphics pipeline for viewport culling, hidden-object tests, etc., as is conventional for such graphics pipelines.
  • the projection step is a simple geometric projection step that displays on the screen objects that would be seen from the eye position.
  • the projection is in fact carried out in quite a complex manner. It should not however be forgotten that all the projection step is doing is carrying out a simple geometric projection.
  • the co-ordinates are transformed so that the eye is at the origin and the screen is at a unit distance along the z axis.
  • This transform may be carried out by multiplying the 4-vectors of position by a matrix T.
  • let (ex, ey, ez) be the eye position, (sxx, sxy, sxz) the x direction of the screen, (syx, syy, syz) the y direction of the screen, and (qx, qy, qz) the centre of the screen.
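  • the matrix itself is not reproduced in this text; one standard reconstruction (an assumption, not the patent's own formula) first translates the eye to the origin, rotates the screen axes onto the co-ordinate axes using the screen normal (szx, szy, szz) = sx × sy, and then scales uniformly by the eye-to-screen distance d = (q − e)·sz so that the screen lies at unit distance along the z axis:

$$
T =
\begin{pmatrix}
1/d & 0 & 0 & 0\\
0 & 1/d & 0 & 0\\
0 & 0 & 1/d & 0\\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix}
s_{xx} & s_{xy} & s_{xz} & 0\\
s_{yx} & s_{yy} & s_{yz} & 0\\
s_{zx} & s_{zy} & s_{zz} & 0\\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix}
1 & 0 & 0 & -e_x\\
0 & 1 & 0 & -e_y\\
0 & 0 & 1 & -e_z\\
0 & 0 & 0 & 1
\end{pmatrix}
$$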
  • U transforms x and y co-ordinates so that points at the edge of the viewing window move to the edges of a unit square. It also transforms the z and w co-ordinates so that the w co-ordinate is used in the perspective division process which leaves the z co-ordinate free to take a value between 0 and 1 for hidden surface removal and z-buffering.
  • the projection matrix U is given by
  • n is the distance to the near focal point, scaled by the distance from the user's eye to the screen, and f is the distance to the far focal point, similarly calculated; b and h are the width (breadth) and height of the screen.
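  • the matrix is likewise not reproduced here; one standard perspective form consistent with the description (unit-square window, w used for perspective division, z mapped to the range 0 to 1, with b and h expressed in the scaled units in which the screen lies at unit distance) is, as a hedged reconstruction:

$$
U =
\begin{pmatrix}
2/b & 0 & 0 & 0\\
0 & 2/h & 0 & 0\\
0 & 0 & f/(f-n) & -fn/(f-n)\\
0 & 0 & 1 & 0
\end{pmatrix}
$$

after which division by the resulting w (equal to the pre-projection z) maps a point at depth n to z = 0 and a point at depth f to z = 1.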
  • multiplication by the matrix T followed by the matrix U may be implemented in a single step as multiplication by the matrix UT.
  • the 4-vectors giving positions of objects are pre-multiplied by UT to transform to co-ordinates for input to a graphics pipeline.
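  • in code, this pre-multiplication and the subsequent perspective division might be sketched as follows (the array layout and function name are assumptions):

```python
import numpy as np

def to_screen(vertices_h: np.ndarray, T: np.ndarray, U: np.ndarray) -> np.ndarray:
    """Transform homogeneous vertices (N x 4, w = 1) by the combined
    matrix UT, then divide by w as a conventional pipeline would."""
    UT = U @ T                         # single combined transform
    clip = vertices_h @ UT.T           # projective points {x, y, z, w}
    return clip[:, :3] / clip[:, 3:4]  # x, y on the unit square; z in [0, 1]
```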
  • the remaining graphics processing step may be carried out in a conventional 3D rendering system with or without hardware support.
  • the result is a signal capable of driving the display 3.
  • in step 37 the signal drives the display to display the projected image of the virtual world.
  • Figure 6 illustrates schematically the various component processes of the method and where they take place. The dotted line represents the steps that take place in the browser 1.
  • the user inputs mode data which is passed to the viewing mode interface 63.
  • Information 65 to calculate the position and orientation of the browser is fed into the viewing mode interface together with information 67 characterising the user's eye position.
  • the viewing mode interface process then passes information to the calculation process 69 that calculates the projection matrix P and passes the matrix P in turn to the graphics process 71.
  • the user control process 61 also passes information to the settings process 73. This transmits settings to the server. Information regarding the position, orientation and eye position is likewise transmitted to the server from the viewing mode interface 63.
  • in the server process 75, which takes place in the server 17, data relating to the virtual world is retrieved and filtered to extract information relevant to the current browser position. This information is then transmitted back to the browser, where it is fed into an internal object database 77 containing information regarding the vertices of the objects, their texture and other information such as lighting as required. The information from the database is then fed through the graphics process 71, where the screen image is calculated and fed 79 to the screen of the browser. As can be seen, graphics rendering takes place in the browser apparatus. However, the information about the 3-dimensional world is stored on the server 17 and sent through base station 15 and transceiver 13 to the palmtop 1.
  • Notable features present in the device illustrated in Figure 6 include the 3-dimensional rendering process, part of the graphics process 71.
  • the invention is not limited to the above specific example.
  • Devices according to the invention offer a new way to view a virtual world: the screen may become a window onto the world (or a magnifying glass), with the position and orientation of the screen moved by hand, independently of the user's eye position (unlike stereoscopic glasses), and the contents of the virtual world as seen by the user projected onto the screen according to the user's eye position and the screen position and orientation. The field of view is, however, much smaller than with a total-immersion display.
  • a virtual world on a small, hand-held device such as an organiser or mobile phone offers up huge possibilities.
  • the projective, magnifying possibilities give an easy way to view and navigate web content on a small screen.
  • a complete cyberspace may be set up with geographically relevant content.
  • Exhibition halls may provide directions or even guiding avatars in this cyberspace.
  • People may leave each other messages: cyber-graffiti viewable only by those you intend to see it, or by the whole world if you are more artistically minded.
  • the device not only offers passive viewing of a virtual world but also manipulation of that world, a world that becomes even richer if physically simulated. And most obviously, games can be played in the cyberspace.
  • the multi-dimensional information space for display on the browser apparatus can be a virtual world containing objects, such as a virtual world used in playing a game.
  • the objects may represent fixed elements of the virtual world, such as walls, floors or the like.
  • Other objects can represent movable objects such as balls, creatures, furniture, or indeed any object at all that the designer of the virtual world wishes to include.
  • the information space can be a more abstract virtual space with a number of information containing objects spatially arranged.
  • the information containing objects may be web pages, database pages or the like, or more abstractly arranged information such as a network diagram of a networked computer system, telephone network or similar.
  • the browser apparatus may be able to display any spatially arranged information. It is not even necessary that the information is arranged three-dimensionally; a four, five or even higher dimensional space may be provided, though a three-dimensional information space may be much easier to navigate intuitively.
  • the position of the browser apparatus may be detected using any of a number of methods.
  • the position detector may include, merely by way of example, an accelerometer.
  • the acceleration may be numerically integrated once over time to give velocity information and again to give position information; such integration may be carried out using simple numerical methods. Such an approach amounts to dead reckoning.
  • the position and velocity information may suffer from systematic drift, i.e. increases or decreases from the expected value, for example because of inaccuracies in the integration.
  • This may be corrected by subtracting a reference velocity or reference acceleration, determined for example from an average velocity and acceleration over an extended period. It may also be possible to make assumptions about typical usage to allow drift correction, for example by damping the velocity to zero with a time constant longer than the time for a typical gesture.
  • Alternative position detection methods include using camera or ultrasound systems, for example by triangulating based on fixed reference points.
  • Bluetooth, a local communications system, might also be used, by triangulating the position of the browser based on the distance to three transmitters arranged at fixed positions.
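  • for instance, if the browser and the three transmitters are taken to lie in a common plane, the position can be recovered from the three distances by solving a small linear system (a hedged sketch; the names and the planar assumption are illustrative, and a real system would also filter measurement noise):

```python
import numpy as np

def trilaterate_2d(p1, p2, p3, d1, d2, d3):
    """Planar position from distances to three fixed transmitters.
    Subtracting pairs of circle equations yields a 2x2 linear system."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    A = 2.0 * np.array([p2 - p1, p3 - p1])
    b = np.array([
        d1**2 - d2**2 + p2 @ p2 - p1 @ p1,
        d1**2 - d3**2 + p3 @ p3 - p1 @ p1,
    ])
    return np.linalg.solve(A, b)  # fails if the transmitters are collinear
```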
  • Coarse position information may be obtained by GPS, radio triangulation or simulation, all of which are known. The skilled person will appreciate that there are many other ways of measuring or estimating the position of the browser apparatus.
  • the orientation of the browser apparatus with respect to the real world may be obtained using the same techniques as finding the position of the browser apparatus, for example by finding the position of three different fixed, known points with respect to the browser apparatus.
  • the orientation may be measured in the browser apparatus, for example by providing a tilt sensor in the browser apparatus .
  • position information from external sources (e.g. GPS, Bluetooth) is intermittent and noisy.
  • the palmtop must therefore be able to intelligently calculate its current position and orientation using onboard inertial sensors and attitude sensors. It may however be possible to use just external sources, depending on the application. It may be useful to combine the large-scale information from external sources with the small-scale information from sensors.
  • Eye position may also be obtained in a number of ways.
  • One approach is to infer the eye position based on one of a number of assumptions.
  • One assumption is that the eye position is static in the real-world reference frame, which may be an accurate assumption if the user is seated.
  • the device can then be tilted and moved relative to the user's head to show different information.
  • a second assumption is that the user's head is static in the frame of reference of the mobile device, i.e. always at the same position in front of the screen even as the screen is tilted.
  • the browser apparatus may be fixed in either of the above modes or it may be switchable between them.
  • the positions of the user's eyes may be measured.
  • the user may wear a small transmitter on his head and the direction of the transmitter from the browser apparatus determined.
  • the browser apparatus may include a camera and the camera may record an image of the user, the eyes or other portion or whole of the head of the user determined by image processing and the location of the eyes thus determined.
  • the images in the virtual information space may contain text; an example of such images is web pages. This creates difficulties because the resolution of the text is essentially continuously variable.
  • a text rendering scheme to produce such text is accordingly required; one example of an existing scheme is the TexFont package.
  • the plane of the image may be fixed relative to the device so that it is face on, whilst the distance from the device may be varied by moving the device, thus achieving intuitive magnification.
  • the scale of real and virtual worlds may remain constant but the orientation and the point in the real world corresponding to the origin of the virtual world may vary (v no longer fixed).
  • Such an application may be suitable in games applications.
  • the scale is also changeable, perhaps under user control.
  • Such an approach may be particularly suitable for navigating through arbitrary information spaces that are not in the form of a virtual world.
  • the view from the eye position on the browser apparatus may be calculated.
  • the information used is the position of the user's eye, the position and orientation of the display on the browser apparatus and information about the virtual information space.
  • the projection of the virtual world onto a screen may be carried out in a number of ways, for example by calculation in the CPU of a browser apparatus or in a specific graphics processor. This projection approach is well suited to 3D graphics pipelines; dedicated hardware is accordingly available for carrying out the projection. This is highly convenient for implementing a browser apparatus according to the invention in a cost-effective way.
  • a 3D graphics pipeline performs calculations to take 3D geometric data and draw on a 2D screen the result of viewing objects, also known as primitives, from a certain viewpoint.
  • Figure 7 illustrates the viewpoint, screen and primitives.
  • the calculations transform objects according to a reference frame hierarchy, light the primitives according to their surface properties and virtual lights, project the points from the space onto the plane, cull invisible objects outside the field of view, remove hidden surfaces, and texture the polygons.
  • the objects may be manipulated such that movement of the screen moves the picked object.
  • a handheld browser apparatus may not be able to hold all the information about the space.
  • An appropriate solution is a networked architecture where some information is delivered to the browser apparatus, for example through the mobile telephone network.
  • the browser apparatus may accordingly include a mobile telephone transceiver for connecting to the mobile telephone network.
  • Other possibilities to connect the browser apparatus to a base station include infra-red ports, wire, radio or the like.
  • the camera position may be predicted ahead of time from the current position and motion. This allows faster display of three dimensional information because it allows faster updates.
  • the simplest approach is to use dead-reckoning to predict the camera position and display position for the next display updates.
  • High-latency data paths, such as loading large datasets from disk or across a network, can then be started before their results are required by the display system.
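  • a minimal sketch of this predict-and-prefetch idea (the latency value and the fetch callback are illustrative assumptions):

```python
def prefetch_for_predicted_view(position, velocity, latency, start_fetch):
    """Dead-reckon where the camera will be once a high-latency load
    completes, and start fetching data for that region now."""
    predicted = position + velocity * latency  # simple dead-reckoning prediction
    start_fetch(predicted)                     # begin the asynchronous load early
```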
  • the database in the server 17 may accordingly perform efficient, large-scale geographical culling of information in the virtual space to minimise the bandwidth required between the database and the handheld device.
  • the palmtop may manage the information passed to it by the cyber-database intelligently, storing any texture and polygonal information likely to be needed again soon.
  • Embodiments of the new system provide advantages over prior art approaches.
  • embodiments of the invention permit the user to interact with a number of game components by moving the display, and to explore a wider viewpoint by moving the display.
  • embodiments of the invention provide a movable screen, very useful for exploring particular areas of a web page.
  • Prior augmented reality systems using head mounted displays have the limitation that the virtual and real worlds remain congruent. However, embodiments of the invention permit additional linear or angular motion of the virtual display, following a path through the virtual environment.
  • Hyperlinks may be selected using the picking procedure described above, i.e. a line-of-sight selection.
  • a 3D object may be loaded into the virtual scene. By moving the handheld device and his own viewpoint, the user can explore the entirety of the object as if it were fixed in real space. If the object is large, it may be manipulated itself, with additional scaling operations if needed, so that the user does not have to move around it.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
EP01929798A 2000-05-13 2001-05-11 Browser system and method of using it Withdrawn EP1374019A2 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GBGB0011455.3A GB0011455D0 (en) 2000-05-13 2000-05-13 Browser system and method for using it
GB0011455 2000-05-13
PCT/GB2001/002066 WO2001088679A2 (en) 2000-05-13 2001-05-11 Browser system and method of using it

Publications (1)

Publication Number Publication Date
EP1374019A2 true EP1374019A2 (de) 2004-01-02

Family

ID=9891451

Family Applications (1)

Application Number Title Priority Date Filing Date
EP01929798A Withdrawn EP1374019A2 (de) Browser system and method of using it

Country Status (5)

Country Link
EP (1) EP1374019A2 (de)
JP (1) JP2003533815A (de)
AU (1) AU2001256479A1 (de)
GB (1) GB0011455D0 (de)
WO (1) WO2001088679A2 (de)

Families Citing this family (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6711547B1 (en) * 2000-02-28 2004-03-23 Jason Corey Glover Handheld medical processing device storing patient records, prescriptions and x-rays used by physicians
GB2387504B (en) * 2002-04-12 2005-03-16 Motorola Inc Method and system of managing a user interface of a communication device
FI117217B (fi) * 2003-10-01 2006-07-31 Nokia Corp Method and system for controlling a user interface, a corresponding device, and software means for implementing the method
DE102004061842B4 (de) * 2003-12-22 2017-03-02 Metaio Gmbh Tracking system for mobile applications
US7961909B2 (en) 2006-03-08 2011-06-14 Electronic Scripting Products, Inc. Computer interface employing a manipulated object with absolute pose detection component and a display
JP4722052B2 (ja) * 2004-10-15 2011-07-13 SoftBank Mobile Corp. Cooperative operation method and communication terminal device
USD558758S1 (en) 2007-01-05 2008-01-01 Apple Inc. Electronic device
USD580387S1 (en) 2007-01-05 2008-11-11 Apple Inc. Electronic device
USD898736S1 (en) 2007-01-05 2020-10-13 Apple Inc. Electronic device
USD957385S1 (en) 2007-08-31 2022-07-12 Apple Inc. Electronic device
USD602486S1 (en) 2007-08-31 2009-10-20 Apple Inc. Electronic device
USD1033379S1 (en) 2008-04-07 2024-07-02 Apple Inc. Electronic device
USD602016S1 (en) 2008-04-07 2009-10-13 Apple Inc. Electronic device
USD615083S1 (en) 2008-04-07 2010-05-04 Apple Inc. Electronic device
USD602015S1 (en) 2008-04-07 2009-10-13 Apple Inc. Electronic device
KR100998182B1 (ko) * 2008-08-21 2010-12-03 (주)미래컴퍼니 Three-dimensional display system for a surgical robot and method for controlling same
USD602017S1 (en) 2008-09-05 2009-10-13 Apple Inc. Electronic device
JP5087532B2 (ja) * 2008-12-05 2012-12-05 Sony Mobile Communications Inc. Terminal device, display control method, and display control program
FR2941805A1 (fr) * 2009-02-02 2010-08-06 Laurent Philippe Nanot Device for interactive virtual guided tours of historical sites/events or construction projects, and training scenarios
USD627777S1 (en) 2010-01-06 2010-11-23 Apple Inc. Portable display device
USD637596S1 (en) 2010-01-06 2011-05-10 Apple Inc. Portable display device
USD633908S1 (en) 2010-04-19 2011-03-08 Apple Inc. Electronic device
USD627778S1 (en) 2010-04-19 2010-11-23 Apple Inc. Electronic device
USD864949S1 (en) 2010-04-19 2019-10-29 Apple Inc. Electronic device
USD681630S1 (en) 2010-07-08 2013-05-07 Apple Inc. Portable display device with graphical user interface
USD683730S1 (en) 2010-07-08 2013-06-04 Apple Inc. Portable display device with graphical user interface
USD642563S1 (en) 2010-08-16 2011-08-02 Apple Inc. Electronic device
USD680109S1 (en) 2010-09-01 2013-04-16 Apple Inc. Electronic device with graphical user interface
USD671114S1 (en) 2011-02-25 2012-11-20 Apple Inc. Portable display device with cover
USD670692S1 (en) 2011-01-07 2012-11-13 Apple Inc. Portable display device
USD669468S1 (en) 2011-01-07 2012-10-23 Apple Inc. Portable display device
US9285883B2 (en) 2011-03-01 2016-03-15 Qualcomm Incorporated System and method to display content based on viewing orientation
US8810598B2 (en) 2011-04-08 2014-08-19 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
JP2015501984A (ja) 2011-11-21 2015-01-19 Nant Holdings IP, LLC Subscription bill service, systems and methods
USD707223S1 (en) 2012-05-29 2014-06-17 Apple Inc. Electronic device
USD684571S1 (en) 2012-09-07 2013-06-18 Apple Inc. Electronic device
USD681632S1 (en) 2012-08-11 2013-05-07 Apple Inc. Electronic device
JP5519750B2 (ja) * 2012-09-11 2014-06-11 Olympus Imaging Corp. Image viewing system, image viewing method, image viewing server, and terminal device
JP5519751B2 (ja) * 2012-09-11 2014-06-11 Olympus Imaging Corp. Image viewing system, image viewing method, image viewing server, and terminal device
USD681032S1 (en) 2012-09-11 2013-04-30 Apple Inc. Electronic device
DE102013205593A1 (de) * 2013-03-28 2014-10-02 Hilti Aktiengesellschaft Method and device for displaying objects and object data of a construction plan
US9582516B2 (en) 2013-10-17 2017-02-28 Nant Holdings Ip, Llc Wide area augmented reality location-based services
USD845294S1 (en) 2014-05-05 2019-04-09 Apple Inc. Housing for an electronic device with surface ornamentation
GB2528319A (en) * 2014-07-18 2016-01-20 Ibm Device display perspective adjustment
AU361808S (en) 2014-10-15 2015-05-14 Apple Inc Electronic device
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions
TWI687842B (zh) * 2017-12-29 2020-03-11 Acer Inc. Method for browsing virtual reality web page content and electronic device using the same
USD940127S1 (en) 2018-04-23 2022-01-04 Apple Inc. Electronic device
USD924868S1 (en) 2018-04-23 2021-07-13 Apple Inc. Electronic device
USD974352S1 (en) 2019-11-22 2023-01-03 Apple Inc. Electronic device
CN111459266A (zh) * 2020-03-02 2020-07-28 Chongqing IQIYI Intelligent Technology Co., Ltd. Method and device for operating a 2D application in a 3D virtual reality scene

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6009210A (en) * 1997-03-05 1999-12-28 Digital Equipment Corporation Hands-free interface to a virtual reality environment using head tracking
EP0874303B1 (de) * 1997-04-25 2002-09-25 Texas Instruments France Videoanzeigesystem zum Darstellen einer virtuellen dreidimensionalen Bildanzeige
AU2211799A (en) * 1998-01-06 1999-07-26 Video Mouse Group, The Human motion following computer mouse and game controller

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0188679A3 *

Also Published As

Publication number Publication date
JP2003533815A (ja) 2003-11-11
WO2001088679A3 (en) 2003-10-09
WO2001088679A2 (en) 2001-11-22
GB0011455D0 (en) 2000-06-28
AU2001256479A1 (en) 2001-11-26

Similar Documents

Publication Publication Date Title
EP1374019A2 (de) 2004-01-02 Browser system and method of using it
US10928974B1 (en) System and method for facilitating user interaction with a three-dimensional virtual environment in response to user input into a control device having a graphical interface
KR101823182B1 (ko) Three-dimensional user interface effects on a display by using properties of motion
Mine Virtual environment interaction techniques
US10521028B2 (en) System and method for facilitating virtual interactions with a three-dimensional virtual environment in response to sensor input into a control device having sensors
US7382374B2 (en) Computerized method and computer system for positioning a pointer
US6078329A (en) Virtual object display apparatus and method employing viewpoint updating for realistic movement display in virtual reality
JP4115188B2 (ja) Virtual space rendering and display device
US10330931B2 (en) Space carving based on human physical data
EP2105905A2 (de) Image generation apparatus
Piekarski Interactive 3d modelling in outdoor augmented reality worlds
CN112313605A (zh) Placement and manipulation of objects in an augmented reality environment
US20180276900A1 (en) System and method for modifying virtual objects in a virtual environment in response to user interactions
JPH08190640A (ja) Information display method and information providing system
EP1821258B1 (de) Method and apparatus for automatic animation of 3D graphics scenes for enhanced 3D visualisation
Cho et al. Multi-scale 7DOF view adjustment
JP4493082B2 (ja) CG presentation device and program therefor, and CG display system
EP1720090B1 (de) Computerized method and computer system for positioning a pointer
Wu et al. Quantifiable fine-grain occlusion removal assistance for efficient vr exploration
Asiminidis Augmented and Virtual Reality: Extensive Review
Wartell et al. Interaction volume management in a multi-scale virtual environment
Huelves 3D Magic Lens Implementation using a Handheld Device in a 3D Virtual Environment
Kruszynski et al. Tangible Interaction for 3D Widget Manipulation in Virtual Environments.
Barange et al. Tabletop Interactive Camera Control
Andersson VR Technology, TNM053

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20021125

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20041201