WO2001013337A1 - Virtual model manipulation - Google Patents

Virtual model manipulation

Info

Publication number
WO2001013337A1
Authority
WO
WIPO (PCT)
Prior art keywords
display device
visual display
screen
virtual model
image
Prior art date
Application number
PCT/NZ2000/000145
Other languages
French (fr)
Inventor
John Andrew Wilson
Original Assignee
Soft Tech (Nz) Limited
Priority date
Filing date
Publication date
Application filed by Soft Tech (Nz) Limited filed Critical Soft Tech (Nz) Limited
Priority to AU63267/00A priority Critical patent/AU6326700A/en
Priority to NZ516991A priority patent/NZ516991A/en
Publication of WO2001013337A1 publication Critical patent/WO2001013337A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1601Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/161Indexing scheme relating to constructional details of the monitor
    • G06F2200/1612Flat panel monitor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer


Abstract

This invention relates to a method of operating a virtual model manipulation system, wherein the virtual model manipulation system includes a moveable visual display device, and a means to directly provide the user with the choice of reference point, the method of operating the virtual model manipulation system characterised by the step of displaying different orientations and/or views of a 3-dimensional model on the visual display device, wherein the orientation and/or views displayed are associated with the orientation and/or position of the visual display device.

Description

VIRTUAL MODEL MANIPULATION
TECHNICAL FIELD
This invention relates to virtual model manipulation.
Reference throughout this specification should be made to use of the present invention in relation to virtual model manipulation for modelling purposes. It should be appreciated however that the principles of the present invention can be applied to other applications than this.
BACKGROUND ART
There are a number of 3-dimensional input devices for use in image manipulation and modelling. A useful summary of these devices is given in a paper dated 2 November 1993 produced by Chris Hand at the Department of Computing Science, De Montfort University, The Gateway, Leicester, United Kingdom. This article can be accessed by way of reference on the website www.dcs.ed.ac.uk.
To summarise, there are a number of problems associated with virtual model manipulation. A major problem is that an operator cannot physically touch a 3D model.
The closest technology enabling operators to do this is the use of a virtual reality helmet in combination with a data glove. There are serious problems however, with the use of a VR helmet.
Firstly, there is a high degree of latency which can lead to inaccuracies and overcorrection.
Also, VR helmets have still not perfected stereoscopic vision and the users frequently suffer from headaches. Another problem with helmets is that there is no easy way to select a reference point from which to view a model.
Further, to move towards a desired view of the model requires the user to move his/her head appropriately. In a work environment particularly, this is an unpleasant sensation.
Yet another problem with helmets is that they are a physical encumbrance and require the user to be totally immersed. While this may be an advantage for entertainment applications, this is not desirable for work applications where the user also prefers to have external interactions with his/her environment.
Two-dimensional digitisers are devices which can enable spatial data to be readily input into a computer system, but the digitisers themselves cannot be used for the actual manipulation of the data once in the system.
A 3-dimensional digitiser has recently been produced which can measure the position and orientation of a stylus tip in 3-dimensional space. This is known as a 3DRAW™ digitising system, details about which can be found at the website www.yr.web.com/WEB/PRODUCTS/TRACKER.HTM. A problem with this system is that although it allows 3-dimensional models to be accurately recorded within a computer system, it does not allow for ready manipulation of these models once entered.
A product sold under the trade mark Microscribe 3D by Immersion as advertised on website www.yrweb.com/WEB/PRODUCTS/DIGITISE.HTM is another digitiser.
A problem with this device is the difficulty the user has in accessing a particular point of reference that is illustrated on the computer screen. The user has to either create zoom windows or enter co-ordinates. There is no intuitive or fast means by which this can be achieved. This is of particular disadvantage when the user wants to use the model with organic (and therefore relatively unstructured) subject matter. For example, a medical person may wish to use a modelling system to follow a vein or assess the position of a tumour.
It is an object of the present invention to address the foregoing problems or at least to provide the public with a useful choice.
Further aspects and advantages of the present invention will become apparent from the ensuing description which is given by way of example only.
DISCLOSURE OF INVENTION
According to one aspect of the present invention there is provided a virtual model manipulation system including
a moveable visual display device, and
a means to directly provide the user with a choice of reference point,
the virtual model manipulation system characterised in that
the moveable visual display device displays different orientations and/or views of a 3-dimensional model associated with the orientation and/or position of the visual display device.
According to another aspect of the present invention there is provided a method of operating a virtual model manipulation system wherein the virtual model manipulation system includes
a moveable visual display device,
a means to directly provide the user with a choice of reference point,
wherein the method of operation is characterised by displaying different orientations and/or views of a 3-dimensional model on the visual display device, wherein the orientation and/or views displayed are associated with the orientation and/or position of visual display device.
According to a further aspect of the present invention there is provided media carrying instructions for the operation of a virtual model manipulation system as described above. The media may be any suitable media and can include compact disc, microchip, hard drive and so forth.
The visual display device may be any visual device that can represent a 3- dimensional model. This may be a hologram, CRT screen or any other suitable device.
The visual display device will now be referred to as a screen.
In preferred embodiments the screen is capable of appearing substantially transparent. With present day technology it is envisaged that the screen will use a liquid crystal display (LCD).
In more advanced embodiments of the present invention, the screen may not necessarily be a flat screen. For example, the screen could be curved giving a greater 3-dimensional quality to any image displayed on the screen. In other embodiments, the screen may be flexible and can be manipulated to follow generally the contours of the model being displayed.
The screen may move by a number of means.
In one embodiment of the present invention the screen is attached to a multi-jointed arm which can give a number of degrees of freedom - four at a minimum. Preferably there is provided six degrees of freedom with the screen attached to the arm being able to move and rotate on three separate axes.
Some embodiments may include a handle attached to the screen allowing the user to readily move the screen. In other embodiments of the present invention the screen may be independent of any physical encumbrance such as an arm. For example, the screen may transmit its position, or have its position remotely sensed by the model manipulation system. For example, triangulation methods may be used, utilising such methods as magnetic, acoustic, visual or GPS.
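As an illustrative sketch only (not part of the specification), the remote position sensing mentioned above — whether magnetic, acoustic or visual — reduces in the planar case to trilateration from known beacon positions; all names and the beacon layout here are assumptions:

```python
def trilaterate_2d(beacons, distances):
    """Estimate a 2-D position from distances to three fixed beacons.

    Subtracting the circle equation of beacon 0 from those of beacons
    1 and 2 yields a linear system A [x, y]^T = b, solved by Cramer's
    rule. Beacons must not be collinear (det would be zero).
    """
    (x0, y0), (x1, y1), (x2, y2) = beacons
    r0, r1, r2 = distances
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - a12 * b2) / det
    y = (a11 * b2 - b1 * a21) / det
    return x, y
```

A real implementation would use more beacons and a least-squares fit to tolerate measurement noise, and would extend the same algebra to three dimensions.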
There may be a number of ways by which the reference point can be chosen by the user. Traditional systems are cumbersome and rely on the creation of zoom windows or the entering of co-ordinates. Thus, in preferred embodiments of the present invention the choice of reference point is made intuitively: the screen is a touch screen, and touching the screen (whether by finger, stylus, laser pointer or some other means) provides the user with a reference point.
Once the user has a reference point, the user can manipulate the image on the screen. For example, the user may touch the screen at a particular point and then select the reference point touched in order to manipulate the image.
In some embodiments the selection and possibly manipulation may be via function buttons attached to the screen. Alternatively the stylus may have function buttons, or there may be a separate input device for this (e.g. keyboard).
For example, the image may be that of a person's face. A touch of the stylus on the end of the nose sets up a direct reference point. Selecting the reference point via a function button and then moving the stylus can then cause the nose to change shape on the screen as manipulated by the user. It can be seen that this is an intuitive way to approach model manipulation.
The present invention is characterised in that the screen displays different orientations of the model which are associated with the actual orientation and/or position of the screen itself. This provides a considerable amount of intuitiveness when dealing with a 3-dimensional model.
For example, the screen may be moved forward and the display on the screen may show the model being passed through at the distance and angle corresponding to the position of the screen.
As an example, the 3-dimensional image may be a face. Initially the screen may display the face as if the face is at a distance from the user. The user may then move the screen forward and it appears on the screen that the screen is approaching the face. Moving the screen further forward still may provide an image on the screen which looks like the screen is actually passing through the face. A cross-sectional outline on the screen would represent model objects which are virtually bisected by the screen.
Tilting the screen from one plane to another can cause the image on the screen to likewise tilt giving the impression that the model is in reality just behind the screen and the screen is merely a window through which the user can look at the model.
This feature of the present invention overcomes a number of the problems associated with the prior art and in addition provides new features in the field of virtual model manipulation.
The present invention obviates the need for VR helmets and their problems such as a high degree of latency, poor stereoscopic vision, the requirement to move the user's head to select a reference point and the requirement to be totally immersed while in a work environment.
The present invention provides the user with a fast and intuitive means by which to locate and view a 3-dimensional model by its feature of moving a visual screen and having the display on the screen give the impression that the screen is moving through and/or around the model. For example, the 3-dimensional model may be a medical model, scanned in from a patient. With the present invention, a surgeon can follow various organic pathways through the model, for example veins, and get a greater appreciation of the linkages of component parts within that model.
Another use of the present invention may be to allow contractors to view various areas in a proposed building plan. For example, the 3-dimensional model may be of a building including various components such as walls, structural columns, pipes, wiring, joists and the like. A particular contractor may choose its specialty (say pipes) and track the positioning of the pipes within the building using the present invention.
Alternatively, the contractor may use the present invention to actually place the pipes within the model of the building, this procedure being enhanced by the ability to view the building from all angles.
The physical interface provided by the present invention from the user to the computer model can also be used in the construction of a 3-dimensional model. For example, the user may wish to construct an organic model such as a tree. The user may place the screen on a substantially horizontal plane, draw, say, a circle and then move the screen upwards causing the circle to form into a cylinder using the image manipulation tools (such as function buttons).
Next the user may wish to create a branch. This can involve the user repositioning the screen at an angle to the cylinder already formed and again creating a series of circles with differing diameters at an angle from the main cylinder or trunk - thus resulting in a tapering branch.
It can be seen that this is a much more intuitive means of creating a 3-dimensional organic object than previous methods.
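A minimal sketch of the circle-to-cylinder (and tapering branch) extrusion described above might look as follows; the function, its parameters and the ring representation are all hypothetical:

```python
import math

def extrude_circle(center, radius_start, radius_end, height, segments=16):
    """Generate base and top rings of vertices for a tapered cylinder:
    a circle of radius_start at the base is swept upwards (following
    the screen's movement) to a circle of radius_end at 'height'.
    Equal radii give a trunk; a shrinking radius gives a branch."""
    cx, cy, cz = center
    rings = []
    for frac, r in ((0.0, radius_start), (1.0, radius_end)):
        ring = [(cx + r * math.cos(2 * math.pi * i / segments),
                 cy + r * math.sin(2 * math.pi * i / segments),
                 cz + frac * height)
                for i in range(segments)]
        rings.append(ring)
    return rings
```

In an interactive system the 'height' and axis would be read continuously from the screen's tracked pose rather than given as arguments, and intermediate rings would be emitted as the screen moves.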
The present invention can also be used to intuitively sculpt a 3-dimensional model. For example, the original image may be a solid block. The user may move the screen up to the solid block and then use a model manipulation stylus to carve off pieces from that block to form a sculpture.
It should be appreciated that the present invention can also be used in the entertainment industry. For example, the imagery on the screen may be similar to that used in present day computer games. However, the screen may be used as an equivalent to a joystick allowing the user to effectively move through a scene and play the game.
BRIEF DESCRIPTION OF DRAWINGS
Further aspects of the present invention will now be described by way of example only and with reference to the accompanying drawing in which:
Figure 1 is a diagrammatic view of one possible embodiment of the present invention.
BEST MODES FOR CARRYING OUT THE INVENTION
Hardware generally indicated by arrow 1 associated with a virtual model manipulation system in accordance with the present invention is illustrated in Figure 1.
The hardware (1) includes a user interactive area (10) which includes a screen (2), handle (12) and function buttons (11).
The screen (2) is a transparent LCD screen. A stylus (3) attached to the screen (2) can be used to select reference points of images displayed on the screen (2). In other embodiments, the stylus need not be attached.
The screen (2) is attached to an arm generally indicated by arrow 4. The arm (4) has fully moveable ball joints (5 and 6) along with a pivot point (7). The ball joints (5, 6) and pivot points (7) provide the screen (2) with six degrees of freedom in its movement.
The handle (12) on the screen (2) can be readily grasped by the user to move the screen (2).
A base (8) ensures that the arm (4) is substantially stable when in use.
The screen (2) is electrically connected via the arm (4) to a central processing unit (9).
The central processing unit (CPU) (9) contains all of the software required to alter the image on the screen depending on the orientation and position of the screen.
The CPU (9) also includes the program required to alter the image on the screen in accordance with input from the user via the stylus (3) and the function buttons (11).
The CPU (9) also receives electrical signals from potentiometers in the ball joints (5, 6) and pivot points (7) of the arm (4). This provides the CPU (9) with information as to the actual position and orientation of the screen (2), which is essential for the required virtual model manipulation. Sensors other than potentiometers may also be used to give positional/orientation data, for example remote sensors as discussed previously.
In some embodiments the CPU may be sited on the screen itself. As an example the screen could be a Palm Pilot™ or some other compact device with its own CPU.
Aspects of the present invention have been described by way of example only and it should be appreciated that modifications and additions may be made thereto without departing from the scope of the appended claims.

Claims

WHAT I/WE CLAIM IS:
1. A method of operating a virtual model manipulation system wherein the virtual model manipulation system includes
a moveable visual display device, and
a means to directly provide the user with the choice of reference point,
the method of operating the virtual model manipulation system characterised by the step of displaying different orientations and/or views of a 3-dimensional model on the visual display device,
wherein the orientation and/or views displayed are associated with the orientation and/or position of the visual display device.
2. A method as claimed in claim 1, wherein the visual display device is a screen.
3. A method as claimed in claim 2 wherein the screen is capable of appearing substantially transparent.
4. A method as claimed in any one of claims 2 to 3 wherein the screen is curved.
5. A method as claimed in any one of claims 2 to 4 wherein the screen is flexible.
6. A method as claimed in any one of claims 1 to 5 wherein the visual display device is attached to a multi-jointed arm.
7. A method as claimed in any one of claims 1 to 6 wherein the visual display device is capable of rotation and movement on three separate axes.
8. A method as claimed in any one of claims 1 to 7 wherein the visual display device includes a handle to assist the user to move the visual display device.
9. A method as claimed in any one of claims 1 to 8 wherein the visual display device is a touch screen.
10. A method as claimed in any one of claims 1 to 9 characterised by the further step of creating a reference point by touching the visual display device.
11. A method as claimed in claim 10 characterised by the further step of manipulating the image on the visual display device once the reference point is chosen.
12. A method as claimed in any one of claims 2 to 11 wherein the visual display device includes a function button(s) attached to the visual display device.
13. A method as claimed in any one of claims 1 to 12 characterised by the step of displaying on the visual display device a view of the 3-dimensional model being passed through at a distance and angle corresponding to the physical location of the visual display device.
14. A method as claimed in any one of claims 1 to 13 characterised by the further step of displaying an image on the visual display device so that it appears to tilt with the physical tilting of the visual display device.
15. A method as claimed in any one of claims 1 to 14 characterised by the further step of constructing a 3-dimensional model by building an image which corresponds to the movement of the visual display device.
16. A method as claimed in any one of claims 1 to 15 characterised by the further step of being able to sculpt a 3-dimensional image with the visual display device and model manipulating means.
17. A virtual model manipulation system including
a moveable visual display device, and
a means to directly provide the user with the choice of reference point, the virtual model manipulation system characterised in that
the moveable visual display device displays different orientations and/or views of a 3-dimensional model associated with the orientation and/or position of the visual display device.
18. A system as claimed in claim 17 wherein the visual display device is a screen.
19. A system as claimed in claim 18 wherein the screen is capable of appearing substantially transparent.
20. A system as claimed in any one of claims 18 to 19 wherein the screen is curved.
21. A system as claimed in any one of claims 18 to 20 wherein the screen is flexible.
22. A system as claimed in any one of claims 17 to 21 wherein the visual display device is attached to a multi-jointed arm.
23. A system as claimed in any one of claims 18 to 22 wherein the visual display device is capable of rotation and movement on three separate axes.
24. A system as claimed in any one of claims 17 to 23 wherein the visual display device includes a handle to assist the user to move the screen.
25. A system as claimed in any one of claims 17 to 24 wherein the visual display device is a touch screen.
26. A system as claimed in any one of claims 17 to 25 characterised by the capability of creating a reference point by touching the visual display device.
27. A system as claimed in claim 26 characterised by the capability of manipulating the image on the visual display device once the reference point is chosen.
28. A system as claimed in any one of claims 17 to 27 wherein the visual display device includes a function button attached to the visual display device.
29. A system as claimed in any one of claims 17 to 28 characterised by the capability of displaying on the visual display device a view of the 3-dimensional model being passed through at a distance and angle corresponding to the physical location of the visual display device.
30. A system as claimed in any one of claims 17 to 29 characterised by the capability of displaying an image on the visual display device so that it appears to tilt with the physical tilting of the visual display device.
31. A system as claimed in any one of claims 17 to 30 characterised by the capability of constructing a 3-dimensional model by building an image which corresponds to the movement of the visual display device.
32. A system as claimed in any one of claims 17 to 31 characterised by the capability of being able to sculpt a 3-dimensional image with the visual display device and model manipulating means.
33. A method as claimed in any one of claims 1 to 5 or claims 7 to 16 wherein the position of the visual display device is remotely sensed.
34. A system as claimed in any one of claims 17 to 21 or claims 23 to 32 wherein the position of the visual display device is remotely sensed.
35. A method of operation of a virtual model manipulation system substantially as herein described with reference to and as illustrated by the accompanying drawings.
36. A virtual model manipulation system substantially as herein described with reference to and as illustrated by the accompanying drawings.
37. Media which carries instructions for the operation of the virtual model manipulation system substantially as herein claimed and/or described.
PCT/NZ2000/000145 1999-08-02 2000-08-02 Virtual model manipulation WO2001013337A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU63267/00A AU6326700A (en) 1999-08-02 2000-08-02 Virtual model manipulation
NZ516991A NZ516991A (en) 1999-08-02 2000-08-02 Virtual model manipulation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NZ337027 1999-08-02
NZ33702799 1999-08-02

Publications (1)

Publication Number Publication Date
WO2001013337A1 true WO2001013337A1 (en) 2001-02-22

Family

ID=19927420

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NZ2000/000145 WO2001013337A1 (en) 1999-08-02 2000-08-02 Virtual model manipulation

Country Status (2)

Country Link
AU (1) AU6326700A (en)
WO (1) WO2001013337A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4984179A (en) * 1987-01-21 1991-01-08 W. Industries Limited Method and apparatus for the perception of computer-generated imagery
JPH08263698A (en) * 1995-03-20 1996-10-11 Matsushita Electric Ind Co Ltd Environmental experience simulator
US5615132A (en) * 1994-01-21 1997-03-25 Crossbow Technology, Inc. Method and apparatus for determining position and orientation of a moveable object using accelerometers
US5764217A (en) * 1995-01-23 1998-06-09 International Business Machines Corporation Schematic guided control of the view point of a graphics processing and display system
US6057810A (en) * 1996-06-20 2000-05-02 Immersive Technologies, Inc. Method and apparatus for orientation sensing


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DATABASE WPI Derwent World Patents Index; Class T01, AN 1996-510501/51 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003088026A1 (en) * 2002-04-10 2003-10-23 Technische Universiteit Delft Apparatus for storage and reproduction of image data
US7324121B2 (en) 2003-07-21 2008-01-29 Autodesk, Inc. Adaptive manipulators
DE102007050060A1 (en) * 2007-10-19 2009-04-23 Dräger Medical AG & Co. KG Device and method for issuing medical data
US9133975B2 (en) 2007-10-19 2015-09-15 Dräger Medical GmbH Device and process for the output of medical data
DE102007050060B4 (en) * 2007-10-19 2017-07-27 Drägerwerk AG & Co. KGaA Device and method for issuing medical data

Also Published As

Publication number Publication date
AU6326700A (en) 2001-03-13

Similar Documents

Publication Publication Date Title
US11723734B2 (en) User-interface control using master controller
Mine Virtual environment interaction techniques
Weimer et al. A synthetic visual environment with hand gesturing and voice input
Ware Using hand position for virtual object placement
Satriadi et al. Augmented reality map navigation with freehand gestures
US8547328B2 (en) Methods, apparatus, and article for force feedback based on tension control and tracking through cables
Poston et al. Dextrous virtual work
Song et al. WYSIWYF: exploring and annotating volume data with a tangible handheld device
US5973678A (en) Method and system for manipulating a three-dimensional object utilizing a force feedback interface
EP3217910B1 (en) Interaction between user-interface and master controller
US20080010616A1 (en) Spherical coordinates cursor, mouse, and method
Leibe et al. The perceptive workbench: Toward spontaneous and natural interaction in semi-immersive virtual environments
US8203529B2 (en) Tactile input/output device and system to represent and manipulate computer-generated surfaces
Withana et al. ImpAct: Immersive haptic stylus to enable direct touch and manipulation for surface computing
JP2003085590A (en) Method and device for operating 3d information operating program, and recording medium therefor
WO2006047018A2 (en) Input device for controlling movement in a three dimensional virtual environment
Bai et al. 3D gesture interaction for handheld augmented reality
Poston et al. The virtual workbench: Dextrous VR
Ware et al. Frames of reference in virtual object rotation
Stellmach et al. Investigating Freehand Pan and Zoom.
Tseng et al. EZ-Manipulator: Designing a mobile, fast, and ambiguity-free 3D manipulation interface using smartphones
Bai et al. Asymmetric Bimanual Interaction for Mobile Virtual Reality.
WO2001013337A1 (en) Virtual model manipulation
Mahdikhanlou et al. Object manipulation and deformation using hand gestures
Kim et al. A tangible user interface with multimodal feedback

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 516991

Country of ref document: NZ

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWE Wipo information: entry into national phase

Ref document number: 10048618

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 516991

Country of ref document: NZ

122 Ep: pct application non-entry in european phase
WWG Wipo information: grant in national office

Ref document number: 516991

Country of ref document: NZ

NENP Non-entry into the national phase

Ref country code: JP