WO2001013337A1 - Virtual model manipulation - Google Patents
- Publication number
- WO2001013337A1 (PCT/NZ2000/000145)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display device
- visual display
- screen
- virtual model
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1601—Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8082—Virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/161—Indexing scheme relating to constructional details of the monitor
- G06F2200/1612—Flat panel monitor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
Definitions
- This invention relates to virtual model manipulation.
- a major problem is that an operator cannot physically touch a 3D model.
- VR helmets have still not perfected stereoscopic vision and the users frequently suffer from headaches.
- Another problem with helmets is that there is no easy way to select a reference point from which to view a model.
- helmets are a physical encumbrance and require the user to be totally immersed. While this may be an advantage for entertainment applications, it is not desirable for work applications where the user also prefers to have external interactions with his/her environment.
- Two-dimensional digitisers are devices which can enable spatial data to be readily input into a computer system, but the digitisers themselves cannot be used for the actual manipulation of the data once in the system.
- a 3-dimensional digitiser has recently been produced which can measure the position and orientation of a stylus tip in 3-dimensional space. This is known as the 3DRAW™ digitising system, details of which can be found at the website www.yr.web.com/WEB/PRODUCTS/TRACKER.HTM.
- a problem with this system is that although it allows 3-dimensional models to be accurately recorded within a computer system, it does not allow for ready manipulation of these models once entered.
- a product sold under the trade mark Microscribe 3D by Immersion, as advertised on website www.yrweb.com/WEB/PRODUCTS/DIGITISE.HTM, is another digitiser.
- a problem with this device is the difficulty the user has in accessing a particular point of reference illustrated on the computer screen.
- the user has to either create zoom windows or enter co-ordinates. There is no intuitive or fast means by which this can be achieved.
- This is of particular disadvantage when the user is wanting to use the model with organic (and therefore relatively unstructured) subject matter. For example, a medical person may wish to use a modelling system to follow a vein or assess the position of a tumour.
- a virtual model manipulation system including
- the moveable visual display device displays different orientations and/or views of a 3-dimensional model associated with the orientation and/or position of the visual display device.
- the method of operation is characterised by displaying different orientations and/or views of a 3-dimensional model on the visual display device, wherein the orientation and/or views displayed are associated with the orientation and/or position of visual display device.
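The association claimed here — the displayed view tracking the screen's physical pose — can be sketched as a view transform. This is a minimal illustration, not part of the patent disclosure: the function name `view_matrix` and the pose representation (position plus normal and up vectors) are assumptions for the sketch.

```python
import numpy as np

def view_matrix(screen_pos, screen_normal, screen_up):
    """Build a 4x4 world-to-screen view matrix from the tracked pose of
    the display: the model stays fixed in world space, so moving the
    screen changes only this matrix (and hence the rendered view)."""
    z = np.asarray(screen_normal, float)
    z = z / np.linalg.norm(z)
    x = np.cross(np.asarray(screen_up, float), z)
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    rot = np.stack([x, y, z])                        # rows: screen axes in world coords
    m = np.eye(4)
    m[:3, :3] = rot
    m[:3, 3] = -rot @ np.asarray(screen_pos, float)  # express world points in screen frame
    return m

# Moving the screen one unit toward a fixed model point brings that
# point one unit closer in screen coordinates.
m_far = view_matrix([0, 0, 5], [0, 0, 1], [0, 1, 0])
m_near = view_matrix([0, 0, 4], [0, 0, 1], [0, 1, 0])
point = np.array([0, 0, 0, 1.0])
```

Feeding `m_near` instead of `m_far` to the renderer is what makes the model appear to approach as the screen is pushed forward.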
- the media may be any suitable media and can include compact disc, microchip, hard drive and so forth.
- the visual display device may be any visual device that can represent a 3- dimensional model. This may be a hologram, CRT screen or any other suitable device.
- the visual display device will now be referred to as a screen.
- the screen is capable of appearing substantially transparent.
- the screen will use a liquid crystal display (LCD).
- the screen may not necessarily be a flat screen.
- the screen could be curved giving a greater 3-dimensional quality to any image displayed on the screen.
- the screen may be flexible and can be manipulated to follow generally the contours of the model being displayed.
- the screen may move by a number of means.
- the screen is attached to a multi-jointed arm which can give a number of degrees of freedom - four at a minimum.
- Some embodiments may include a handle attached to the screen allowing the user to readily move the screen.
- the screen may be independent of any physical encumbrance such as an arm.
- the screen may transmit its position, or have its position remotely sensed by the model manipulation system.
- triangulation methods may be used, utilising such methods as magnetic, acoustic, visual or GPS.
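The acoustic or magnetic triangulation mentioned above can be sketched, in two dimensions for brevity, as solving for the screen position from measured distances to fixed beacons. This is a hypothetical illustration (the function name `trilaterate_2d` and the beacon layout are assumptions), not a method specified in the patent.

```python
import numpy as np

def trilaterate_2d(beacons, distances):
    """Recover a position from distances to three fixed beacons (e.g.
    acoustic time-of-flight): subtracting one circle equation from the
    other two leaves a linear system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)

beacons = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
true_pos = np.array([1.0, 2.0])
dists = [np.linalg.norm(true_pos - np.array(b)) for b in beacons]
est = trilaterate_2d(beacons, dists)
```

A full 3D pose would need more beacons and an orientation sensor, but the linearisation step is the same.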
- the reference point can be chosen by the user.
- Traditional systems are cumbersome and rely on the creation of zoom windows or the entering of co-ordinates.
- the choice of reference point is made intuitively.
- the screen is a touch screen and touching the screen (whether by finger, stylus, laser pointer or some other means) provides the user with a reference point.
- the user can then manipulate the image on the screen. For example, the user may touch the screen at a particular point and then select the reference point touched to then manipulate the image.
- the selection and possibly manipulation may be via function buttons attached to the screen.
- the stylus may have function buttons, or there may be a separate input device for this (e.g. keyboard).
- the image may be that of a person's face.
- a touch of the stylus on the end of the nose sets up a direct reference point. Selecting the reference point via a function button and then moving the stylus can then cause the nose to change shape on the screen as manipulated by the user. It can be seen that this is an intuitive way to approach model manipulation.
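One way the touch-to-reference-point step described above could work is to cast a ray from the touched screen location into the model. The sketch below is an assumption, not the patent's method: the function name `pick_reference_point` is invented, and a sphere stands in for the model surface purely to keep the intersection test short.

```python
import numpy as np

def pick_reference_point(touch_xy, screen_origin, screen_x, screen_y,
                         screen_normal, sphere_center, sphere_radius):
    """Cast a ray from the touched point on the screen plane along the
    screen normal; return the nearest hit on a sphere standing in for
    the model surface (the selected reference point), or None."""
    origin = (np.asarray(screen_origin, float)
              + touch_xy[0] * np.asarray(screen_x, float)
              + touch_xy[1] * np.asarray(screen_y, float))
    d = np.asarray(screen_normal, float)
    d = d / np.linalg.norm(d)
    oc = origin - np.asarray(sphere_center, float)
    half_b = np.dot(oc, d)
    disc = half_b * half_b - (np.dot(oc, oc) - sphere_radius**2)
    if disc < 0:
        return None                       # touch ray misses the model
    t = -half_b - np.sqrt(disc)           # nearest intersection distance
    return origin + t * d

# Touching the centre of a screen at the origin, with a unit sphere
# five units behind it, picks the nearest point of the sphere.
hit = pick_reference_point((0.0, 0.0), [0, 0, 0], [1, 0, 0], [0, 1, 0],
                           [0, 0, 1], [0, 0, 5], 1.0)
```

With the reference point in hand, subsequent stylus motion can be applied as a deformation anchored there, as in the nose example.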
- the present invention is characterised in that the screen displays different orientations of the model which are associated with the actual orientation and/or position of the screen itself. This provides a considerable amount of intuitiveness when dealing with a 3-dimensional model.
- the screen may be moved forward and the display on the screen may show the model being passed through at the distance and angle corresponding to the position of the screen.
- the 3-dimensional image may be a face. Initially the screen may display the face as if the face is at a distance from the user. The user may then move the screen forward and it appears on the screen that the screen is approaching the face. Moving the screen further forward still may provide an image on the screen which looks like the screen is actually passing through the face. A cross-sectional outline on the screen would represent model objects which are virtually bisected by the screen.
- Tilting the screen from one plane to another can cause the image on the screen to likewise tilt giving the impression that the model is in reality just behind the screen and the screen is merely a window through which the user can look at the model.
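The cross-sectional outline produced when the screen virtually bisects the model can be sketched as a plane-mesh intersection. This is an illustrative assumption (the function name `cross_section_segments` and the triangle-mesh representation are not specified in the patent):

```python
import numpy as np

def cross_section_segments(triangles, plane_point, plane_normal):
    """Return the line segments where a triangle mesh crosses the
    screen plane -- the cross-sectional outline drawn when the screen
    virtually passes through the model."""
    n = np.asarray(plane_normal, float)
    p0 = np.asarray(plane_point, float)
    segments = []
    for tri in triangles:
        pts = []
        for a, b in ((0, 1), (1, 2), (2, 0)):
            va, vb = np.asarray(tri[a], float), np.asarray(tri[b], float)
            da, db = np.dot(va - p0, n), np.dot(vb - p0, n)
            if da * db < 0:               # edge endpoints on opposite sides
                t = da / (da - db)        # interpolate the crossing point
                pts.append(va + t * (vb - va))
        if len(pts) == 2:
            segments.append((pts[0], pts[1]))
    return segments

# One triangle straddling the z=0 screen plane yields one outline segment.
tri = [[(0, 0, -1), (0, 0, 1), (0, 2, 1)]]
segs = cross_section_segments(tri, (0, 0, 0), (0, 0, 1))
```

Re-running this each time the screen pose changes keeps the outline consistent with the "window onto the model" effect described above.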
- This feature of the present invention overcomes a number of the problems associated with the prior art and in addition provides new features in the field of virtual model manipulation.
- the present invention obviates the need for VR helmets and their problems such as a high degree of latency, poor stereoscopic vision, the requirement to move the user's head to select a reference point and the requirement to be totally immersed while in a work environment.
- the present invention provides the user with a fast and intuitive means by which to locate and view a 3-dimensional model by its feature of moving a visual screen and having the display on the screen give the impression that the screen is moving through and/or around the model.
- the 3-dimensional model may be a medical model, scanned in from a patient.
- a surgeon can follow various organic pathways through the model (for example veins) and get a greater appreciation of the linkages of component parts within that model.
- the 3-dimensional model may be of a building including various components such as walls, structural columns, pipes, wiring, joists and the like.
- a particular contractor may choose its specialty (say pipes) and track the positioning of the pipes within the building using the present invention.
- the contractor may use the present invention to actually place the pipes within the model of the building, this procedure being enhanced by the ability to view the building from all angles.
- the physical interface provided by the present invention from the user to the computer model can also be used in the construction of 3-dimensional models.
- the user may wish to construct an organic model such as a tree.
- the user may place the screen on a substantially horizontal plane, draw (say) a circle and then move the screen upwards, causing the circle to form into a cylinder using the image manipulation tools (such as function buttons).
- the user may wish to create a branch. This can involve the user repositioning the screen at an angle to the cylinder already formed and again creating a series of circles with differing diameters at an angle from the main cylinder or trunk - thus resulting in a tapering branch.
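The circle-to-cylinder construction described in the two bullets above amounts to lofting a sequence of circles along the screen's motion. A minimal sketch, with the function name `loft_circles` and the ring-vertex representation as assumptions:

```python
import numpy as np

def loft_circles(centers, radii, segments=16):
    """Loft circles (drawn on the screen plane at successive screen
    positions) into rings of surface vertices: equal radii give a
    cylinder, shrinking radii give a tapering branch."""
    angles = np.linspace(0, 2 * np.pi, segments, endpoint=False)
    rings = []
    for (cx, cy, cz), r in zip(centers, radii):
        ring = np.stack([cx + r * np.cos(angles),
                         cy + r * np.sin(angles),
                         np.full(segments, cz)], axis=1)
        rings.append(ring)
    return np.array(rings)        # shape: (n_circles, segments, 3)

# Trunk: two equal circles one screen-height apart -> a cylinder.
trunk = loft_circles([(0, 0, 0.0), (0, 0, 1.0)], [1.0, 1.0])
# Branch: circles of shrinking diameter -> a tapering surface.
branch = loft_circles([(0, 0, 0.0), (0, 0, 1.0)], [0.5, 0.1])
```

Joining consecutive rings with quads (or triangle pairs) would complete the surface; angling the screen before drawing each circle produces the off-axis branch.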
- the present invention can also be used to intuitively sculpt a 3-dimensional model.
- the original image may be a solid block.
- the user may move the screen up to the solid block and then use a model manipulation stylus to carve off pieces from that block to form a sculpture.
- the present invention can also be used in the entertainment industry.
- the imagery on the screen may be similar to that used in present day computer games.
- the screen may be used as an equivalent to a joystick allowing the user to effectively move through a scene and play the game.
- Figure 1 is a diagrammatic view of one possible embodiment of the present invention.
- Hardware, generally indicated by arrow 1, associated with a virtual model manipulation system in accordance with the present invention is illustrated in Figure 1.
- the hardware (1) includes a user interactive area (10) which includes a screen (2), handle (12) and function buttons (11).
- the screen (2) is a transparent LCD screen.
- a stylus (3) attached to the screen (2) can be used to select reference points of images displayed on the screen (2). In other embodiments, the stylus need not be attached.
- the screen (2) is attached to an arm generally indicated by arrow 4.
- the arm (4) has fully moveable ball joints (5 and 6) along with a pivot point (7).
- the ball joints (5, 6) and pivot point (7) provide the screen (2) with six degrees of freedom in its movement.
- the handle (12) on the screen (2) can be readily grasped by the user to move the screen (2).
- a base (8) ensures that the arm (4) is substantially stable when in use.
- the screen (2) is electrically connected via the arm (4) to a central processing unit (9).
- the central processing unit (CPU) (9) contains all of the software required to alter the image on the screen depending on the orientation and/or position of the screen.
- the CPU (9) also includes the program required to alter the image on the screen in accordance with input via the user using the stylus (3) and the function buttons (11).
- the CPU (9) also receives electrical signals from potentiometers in the ball joints (5, 6) and pivot point (7) of the arm (4). This provides the CPU (9) with information as to the actual position and orientation of the screen (2), which is essential for the required virtual model manipulation. Sensors other than potentiometers may also be used to give positional/orientation data, for example remote sensors as discussed previously.
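Recovering the screen pose from the joint potentiometer readings is a forward-kinematics computation. The sketch below is a simplified planar version (the real arm has six degrees of freedom); the function name `screen_pose` and the angle/length inputs are assumptions for illustration.

```python
import math

def screen_pose(joint_angles, link_lengths):
    """Planar forward kinematics: accumulate each joint's potentiometer
    angle and link offset to recover the screen's position and heading
    at the end of the arm."""
    x = y = 0.0
    heading = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        heading += angle          # each joint rotates everything after it
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return (x, y), heading

# Two unit links: first joint bent 90 degrees, second bent back -90.
(x, y), heading = screen_pose([math.pi / 2, -math.pi / 2], [1.0, 1.0])
```

The CPU would run this (in full 3D) on every sensor update, then feed the resulting pose into the view transform that drives the display.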
- the CPU may be sited on the screen itself.
- the screen could be a Palm PilotTM or some other compact device with its own CPU.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU63267/00A AU6326700A (en) | 1999-08-02 | 2000-08-02 | Virtual model manipulation |
NZ516991A NZ516991A (en) | 1999-08-02 | 2000-08-02 | Virtual model manipulation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NZ337027 | 1999-08-02 | ||
NZ33702799 | 1999-08-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2001013337A1 true WO2001013337A1 (en) | 2001-02-22 |
Family
ID=19927420
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/NZ2000/000145 WO2001013337A1 (en) | 1999-08-02 | 2000-08-02 | Virtual model manipulation |
Country Status (2)
Country | Link |
---|---|
AU (1) | AU6326700A (en) |
WO (1) | WO2001013337A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2003088026A1 (en) * | 2002-04-10 | 2003-10-23 | Technische Universiteit Delft | Apparatus for storage and reproduction of image data |
US7324121B2 (en) | 2003-07-21 | 2008-01-29 | Autodesk, Inc. | Adaptive manipulators |
DE102007050060A1 (en) * | 2007-10-19 | 2009-04-23 | Dräger Medical AG & Co. KG | Device and method for issuing medical data |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4984179A (en) * | 1987-01-21 | 1991-01-08 | W. Industries Limited | Method and apparatus for the perception of computer-generated imagery |
JPH08263698A (en) * | 1995-03-20 | 1996-10-11 | Matsushita Electric Ind Co Ltd | Environmental experience simulator |
US5615132A (en) * | 1994-01-21 | 1997-03-25 | Crossbow Technology, Inc. | Method and apparatus for determining position and orientation of a moveable object using accelerometers |
US5764217A (en) * | 1995-01-23 | 1998-06-09 | International Business Machines Corporation | Schematic guided control of the view point of a graphics processing and display system |
US6057810A (en) * | 1996-06-20 | 2000-05-02 | Immersive Technologies, Inc. | Method and apparatus for orientation sensing |
2000
- 2000-08-02 WO PCT/NZ2000/000145 patent/WO2001013337A1/en active IP Right Grant
- 2000-08-02 AU AU63267/00A patent/AU6326700A/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4984179A (en) * | 1987-01-21 | 1991-01-08 | W. Industries Limited | Method and apparatus for the perception of computer-generated imagery |
US5615132A (en) * | 1994-01-21 | 1997-03-25 | Crossbow Technology, Inc. | Method and apparatus for determining position and orientation of a moveable object using accelerometers |
US5764217A (en) * | 1995-01-23 | 1998-06-09 | International Business Machines Corporation | Schematic guided control of the view point of a graphics processing and display system |
JPH08263698A (en) * | 1995-03-20 | 1996-10-11 | Matsushita Electric Ind Co Ltd | Environmental experience simulator |
US6057810A (en) * | 1996-06-20 | 2000-05-02 | Immersive Technologies, Inc. | Method and apparatus for orientation sensing |
Non-Patent Citations (1)
Title |
---|
DATABASE WPI Derwent World Patents Index; Class T01, AN 1996-510501/51 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2003088026A1 (en) * | 2002-04-10 | 2003-10-23 | Technische Universiteit Delft | Apparatus for storage and reproduction of image data |
US7324121B2 (en) | 2003-07-21 | 2008-01-29 | Autodesk, Inc. | Adaptive manipulators |
DE102007050060A1 (en) * | 2007-10-19 | 2009-04-23 | Dräger Medical AG & Co. KG | Device and method for issuing medical data |
US9133975B2 (en) | 2007-10-19 | 2015-09-15 | Dräger Medical GmbH | Device and process for the output of medical data |
DE102007050060B4 (en) * | 2007-10-19 | 2017-07-27 | Drägerwerk AG & Co. KGaA | Device and method for issuing medical data |
Also Published As
Publication number | Publication date |
---|---|
AU6326700A (en) | 2001-03-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11723734B2 (en) | User-interface control using master controller | |
Mine | Virtual environment interaction techniques | |
Weimer et al. | A synthetic visual environment with hand gesturing and voice input | |
Ware | Using hand position for virtual object placement | |
Satriadi et al. | Augmented reality map navigation with freehand gestures | |
US8547328B2 (en) | Methods, apparatus, and article for force feedback based on tension control and tracking through cables | |
Poston et al. | Dextrous virtual work | |
Song et al. | WYSIWYF: exploring and annotating volume data with a tangible handheld device | |
US5973678A (en) | Method and system for manipulating a three-dimensional object utilizing a force feedback interface | |
EP3217910B1 (en) | Interaction between user-interface and master controller | |
US20080010616A1 (en) | Spherical coordinates cursor, mouse, and method | |
Leibe et al. | The perceptive workbench: Toward spontaneous and natural interaction in semi-immersive virtual environments | |
US8203529B2 (en) | Tactile input/output device and system to represent and manipulate computer-generated surfaces | |
Withana et al. | ImpAct: Immersive haptic stylus to enable direct touch and manipulation for surface computing | |
JP2003085590A (en) | Method and device for operating 3d information operating program, and recording medium therefor | |
WO2006047018A2 (en) | Input device for controlling movement in a three dimensional virtual environment | |
Bai et al. | 3D gesture interaction for handheld augmented reality | |
Poston et al. | The virtual workbench: Dextrous VR | |
Ware et al. | Frames of reference in virtual object rotation | |
Stellmach et al. | Investigating Freehand Pan and Zoom. | |
Tseng et al. | EZ-Manipulator: Designing a mobile, fast, and ambiguity-free 3D manipulation interface using smartphones | |
Bai et al. | Asymmetric Bimanual Interaction for Mobile Virtual Reality. | |
WO2001013337A1 (en) | Virtual model manipulation | |
Mahdikhanlou et al. | Object manipulation and deformation using hand gestures | |
Kim et al. | A tangible user interface with multimodal feedback |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 516991 Country of ref document: NZ |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10048618 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 516991 Country of ref document: NZ |
|
122 | Ep: pct application non-entry in european phase | ||
WWG | Wipo information: grant in national office |
Ref document number: 516991 Country of ref document: NZ |
|
NENP | Non-entry into the national phase |
Ref country code: JP |