EP1776659A2 - Method of enabling to model virtual objects - Google Patents
Info
- Publication number
- EP1776659A2 (application EP05776513A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- shape
- location
- pressure
- user
- touch screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
Definitions
- The invention relates to a data processing system with a display monitor for rendering a virtual object, and with a touch screen for enabling a user to interact with the rendered object.
- The invention further relates to a method and to control software for enabling a user to model the shape of a virtual object rendered on a display monitor having a touch screen.
- Video games, graphics games and other computer-related entertainment software applications have become increasingly widespread, and are currently used even on mobile phones.
- Players use animated graphical representations, known as avatars, as their representatives in a virtual environment.
- Dedicated devices are being marketed as electronic pet toys, e.g., the Tamagotchi: a rearing game wherein the user has to take care of a virtual animal rendered on a display monitor.
- This patent document discloses making a graphics model of a physical object shaped as, e.g., an elephant, by using bitmap silhouettes of the physical model in different orientations to carve away voxels from a voxel block.
- US patent publication 2002/0089500, filed by Jennings et al. for SYSTEMS AND METHODS OF THREE-DIMENSIONAL MODELING, discloses systems and methods for modifying a virtual object stored within a computer. The systems and methods allow virtual object modifications that would otherwise be computationally inconvenient.
- The virtual object is represented as a volumetric model.
- A portion of the volumetric model is converted into an alternative representation.
- The alternative representation can have a different number of dimensions from the volumetric representation.
- A stimulus is applied to the alternative representation, for example by a user employing a force-feedback haptic interface.
- The response of the alternative representation to the stimulus is calculated.
- The change in shape of the virtual object is determined from the response of the alternative representation.
- The representations of the virtual object can be displayed for the user at any time.
- The user can be provided with a force-feedback response. Multiple stimuli can be applied in succession. Multiple alternative representations can be employed in the system and method.
- The inventors propose a system and a method for creating or shaping a virtual model that can be used as an alternative, or in addition, to the known systems and methods discussed above.
- A data processing system is provided with a display monitor for rendering a virtual object, and with a touch screen for enabling a user to interact with the rendered object.
- The system is operative to enable the user to modify the shape of the object at a first location on the object.
- The shape is modified under control of the magnitude of a pressure registered at a second location on the touch screen, which substantially coincides with the first location when viewed through the touch screen in operational use of the system.
- The Jennings document referred to above neither teaches nor suggests using the touch screen as if it itself physically represented the surface of the object.
- The object is manually shaped by the user applying pressure to a location on the touch screen that corresponds to, or coincides with, a specific part of the object's displayed surface.
- Input devices such as a computer mouse, joystick or touch screen are used as equivalent alternatives to interact with tools graphically represented through the user-interactive software application.
- Gradations in shaping the object can be achieved simply by re-scaling (magnifying or reducing) the image of the object rendered on the display monitor.
- As the touch screen physically represents the object, feedback to the user can be limited to visual feedback only, as if he/she were molding a chunk of clay.
- The object's shape continues to be modified only if the pressure, as registered by the touch screen, increases. Lowering the pressure at the same location leaves the shape as it was at the time of the maximum pressure value. That is, the shape responds to a change in pressure at a location perceived by the user to correspond to, and coincide with, the image of the object, which provides a more direct and intuitive user interface than the one used in Jennings.
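The clay-like behavior described above, where the shape only deepens while the registered pressure exceeds its previous maximum, can be sketched as follows. This is a minimal illustration; the per-sample loop and the linear gain `k` are assumptions, not part of the disclosure.

```python
def clay_deformation(pressure_samples, k=0.5):
    """Clay-like response at one touch location: the dent only deepens
    while the registered pressure exceeds its previous maximum, and
    releasing the pressure leaves the shape as it was at the peak.
    The linear gain k is an illustrative assumption."""
    deformation = 0.0
    max_pressure = 0.0
    trace = []
    for p in pressure_samples:
        if p > max_pressure:        # shape changes only while pressure rises
            max_pressure = p
            deformation = k * max_pressure
        trace.append(deformation)   # lowering the pressure changes nothing
    return trace

# Pressure rises to 8, then falls back to 2: the dent stays at its deepest.
print(clay_deformation([0, 4, 8, 5, 2]))  # [0.0, 2.0, 4.0, 4.0, 4.0]
```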
- Rendering the virtual object as if the corresponding physical object were put under proper illumination conditions may enhance the visual feedback.
- The resulting shadows, and changes therein during user interaction with the virtual object, are then similar to those experienced if the user were handling the corresponding physical object in reality.
- The touch screen registers the user's hand as it approaches, so as to be able to generate an artificial shadow of the hand on the virtual object, enhancing the visual impression.
- The system of the invention allows programming a relationship between the level of deformation of the shape on the one hand, and the magnitude of the applied pressure on the other.
- This can be used, e.g., to program or simulate physical or material properties such as the elasticity or rigidity of a physical object corresponding to the virtual object.
- This relationship may take into account the scale of the image of the object.
- Pressure is force per unit area. The force is applied by the user to an area of the touch screen on the order of magnitude of a fingertip's surface. Upon re-scaling the object as displayed, the same force is mapped onto a larger or smaller area of the displayed object.
- The virtual pressure applied to the virtual object therefore depends on the scale at which the object is displayed, and the above relationship may be programmed, or programmable, to take these scaling effects into account.
- Refinements may include, for example, giving a non-linear character to the pressure-versus-deformation relationship in order to model the increasing resistance of physical materials to increasing compression.
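The scale dependence and the non-linear refinement can be sketched together. The zoom-squared area mapping, the saturating exponential law, and all constants below are illustrative assumptions, not taken from the disclosure.

```python
import math

FINGERTIP_AREA = 1.0  # contact area on the screen, arbitrary units (assumed)

def virtual_pressure(force, zoom):
    """Scale-aware virtual pressure: zooming out (zoom < 1) maps the
    fingertip onto a larger patch of the object, so the same force
    yields a lower virtual pressure; zooming in does the opposite."""
    mapped_area = FINGERTIP_AREA / (zoom ** 2)  # area scales with zoom squared
    return force / mapped_area

def deformation(p, d_max=5.0, p_sat=10.0):
    """Non-linear pressure/deformation law: the response saturates at
    d_max, mimicking a material that increasingly resists compression.
    The exponential form and the constants are illustrative assumptions."""
    return d_max * (1.0 - math.exp(-p / p_sat))

# The same 2.0-unit force presses four times harder at 2x magnification.
print(virtual_pressure(2.0, 1.0), virtual_pressure(2.0, 2.0))  # 2.0 8.0
```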
- The system has provisions enabling the touch screen to be used for modeling the virtual object by pulling at the object as well as by pushing at it. That is, the system has a further operational mode wherein the shape of the virtual object responds to a decrease of the pressure on the touch screen. For example, the user may increase the pressure at a certain location at a rate faster than a certain threshold.
- The system is programmed to interpret this as meaning that the user wants to pull at the object rather than push. Upon a gentle release of the pressure, the object is deformed as if it were pulled, e.g., in the direction towards the user and at the location corresponding to the area of the touch screen the user is touching.
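The rate-based switch from push to pull can be sketched as follows; the sampling interval `dt` and `rate_threshold` are illustrative assumptions.

```python
def pull_mode_triggered(pressures, dt=0.05, rate_threshold=40.0):
    """Mode-switch sketch: a pressure rise faster than rate_threshold
    (pressure units per second) is read as the user's intent to pull
    rather than push. The constants are illustrative assumptions."""
    rates = [(b - a) / dt for a, b in zip(pressures, pressures[1:])]
    return max(rates, default=0.0) > rate_threshold

# A gentle rise keeps push mode; a sharp jump switches to pull mode.
print(pull_mode_triggered([0, 1, 2, 3]))  # False (max rate 20/s)
print(pull_mode_triggered([0, 5, 10]))    # True  (max rate 100/s)
```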
- The invention also relates to a method of enabling a user to model the shape of a virtual object rendered on a display monitor having a touch screen.
- The shape can be modified at a first location on the object under control of the magnitude of a pressure registered at a second location on the touch screen, substantially coinciding with the first location on the display monitor when viewed through the screen in operational use.
- The method is relevant to, e.g., a service provider on the Internet, or to a multi-user computer game under control of a server that enables, in the virtual world, the kind of interaction discussed above with respect to the system and its features.
- The invention may also be embodied in control software for use on a data processing system with a display monitor and a touch screen.
- The software enables the user interaction and the features described above.
- Fig. 1 is a block diagram of a system in the invention
- Figs. 2-5 illustrate several embodiments of the invention
- Fig. 6 is a flow diagram illustrating a process in the invention
- Figs. 7-9 are diagrams illustrating reversal of the polarity of the deformation.

Throughout the figures, the same reference numerals indicate similar or corresponding features.
- Fig. 1 is a block diagram of a system 100 in the invention.
- System 100 comprises a display monitor 102, and a touch screen 104 arranged so that the user sees the images displayed on monitor 102 through screen 104.
- Touch screen 104 is capable of processing input data representative of the touch location relative to the screen as well as input data representative of a force or pressure that the user exerts on the touch screen in operational use.
- The user input, in the form of a location where the user touches screen 104, corresponds to a specific location of the image displayed on monitor 102.
- System 100 further comprises a data processing sub-system 106, e.g., a PC or another computer, e.g., at a remote location and connected to monitor 102 and touch screen 104 via the Internet or a home network (not shown).
- The above components 102-106 may be integrated in a PC or a handheld device such as a cell phone, a PDA, or a touch-screen remote control.
- Sub-system 106 is operative to process the user input data and to provide the images under control of a software application 108.
- Sub-system 106 may comprise a remote server taking care of the data processing accompanying the intended deformations of the virtual object. Under circumstances this data processing may well be compute-intensive, e.g., in a real-time multi-user computer game.
- Touch screen 104 is configured to register both a touch location and a magnitude of the pressure applied to screen 104 when the user touches screen 104.
- This configuration allows the user input to be considered 3-dimensional: two coordinates that determine a position at the surface of screen 104 and a further coordinate perpendicular to screen 104 represented by a magnitude of the pressure of the touch. This is now being used in the invention to model a virtual object.
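The three-dimensional reading described above might be represented as follows. `TouchSample`, the screen-to-model transform, and `depth_gain` are hypothetical names introduced only for illustration.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    """One touch-screen reading: two coordinates on the screen surface
    plus the pressure, treated as a third coordinate perpendicular to it."""
    x: float
    y: float
    pressure: float

def to_model_space(sample, screen_to_model, depth_gain=0.01):
    """Project a sample into the object's space: (x, y) through an
    assumed screen-to-model transform, and pressure into a depth along
    the viewing direction. depth_gain is an illustrative assumption."""
    mx, my = screen_to_model(sample.x, sample.y)
    return (mx, my, depth_gain * sample.pressure)

s = TouchSample(x=10.0, y=20.0, pressure=50.0)
print(to_model_space(s, lambda x, y: (x / 2, y / 2)))  # (5.0, 10.0, 0.5)
```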
- Figs. 2 and 3 are diagrams illustrating modeling of a virtual object in a virtual pottery application.
- Monitor 102 renders a cylindrical object 202.
- Virtual object 202 is made to rotate around its axis of symmetry 204, which is fixed in (virtual) space. That is, axis 204 is not moved as a result of the user's applying pressure to touch screen 104.
- The user pushes with his/her finger 302 against touch screen 104 at a location coinciding with a location on the surface of object 202.
- Touch screen 104 registers the coordinates of the contact with finger 302 as well as its pressure against screen 104.
- PC 106 receives this data and inputs it to application 108, which generates a modification of the shape of object 202 compliant with the registered coordinates and pressure level. As object 202 is rotating, the modification to the shape has rotational symmetry as well.
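The rotationally symmetric modification can be sketched by storing the object as one radius per height step; this discretization and the `gain` constant are assumptions for illustration.

```python
def press_ring(radii, height_index, pressure, gain=0.01):
    """Virtual pottery sketch: the object is a surface of revolution
    stored as one radius per height step. Because object 202 spins
    around its fixed axis, pressing at one height shrinks that entire
    ring, so the dent is rotationally symmetric. gain is assumed."""
    new_radii = list(radii)
    new_radii[height_index] = max(0.0, new_radii[height_index] - gain * pressure)
    return new_radii

# Pressing halfway up a cylinder of radius 1.0 pinches in its waist.
print(press_ring([1.0, 1.0, 1.0], height_index=1, pressure=50.0))  # [1.0, 0.5, 1.0]
```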
- Figs. 4 and 5 are diagrams illustrating another mode of modeling virtual object 202 rendered at monitor 102.
- Object 202 is not moved as an entity across monitor 102, but only undergoes a deformation as a result of the user's applying pressure to screen 104 at suitable locations.
- The user now applies pressure to touch screen 104 with both the right hand 302 and the left hand 502, at locations coinciding with the image of object 202, as if to locally squeeze object 202. That is, the locations of contact of hands 302 and 502, as well as changes in those locations while applying pressure, define the resulting deformation of object 202.
- Object 202 is deformed at the top on the right-hand side and at the bottom on the left-hand side.
- System 100 also allows the user to move object 202 in its entirety across monitor 102, e.g., to reposition it or to change its orientation with respect to the viewing direction.
- Monitor 102 can display menu options in an area not visually covering object 202.
- Interaction with touch screen 104 is carried out in such a manner that system 100 can discriminate between commands to deform object 202 and commands to change the position or orientation of object 202 as a whole.
- A sweeping movement of the user's hand across screen 104, starting outside the region occupied by object 202, is interpreted as a command to rotate object 202 in the direction of the sweep, around an axis perpendicular to that direction and coinciding with, e.g., a (virtual) center of mass of object 202 that itself remains fixed in the virtual environment.
- The rotation continues as long as the user is contacting the screen and moving his/her hand.
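The sweep-to-rotation mapping might be computed as follows; returning the axis as a unit vector in the screen plane is an assumed convention, and the axis is imagined to pass through the object's fixed (virtual) center of mass.

```python
import math

def sweep_rotation_axis(dx, dy):
    """Axis for the sweep gesture: a unit vector in the screen plane,
    perpendicular to the sweep direction (dx, dy). The object rotates
    around this axis while the sweep continues."""
    length = math.hypot(dx, dy)
    if length == 0.0:
        raise ValueError("sweep has no direction")
    return (-dy / length, dx / length)

# A horizontal sweep to the right rotates around the vertical screen axis.
print(sweep_rotation_axis(1.0, 0.0))  # (-0.0, 1.0)
```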
- Fig. 6 is a flow diagram illustrating a process 600 in the invention.
- In a step 602, touch screen 104 supplies data to PC 106 representative of the location of contact and of the contact pressure.
- In a step 604, it is determined whether the location matches a location on the surface of object 202. If there is no match, application 108 interprets the input as a command for an operation other than a modification of the shape of object 202, in an optional step 606. For example, a succession of coordinates, i.e., an ordered set of coordinates, that does not match object 202 is interpreted as a command to shift object 202 in its entirety in the direction of the vector corresponding to the succession.
- A pressure increase is interpreted as zooming in on the image of object 202.
- A zooming-out operation is initiated, e.g., upon a rate of change in pressure above a certain threshold, or upon the pressure itself exceeding a specific threshold.
- Specific operations other than shape modification may be listed as options in a menu displayed on monitor 102 together with object 202. If the coordinates do match object 202, an optional step 608 checks whether the pressure, or changes therein, indicates a transition to another operation mode, examples of which have been given above. If there is no mode switch, the modification to the shape of object 202 is determined in a step 610, based on the input of step 602, and the modified shape is rendered in a step 612.
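The dispatch of process 600 between steps 604, 606 and 610/612 can be sketched as follows; the callback signatures stand in for application 108 and are assumptions.

```python
def handle_touch(location, pressure, on_object, deform, other_command):
    """Dispatch sketch of process 600: step 604 tests whether the
    contact lies on the object's surface; a miss is routed to step 606
    (a non-shape command such as shifting or zooming), a hit to steps
    610/612 (determine and render the modified shape)."""
    if not on_object(location):
        return other_command(location, pressure)  # step 606
    return deform(location, pressure)             # steps 610 and 612

result = handle_touch((0, 0), 1.0, lambda loc: loc == (0, 0),
                      lambda l, p: "deform", lambda l, p: "other")
print(result)  # deform
```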
- Figs. 7-9 are diagrams illustrating relationships between the pressure "p" applied to touch screen 104 and the resulting deformation "D" of object 202 over a period of time "t".
- In Fig. 7, system 100 is in a first operational mode, wherein the pressure increases over time and the resulting deformation, e.g., the spatial deviation from the original shape, increases likewise, as if object 202 were locally compressed.
- When the pressure is raised above a threshold T, or is raised above threshold T at a rate higher than a certain minimum rate, system 100 interprets this as meaning that the final deformation of object 202 has been reached in this session.
- The deformation then stops, and the pressure can be lowered to zero without the deformation changing.
- Threshold T and the minimum rate are preferably programmable.
- A pressure whose value stays below the threshold may have deformation effects that depend on the material properties programmed. For example, if virtual object 202 is to represent a piece of modeling clay, a decrease of pressure after a rise in pressure will leave the deformation as it was at the instant pressure "p" reached its maximum value (lower than threshold T). If object 202 is to represent a rather elastic or spongy material, a decrease in pressure after the pressure has reached a maximum (below threshold T) results in a decrease of the deformation, though not necessarily instantly, depending on the material properties programmed.
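The two material behaviors on release can be contrasted in a sketch. The step-wise exponential recovery of the spongy material is an assumption; the disclosure only says the decrease need not be instantaneous.

```python
def release_trace(material, peak_deformation, steps=4, recovery=0.5):
    """Deformation after the pressure drops again, for two programmed
    materials from the text: clay keeps the dent at its peak, while a
    spongy material relaxes back gradually. The per-step recovery
    fraction is an illustrative assumption."""
    if material == "clay":
        return [peak_deformation] * steps
    trace, d = [], peak_deformation
    for _ in range(steps):
        d *= (1.0 - recovery)   # partial elastic recovery each step
        trace.append(d)
    return trace

print(release_trace("clay", 2.0))    # [2.0, 2.0, 2.0, 2.0]
print(release_trace("sponge", 2.0))  # [1.0, 0.5, 0.25, 0.125]
```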
- Fig. 8 illustrates a second operational mode of system 100.
- pressure "p" is made to increase quickly above threshold T.
- System 100 interprets this as that the user intends a deformation corresponding to a local expansion, rather than compression of the diagram of Fig. 7.
- pressure p is lowered below threshold T, system 100 controls the local expansion of object 202, e.g., as if equilibrium were being conserved all the time between the internal pressure of object 202 being determined by, on the one hand, the material properties of object 202 programmed, and on the other hand the pressure applied by the user through touch screen 104.
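The equilibrium reading of the second mode might be modeled quasi-statically; the linear relation and `gain` are assumptions standing in for the programmed material properties.

```python
def expansion(internal_pressure, applied_pressure, gain=0.5):
    """Second operational mode as a quasi-static equilibrium: the
    outward bulge grows as the user eases off, balancing the programmed
    internal pressure of object 202 against the pressure still applied
    through the screen. The linear gain is an illustrative assumption."""
    return max(0.0, gain * (internal_pressure - applied_pressure))

# Easing the finger off lets the object bulge outwards further.
print(expansion(10.0, 10.0))  # 0.0
print(expansion(10.0, 4.0))   # 3.0
```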
- The invention can be used, e.g., to create a virtual object for aesthetic purposes; as a toy; as an aid for understanding the behavior of physical objects with specific or programmable material properties; as a template for a physical model to be made through computer-aided manufacturing; as an application in a computer game, to shape the virtual environment or to interact with it and its virtual occupants in operational use; or to have fun during uninspiring video conferences by applying touch-induced conformal mappings to the image of the current speaker displayed on one's PC, etc.
- The latter application may include an instant-reset button for returning to the normal viewing mode, in order to get rid of all too favorable effects that may interfere with the conferencing, as well as an "undo" button to retrieve the result of the last mapping.
- The term "touch screen" as used in this text also includes graphical tablets, e.g., stylus-operated ones. What has been discussed above with regard to touch screens that interact with the user's finger is also applicable to graphical tablets.
Abstract
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05776513A EP1776659A2 (en) | 2004-08-02 | 2005-07-21 | Method of enabling to model virtual objects |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04103705 | 2004-08-02 | ||
PCT/IB2005/052451 WO2006013520A2 (en) | 2004-08-02 | 2005-07-21 | System and method for enabling the modeling virtual objects |
EP05776513A EP1776659A2 (en) | 2004-08-02 | 2005-07-21 | Method of enabling to model virtual objects |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1776659A2 true EP1776659A2 (en) | 2007-04-25 |
Family
ID=35787499
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP05776513A Withdrawn EP1776659A2 (en) | 2004-08-02 | 2005-07-21 | Method of enabling to model virtual objects |
Country Status (6)
Country | Link |
---|---|
US (1) | US20080062169A1 (en) |
EP (1) | EP1776659A2 (en) |
JP (1) | JP2008508630A (en) |
KR (1) | KR20070043993A (en) |
CN (1) | CN101253466A (en) |
WO (1) | WO2006013520A2 (en) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8730156B2 (en) * | 2010-03-05 | 2014-05-20 | Sony Computer Entertainment America Llc | Maintaining multiple views on a shared stable virtual space |
US20090172557A1 (en) * | 2008-01-02 | 2009-07-02 | International Business Machines Corporation | Gui screen sharing between real pcs in the real world and virtual pcs in the virtual world |
US9665197B2 (en) | 2008-01-30 | 2017-05-30 | Nokia Technologies Oy | Apparatus and method for enabling user input |
KR101032632B1 (en) * | 2008-04-01 | 2011-05-06 | 한국표준과학연구원 | Method for providing an user interface and the recording medium thereof |
KR101545736B1 (en) * | 2009-05-04 | 2015-08-19 | 삼성전자주식회사 | 3 apparatus and method for generating three-dimensional content in portable terminal |
KR20100138700A (en) * | 2009-06-25 | 2010-12-31 | 삼성전자주식회사 | Method and apparatus for processing virtual world |
WO2011035723A1 (en) * | 2009-09-23 | 2011-03-31 | Han Dingnan | Method and interface for man-machine interaction |
US20110221684A1 (en) * | 2010-03-11 | 2011-09-15 | Sony Ericsson Mobile Communications Ab | Touch-sensitive input device, mobile device and method for operating a touch-sensitive input device |
US9544543B2 (en) | 2011-02-11 | 2017-01-10 | Tangome, Inc. | Augmenting a video conference |
US8665307B2 (en) | 2011-02-11 | 2014-03-04 | Tangome, Inc. | Augmenting a video conference |
US8508494B2 (en) * | 2011-06-01 | 2013-08-13 | Motorola Mobility Llc | Using pressure differences with a touch-sensitive display screen |
US9724600B2 (en) * | 2011-06-06 | 2017-08-08 | Microsoft Technology Licensing, Llc | Controlling objects in a virtual environment |
US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
JP6021335B2 (en) * | 2011-12-28 | 2016-11-09 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
CN102647458A (en) * | 2012-03-28 | 2012-08-22 | 成都立方体科技有限公司 | Method for displaying various files in a cell phone mobile office system with B (Browser)/S (Server) structure |
JP6107271B2 (en) * | 2013-03-21 | 2017-04-05 | カシオ計算機株式会社 | Information processing apparatus, information processing system, and program |
CN106933397B (en) * | 2015-12-30 | 2020-06-30 | 网易(杭州)网络有限公司 | Virtual object control method and device |
KR20170085836A (en) * | 2016-01-15 | 2017-07-25 | 삼성전자주식회사 | Information input device for use in 3D design, and Method for producing 3D image with the same |
JP6315122B2 (en) * | 2017-03-08 | 2018-04-25 | カシオ計算機株式会社 | Display control apparatus, display control method, and program |
US11488331B2 (en) * | 2020-11-03 | 2022-11-01 | International Business Machines Corporation | Smart interactive simulation-based content on a flexible display device |
Family Cites Families (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2827612B2 (en) * | 1991-10-07 | 1998-11-25 | 富士通株式会社 | A touch panel device and a method for displaying an object on the touch panel device. |
US7345675B1 (en) * | 1991-10-07 | 2008-03-18 | Fujitsu Limited | Apparatus for manipulating an object displayed on a display device by using a touch screen |
US6597347B1 (en) * | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
DE69416960T2 (en) * | 1993-12-07 | 1999-08-19 | Seiko Epson Corp | Touch panel input device and method for generating input signals for an information processing device |
US5534893A (en) * | 1993-12-15 | 1996-07-09 | Apple Computer, Inc. | Method and apparatus for using stylus-tablet input in a computer system |
JPH0817288A (en) * | 1994-07-04 | 1996-01-19 | Matsushita Electric Ind Co Ltd | Transparent touch panel |
US5731819A (en) * | 1995-07-18 | 1998-03-24 | Softimage | Deformation of a graphic object to emphasize effects of motion |
JP3426847B2 (en) * | 1996-05-14 | 2003-07-14 | アルプス電気株式会社 | Coordinate input device |
US6720949B1 (en) * | 1997-08-22 | 2004-04-13 | Timothy R. Pryor | Man machine interfaces and applications |
US7614008B2 (en) * | 2004-07-30 | 2009-11-03 | Apple Inc. | Operation of a computer with touch screen interface |
US6459439B1 (en) * | 1998-03-09 | 2002-10-01 | Macromedia, Inc. | Reshaping of paths without respect to control points |
US6522328B1 (en) * | 1998-04-07 | 2003-02-18 | Adobe Systems Incorporated | Application of a graphical pattern to a path |
US6421048B1 (en) * | 1998-07-17 | 2002-07-16 | Sensable Technologies, Inc. | Systems and methods for interacting with virtual objects in a haptic virtual reality environment |
US6292173B1 (en) * | 1998-09-11 | 2001-09-18 | Stmicroelectronics S.R.L. | Touchpad computer input system and method |
JP2000231627A (en) * | 1998-12-22 | 2000-08-22 | Xerox Corp | Plural modes scanning pen provided with feedback mechanism and input method using the same |
SE513866C2 (en) * | 1999-03-12 | 2000-11-20 | Spectronic Ab | Hand- or pocket-worn electronic device and hand-controlled input device |
US7138983B2 (en) * | 2000-01-31 | 2006-11-21 | Canon Kabushiki Kaisha | Method and apparatus for detecting and interpreting path of designated position |
US6608631B1 (en) * | 2000-05-02 | 2003-08-19 | Pixar Amination Studios | Method, apparatus, and computer program product for geometric warps and deformations |
JP2002032173A (en) * | 2000-07-13 | 2002-01-31 | Jatco Transtechnology Ltd | Information input device |
US6958752B2 (en) * | 2001-01-08 | 2005-10-25 | Sensable Technologies, Inc. | Systems and methods for three-dimensional modeling |
US6819316B2 (en) * | 2001-04-17 | 2004-11-16 | 3M Innovative Properties Company | Flexible capacitive touch sensor |
US6765572B2 (en) * | 2001-04-23 | 2004-07-20 | Koninklijke Philips Electronics N.V. | Virtual modeling by voxel-clipping shadow-cast |
CN1280698C (en) * | 2001-09-24 | 2006-10-18 | 皇家飞利浦电子股份有限公司 | Interactive system and method of interaction |
US7385612B1 (en) * | 2002-05-30 | 2008-06-10 | Adobe Systems Incorporated | Distortion of raster and vector artwork |
JP4500485B2 (en) * | 2002-08-28 | 2010-07-14 | 株式会社日立製作所 | Display device with touch panel |
JP2004133086A (en) * | 2002-10-09 | 2004-04-30 | Seiko Epson Corp | Display apparatus, electronic device and watch |
JP4100195B2 (en) * | 2003-02-26 | 2008-06-11 | ソニー株式会社 | Three-dimensional object display processing apparatus, display processing method, and computer program |
US7519223B2 (en) * | 2004-06-28 | 2009-04-14 | Microsoft Corporation | Recognizing gestures and using gestures for interacting with software applications |
US7653883B2 (en) * | 2004-07-30 | 2010-01-26 | Apple Inc. | Proximity detector in handheld device |
US7538760B2 (en) * | 2006-03-30 | 2009-05-26 | Apple Inc. | Force imaging input device and system |
US20080007532A1 (en) * | 2006-07-05 | 2008-01-10 | E-Lead Electronic Co., Ltd. | Touch-sensitive pad capable of detecting depressing pressure |
2005
- 2005-07-21 CN CNA2005800263472A patent/CN101253466A/en active Pending
- 2005-07-21 US US11/572,927 patent/US20080062169A1/en not_active Abandoned
- 2005-07-21 EP EP05776513A patent/EP1776659A2/en not_active Withdrawn
- 2005-07-21 JP JP2007524434A patent/JP2008508630A/en active Pending
- 2005-07-21 KR KR1020077002512A patent/KR20070043993A/en not_active Application Discontinuation
- 2005-07-21 WO PCT/IB2005/052451 patent/WO2006013520A2/en active Application Filing
Non-Patent Citations (1)
Title |
---|
See references of WO2006013520A2 * |
Also Published As
Publication number | Publication date |
---|---|
CN101253466A (en) | 2008-08-27 |
WO2006013520A3 (en) | 2008-01-17 |
JP2008508630A (en) | 2008-03-21 |
US20080062169A1 (en) | 2008-03-13 |
KR20070043993A (en) | 2007-04-26 |
WO2006013520A2 (en) | 2006-02-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080062169A1 (en) | Method Of Enabling To Model Virtual Objects | |
US11221730B2 (en) | Input device for VR/AR applications | |
Schkolne et al. | Surface drawing: creating organic 3D shapes with the hand and tangible tools | |
US9619106B2 (en) | Methods and apparatus for simultaneous user inputs for three-dimensional animation | |
Weimer et al. | A synthetic visual environment with hand gesturing and voice input | |
Gannon et al. | Tactum: a skin-centric approach to digital design and fabrication | |
Dani et al. | Creation of concept shape designs via a virtual reality interface | |
Sheng et al. | An interface for virtual 3D sculpting via physical proxy. | |
JP6074170B2 (en) | Short range motion tracking system and method | |
US6529210B1 (en) | Indirect object manipulation in a simulation | |
JP2018190443A (en) | Friction modulation for three-dimensional relief in haptic device | |
US8232989B2 (en) | Method and apparatus for enhancing control of an avatar in a three dimensional computer-generated virtual environment | |
US8350843B2 (en) | Virtual hand: a new 3-D haptic interface and system for virtual environments | |
Smith et al. | Digital foam interaction techniques for 3D modeling | |
JP2018142313A (en) | System and method for touch of virtual feeling | |
Oshita et al. | Character motion control interface with hand manipulation inspired by puppet mechanism | |
Kamuro et al. | An ungrounded pen-shaped kinesthetic display: Device construction and applications | |
Marchal et al. | Designing intuitive multi-touch 3d navigation techniques | |
Leal et al. | 3d sketching using interactive fabric for tangible and bimanual input | |
Fikkert et al. | User-evaluated gestures for touchless interactions from a distance | |
Kamuro et al. | 3D Haptic modeling system using ungrounded pen-shaped kinesthetic display | |
Oshita | Multi-touch interface for character motion control using example-based posture synthesis | |
Kwon et al. | Inflated roly-poly | |
Humberston et al. | Hands on: interactive animation of precision manipulation and contact | |
Wesson et al. | Evaluating organic 3D sculpting using natural user interfaces with the Kinect |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL BA HR MK YU |
|
DAX | Request for extension of the european patent (deleted) | ||
RTI1 | Title (correction) |
Free format text: SYSTEM AND METHOD FOR ENABLING THE MODELING OF VIRTUAL OBJECTS |
|
R17D | Deferred search report published (corrected) |
Effective date: 20080117 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/048 20060101AFI20080212BHEP |
|
17P | Request for examination filed |
Effective date: 20080717 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
18W | Application withdrawn |
Effective date: 20100618 |