WO2018075054A1 - Virtual reality input - Google Patents

Virtual reality input

Info

Publication number
WO2018075054A1
WO2018075054A1 (PCT/US2016/058022)
Authority
WO
WIPO (PCT)
Prior art keywords
input
tablet
plane
stylus
user
Prior art date
Application number
PCT/US2016/058022
Other languages
French (fr)
Inventor
Ian N. Robinson
Hiroshi Horii
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to CN201680090258.2A priority Critical patent/CN109863467B/en
Priority to EP16919434.7A priority patent/EP3510474B1/en
Priority to PCT/US2016/058022 priority patent/WO2018075054A1/en
Priority to US16/075,610 priority patent/US10768716B2/en
Publication of WO2018075054A1 publication Critical patent/WO2018075054A1/en

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 - Pens or stylus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Definitions

  • CAD Computer-aided design
  • 3D three-dimensional
  • Fig. 1 is a diagram of an input system according to an example of the principles described herein.
  • Figs. 2A and 2B are user view reference diagrams implemented in a VR headset according to an example of the principles described herein.
  • Fig. 3 is a diagram showing two resulting view reference diagrams resulting from an unfrozen and frozen view, respectively, according to an example of the principles described herein.
  • Figs. 4A and 4B are top view diagrams of a view frustum and tablet frustum before and after a viewpoint motion, respectively, according to an example of the principles described herein.
  • Fig. 5 is a flowchart showing a method of applying a two-dimensional (2D) input into a three-dimensional (3D) space according to one example of the principles described herein.
  • CAD drawing is used to create 3D objects within a 3D space.
  • a user may execute a CAD program on a computing device that receives input from an input device and translates that input onto a two-dimensional (2D) screen.
  • 2D two-dimensional
  • a stylus is used as an input device.
  • the stylus may be moved within a 3D space in order to allow the user to create the 3D object.
  • this method of input, where the user's arm, hand and stylus are all unsupported, has proven to be imprecise compared to conventional, 2D-constrained input methods.
  • Other types of input devices may be used but generally these types of input devices are not intuitive to a user and an amount of practice and/or mental visualization is implemented in order to create the intended 3D object.
  • the present specification therefore describes an input system that may include a stylus, a positionable output device, a tablet to receive input via interaction with the stylus, and a three-dimensional (3D) workspace represented on a graphical user interface (GUI) of the positionable output device communicatively coupled to the tablet wherein two-dimensional (2D) input on the tablet translates to a 3D input on the 3D workspace based on the orientation of an input plane represented in the 3D workspace and wherein interface of the stylus with the tablet freezes a view of a tablet-to-input-plane mapping displayed on the positionable output device.
  • GUI graphical user interface
  • the present specification further describes a method of applying a two-dimensional (2D) input into a three-dimensional (3D) space including receiving input from a first input device at a processor indicating a change in position of an input plane within the 3D space represented on an output device, receiving input from a second input device having a 2D surface at the processor indicating a line to be drawn in the 3D space, representing the received input from the second input device as a 3D image on the 3D space displayed on the output device, and maintaining a current mapping of the input from the second input device to the input plane within the 3D space when a stylus interacts with the second input device and as the output device is moved.
  • the present specification also describes a computer program product for applying a two-dimensional (2D) input into a three-dimensional (3D) space, the computer program product including a computer readable storage medium including computer usable program code embodied therewith, the computer usable program code to, when executed by a processor, receive input from a first input device at a processor indicating a change in position of an input plane within the 3D space represented on an output device, receive input from a second input device having a 2D surface at the processor indicating a line to be drawn in the 3D space, represent the received input from the second input device as a 3D image on the 3D space displayed on the output device, and maintain a current mapping of the input from the second input device to the input plane within the 3D space when a stylus interacts with the second input device and as the output device is moved.
  • the term "frustum" is meant to be understood as any three-dimensional region which is visible on a viewing surface. In some examples, the frustum is called a "view frustum." In an example, the frustum is the 3D shape of a clipped rectangular pyramid.
  • Fig. 1 is a diagram of an input system (100) according to an example of the principles described herein.
  • the input system (100) may include a stylus (105) and tablet (110) and a positionable output device (115) having a three-dimensional (3D) workspace (120) displayed therein.
  • the stylus (105) may be any type of elongated device that a user may hold and touch to a surface such as the tablet (110).
  • the stylus (105) is a passive or capacitive stylus in that the stylus (105) acts just like a human finger when touching a touch sensitive screen, for example. In this example, no electronic communication is present between the passive stylus and a device such as the tablet (110).
  • the stylus (105) is an active stylus (105) in that the stylus (105) includes electronic components that communicate with the touchscreen controller on a device such as the tablet (110).
  • a user may implement the stylus (105) against or near the tablet (110) in order to have input received and presented on the 3D workspace (120) within the positionable output device (115).
  • the positionable output device (115) may be any device that implements a viewing surface to represent a 3D object or image within a 3D workspace (120).
  • the positionable output device (115) is a virtual reality (VR) headset implementing stereoscopic images called stereograms to represent the 3D object or image within the 3D workspace (120).
  • the images shown by the 3D workspace (120) may be still images or video images based on what the user is to display within the VR headset (115).
  • the VR headset (115) may present the 3D workspace (120) and 3D object or image to a user via a number of ocular screens.
  • the ocular screens are placed in an eyeglass or goggle system allowing a user to view both ocular screens simultaneously. This creates the illusion of a 3D workspace (120) and 3D objects using two individual ocular screens.
  • the VR headset (115) may further include a gyroscopic device and an accelerometer.
  • the gyroscope may be used to detect the orientation of the VR headset (115) in 3D space as the VR headset (115) is on the user's head.
  • the accelerometer may be used to detect the speed and change in speed of the VR headset (115) as it travels from one location in 3D space to another location in 3D space.
  • the gyroscope and accelerometer may provide this data to the processor such that movement of the VR headset (115) as it sits on the user's head is translated into a change in the point of view within the 3D workspace (120).
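As a rough illustration of how gyroscope and accelerometer samples might be folded into the rendered viewpoint, the sketch below integrates one sensor sample into an orientation and position. The field names and the naive integration (no drift correction or sensor fusion) are assumptions for illustration only and are not taken from the patent.

```python
import numpy as np

def update_viewpoint(viewpoint, gyro_rad_s, accel_m_s2, dt):
    """Fold one gyroscope/accelerometer sample into the rendered viewpoint.

    `viewpoint` is a dict with 'yaw', 'pitch', 'roll' (radians) and
    'position', 'velocity' (numpy arrays); these names are illustrative.
    """
    gx, gy, gz = gyro_rad_s
    viewpoint["pitch"] += gx * dt        # rotation about the headset x axis
    viewpoint["yaw"]   += gy * dt        # rotation about the headset y axis
    viewpoint["roll"]  += gz * dt        # rotation about the headset z axis

    # Accelerometer: double-integrate to track translation of the headset.
    viewpoint["velocity"] += np.asarray(accel_m_s2, dtype=float) * dt
    viewpoint["position"] += viewpoint["velocity"] * dt
    return viewpoint
```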
  • Although the present specification describes the user implementing a VR headset (115) in order to be presented with a 3D workspace (120), other types of environments may also be used.
  • an AR system may include a visual presentation provided to a user via a computer screen or a headset including a number of screens, among other types of devices to present a visual representation of the 3D workspace (120).
  • a user may provide input to a processor using the stylus (105) and tablet (110). The input is then processed and presented to the user via the VR headset (115). Thus, the user may, in real time, create and see input created. This allows a user to manipulate the 3D workspace (120) and the 3D objects created therein to create or augment the 3D objects.
  • the processor may be a built-in component of the tablet (110) and/or the VR headset (115).
  • the processor may be a component of a computing device separate from the tablet (110) and/or VR headset (115).
  • the computing device may receive the input from the tablet (110) and stylus (105) and cause the processor to relay the processed data to the VR headset (115) in real time.
  • the processor may be implemented in an electronic device.
  • electronic devices include servers, desktop computers, laptop computers, personal digital assistants (PDAs), mobile devices, smartphones, gaming systems, VR headsets (115), and tablets, among other electronic devices.
  • PDAs personal digital assistants
  • the processor may be utilized in any data processing scenario including, stand-alone hardware, mobile applications, through a computing network, or combinations thereof. Further, the processor may be used in a computing network, a public cloud network, a private cloud network, a hybrid cloud network, other forms of networks, or combinations thereof. In one example, the methods provided by the processor are provided as a service over a network by, for example, a third party.
  • the service may comprise, for example, the following: a Software as a Service (SaaS) hosting a number of applications; a Platform as a Service (PaaS) hosting a computing platform comprising, for example, operating systems, hardware, and storage, among others; an Infrastructure as a Service (IaaS) hosting equipment such as, for example, servers, storage components, network, and components, among others; application program interface (API) as a service (APIaaS), other forms of network services, or combinations thereof.
  • SaaS Software as a Service
  • PaaS Platform as a Service
  • IaaS Infrastructure as a Service
  • API application program interface
  • the present input system (100) may be implemented on one or multiple hardware platforms, in which certain modules in the system can be executed on one or across multiple platforms.
  • the input system (100) may further include various hardware components. Among these hardware components may be the processor described above, a number of data storage devices, a number of peripheral device adapters, and a number of network adapters. These hardware components may be interconnected through the use of a number of busses and/or network connections. In one example, the processor, data storage device, peripheral device adapters, and a network adapter may be communicatively coupled via a bus. The processor may include the hardware architecture to retrieve executable code from the data storage device and execute the executable code.
  • the executable code may, when executed by the processor, cause the processor to implement at least the functionality of receiving input and displaying an image or series of images to a user via the VR headset (115) according to the methods of the present specification described herein.
  • the processor may receive input from and provide output to a number of the remaining hardware units.
  • the data storage device may store data such as executable program code that is executed by the processor or another processing device. As will be discussed, the data storage device may specifically store computer code representing a number of applications that the processor executes to implement at least the functionality described herein.
  • the data storage device may include various types of memory modules, including volatile and nonvolatile memory.
  • the data storage device of the present example includes Random Access Memory (RAM), Read Only Memory (ROM), and Hard Disk Drive (HDD) memory.
  • RAM Random Access Memory
  • ROM Read Only Memory
  • HDD Hard Disk Drive
  • Many other types of memory may also be utilized, and the present specification contemplates the use of many varying type(s) of memory in the data storage device as may suit a particular application of the principles described herein.
  • different types of memory in the data storage device may be used for different data storage needs.
  • the processor may boot from Read Only Memory (ROM), maintain nonvolatile storage in the Hard Disk Drive (HDD) memory, and execute program code stored in Random Access Memory (RAM).
  • ROM Read Only Memory
  • HDD Hard Disk Drive
  • the data storage device may comprise a computer readable medium, a computer readable storage medium, or a non-transitory computer readable medium, among others.
  • the data storage device may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may include, for example, the following: an electrical connection having a number of wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store computer usable program code for use by or in connection with an instruction execution system, apparatus, or device. In another example, a computer readable storage medium may be any non-transitory medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the hardware adapters enable the processor to interface with various other hardware elements.
  • the peripheral device adapters may provide an interface to input/output devices, such as, for example, a display device, the VR headset (115), a mouse, or a keyboard.
  • the peripheral device adapters may also provide access to other external devices such as an external storage device, a number of network devices such as, for example, servers, switches, and routers, client devices, other types of computing devices, and combinations thereof.
  • the VR headset (115) may be replaced with a display screen associated with a desktop or laptop computing device.
  • operation of the input system (100) would be similar to the input system (100) implementing the stylus (105) and tablet (110) described above.
  • the boundaries of the tablet frustum (210) and the boundaries of the user view frustum (205) would be equal. In an example, the boundaries of the tablet frustum (210) and the boundaries of the user visible frustum (205) are not equal, with the boundaries of either the tablet frustum (210) or user visible frustum (205) being relatively larger than the other.
  • the VR headset (115) may be replaced with a touch screen display associated with a desktop or laptop computing device. In this example, operation would be similar to the above except that the boundaries of the tablet frustum (210) and the boundaries of the user visible frustum (205) are the same.
  • the input system (100) may allow for the fact that the depth of the stylus (105) in the 3D workspace (120) is visible and fixed at the surface of the touch screen. Consequently, if a line is drawn on the surface of the touchscreen sloping into the 3D workspace, the view will follow, zooming in to the workspace to follow the stylus (105) as it draws.
  • Figs. 2A and 2B are user view reference diagrams (200) implemented in a VR headset (115) according to an example of the principles described herein.
  • the view presented in the VR headset (115) contains a number of frustums (205, 210) and an input plane (215).
  • the frustums (205, 210) include a user view frustum (205) and a tablet frustum (210).
  • the user view frustum (205) is all of the 3D space or 3D workspace (120) visible to a user implementing the VR headset (115) described above, for a particular location and orientation of that VR headset (115).
  • This volume of virtual space may include a number of widgets that the user can address by implementing an input device such as a mouse or the stylus (105) and tablet (110).
  • the widgets may include other 3D object editing commands including any number of drawing tools such as palettes, fills, line drawing tools, object forming tools, cropping tools, and cutting tools, among others.
  • the tablet frustum (210) may be included within the user view frustum (205) at all times.
  • the tablet frustum (210) corresponds to a mapped area on the 2D surface of the tablet (110). In an example, the user visible frustum (205) and the tablet frustum (210) share the same viewpoint (220).
  • movement of the VR headset (115) while the user is viewing the 3D workspace (120) changes the view within the 3D workspace (120).
  • when the user turns his or her head to the left, for example, the ocular screens of the VR headset (115) display a view that is to the left of the previous view.
  • a view of a 3D object within the 3D workspace (120) may go "off screen" when a user looks to the left such that, in order to view the 3D object again, the user could position his or her head in its original position. The same would apply if the user were to look in any other direction: right, up, down, etc.
  • the 3D object in the 3D workspace (120) appears to remain in a specific location within the 3D workspace (120) and it is the user that is looking away.
  • a user may use the described input system (100) to draw any number of 3D objects within the 3D workspace (120).
  • an input plane (215) may be visible.
  • the input plane (215) may be freely translated and rotated within the 3D workspace (120) as described above using various user widgets.
  • the boundaries of the input plane (215) may extend at least up to the sides of the tablet frustum (210). In an example, the input plane (215) does not extend beyond the near (225) or far (230) planes of the user view frustum (205) in the case of extreme rotations of the input plane (215) within the 3D workspace (120). In an example, the input plane (215) does extend beyond the near (225) or far (230) planes of the user view frustum (205) within the 3D workspace (120).
  • the processor prevents the input plane (215) from being rotated or moved into a position where it appears edge-on in the image presented in the VR headset (115). In this example, a user viewing an edge-on view of the input plane (215) would see only a line representing the edge of the input plane (215), and it may be difficult to draw on the input plane in such a configuration.
  • a user may engage the tablet (110) with the stylus (105), thus creating a point (235) on the input plane (215).
  • the point (235) lies where the line projected from the viewpoint through the pen input location intersects the input plane (215) within the tablet frustum (210).
  • Fig. 2B shows a user drawing a line (240).
  • a digital representation of the line (240) is then produced on the input plane (215) within the user view frustum (205) of the 3D workspace (120).
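A minimal sketch of the geometry implied by Figs. 2A and 2B follows: the pen position on the tablet is mapped across the tablet frustum, and the mark appears where the ray from the viewpoint through that frustum location meets the input plane. The function and argument names are assumptions for illustration, not taken from the patent, and the edge-on guard reflects the behaviour described above where drawing on a nearly edge-on plane is avoided.

```python
import numpy as np

def tablet_to_input_plane(u, v, viewpoint, frustum_corners, plane_point, plane_normal):
    """Map a normalized tablet position (u, v in [0, 1]) to a 3D point on the input plane.

    `frustum_corners` is a tuple (top_left, top_right, bottom_left, bottom_right)
    of 3D points spanning the tablet frustum's far rectangle.
    """
    top_left, top_right, bottom_left, bottom_right = (np.asarray(c, float) for c in frustum_corners)
    viewpoint = np.asarray(viewpoint, float)
    plane_point, plane_normal = np.asarray(plane_point, float), np.asarray(plane_normal, float)

    # Bilinearly interpolate the pen position across the frustum's far rectangle.
    top = (1 - u) * top_left + u * top_right
    bottom = (1 - u) * bottom_left + u * bottom_right
    target = (1 - v) * top + v * bottom          # v = 0 maps to the top edge

    # Intersect the ray (viewpoint -> target) with the input plane.
    direction = target - viewpoint
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-6:
        # Plane is (nearly) edge-on to the ray; no usable intersection.
        return None
    t = np.dot(plane_normal, plane_point - viewpoint) / denom
    return viewpoint + t * direction
```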
  • the input plane (215) may be repositioned as desired to enable the user to draw a line in a different location in the 3D workspace.
  • moving the input plane (215) may also move all the lines associated with that input plane (215).
  • any line drawn by the user can be selected such that the input plane (215) onto which the line was drawn can be recalled, thereby displaying that corresponding input plane (215) again. This would enable further lines to be added on that input plane (215), or allow all the lines associated with that plane to be moved together by moving the input plane.
  • various other types of input from these devices can result in various other types of markings being mapped onto the input plane (215). Examples of these other types of markings may include dots, 2D shapes, filled 2D shapes, 3D shapes, filled 3D shapes, clip art, and curves, among others. Consequently, the term "line" is not meant to be limiting and instead is meant only as an example of a marking that could result from the input received from the tablet (110) and stylus (105).
  • the input plane (215) may be moved or rotated within the 3D workspace (120). As the user draws on the input plane (215), a 3D object is created based on the plane within the 3D workspace (120) that the input plane (215) occupies. As described above, in an example the user can position the input plane (215) using a number of widgets displayed next to the input plane (215).
  • the user can position the input plane (215) by placing the stylus (105) in a dedicated manipulation mode. In this example, the stylus (105) may include a button or other activation device to switch from a "drawing" state to an "input plane (215) manipulation" state. In an example, the input plane (215) may be positioned using a six degree-of-freedom (DOF) input device.
  • the six DOF input device may be controlled by the user using the user's non-drawing hand or the hand that is not holding the stylus (105).
  • the input plane (215) may be "snapped" to predetermined distance and angle increments.
  • a user may input coordinates of the input plane (215) using the above-mentioned widgets. In this example, certain inputs from the user to adjust the location of the input plane (215) may not be accepted until the user provides further instructions indicating that the input plane (215) may be "unsnapped" from the particular position.
  • the input plane (215) may be "snapped" to a preexisting location in the 3D workspace (120).
  • the preexisting location in the 3D workspace (120) may include an end of a previously drawn line.
  • a user may specifically position the input plane (215) within the 3D workspace (120) by snapping one axis of the input plane (215) between two existing line end points and then freely rotating the input plane (215) about that axis created in order to achieve a desired orientation, potentially snapping the plane to intersect a third point.
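A minimal sketch of the snapping behaviour described above follows. The grid spacing, angle step, and function names are illustrative assumptions; the second helper shows the axis-through-two-endpoints case about which the plane can then be rotated freely.

```python
import numpy as np

def snap_plane_pose(position, angles_deg, grid=0.05, angle_step=15.0):
    """Snap an input-plane pose to predetermined distance and angle increments.

    The 5 cm grid and 15-degree step are example values, not taken from the patent.
    """
    snapped_pos = np.round(np.asarray(position, float) / grid) * grid
    snapped_ang = np.round(np.asarray(angles_deg, float) / angle_step) * angle_step
    return snapped_pos, snapped_ang

def snap_axis_to_endpoints(p0, p1):
    """Return an origin and unit axis through two existing line end points.

    The plane can then be rotated about this axis until it optionally
    intersects a third point, as described above.
    """
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    axis = p1 - p0
    return p0, axis / np.linalg.norm(axis)
```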
  • the input plane (215) itself may also be manipulated. In an example, a widget displayed within the 3D workspace (120) may be actuated by the user to curve the input plane (215).
  • the curvature of the input plane (215) along its axes could be manipulated using the widget.
  • the curvature of the input plane may be manipulated in any way by the user allowing the user to both curve the input plane (215) as well as add edges and/or corners into the input plane (215).
  • lines or other markings drawn on the input plane (215) may be used to create a new 3D input plane (215) having a user-defined shape. In an example, a curved line drawn on the input plane (215) could be extruded perpendicular to that input plane (215) to create an additional input plane (215) that the user may also implement.
  • an additional widget may be used to toggle between the original input plane (215) and the additional input plane (215) while the user is drawing the 3D object in the 3D workspace (120).
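A minimal sketch of the extrusion step mentioned above, in which a curve drawn on the input plane is swept along the plane normal to form an additional input surface. The function name, `depth`, and `steps` parameters are illustrative assumptions.

```python
import numpy as np

def extrude_curve_to_surface(curve_points, plane_normal, depth=0.2, steps=10):
    """Extrude a curve drawn on the input plane along the plane normal.

    Returns an array of shape (steps, num_points, 3): each row is the original
    curve shifted along the normal, together describing the new input surface.
    """
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    curve = np.asarray(curve_points, float)          # shape (num_points, 3)
    offsets = np.linspace(0.0, depth, steps)         # distances along the normal
    return np.array([curve + d * n for d in offsets])
```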
  • any number of input planes (215) may be created. In order to prevent a view of the 3D object from being obstructed due to an abundance of created input planes (215), a previously created input plane (215) may be optically removed, faded out, or shadowed as an additional input plane (215) is created. Again, as a drawn line is selected, the input plane (215) associated with that line may be shown while the other input planes (215) are removed from view or faded out or shadowed.
  • because the input system (100), in an example, implements a VR headset (115), changes in the position of the user's head change the location of the tablet frustum (210). Consequently, while the user is drawing a curve, for example, the current view within the 3D workspace (120) is altered and the position of the input plane (215) in relation to the tablet frustum (210) may be changed. In an example, motion of the tablet frustum (210) relative to the 3D workspace (120) may be frozen by the user. In an example, the freezing of the tablet frustum (210) may be accomplished as the stylus (105) engages the surface of the tablet (110).
  • the freezing of the tablet frustum (210) may be accomplished as the stylus (105) is within a threshold "hover" distance above the tablet (110). Motion or adjustment of the tablet frustum (210) may be re-enabled when the stylus (105) is lifted off of the surface of the tablet (110), when the threshold distance between the tablet (110) and stylus (105) is exceeded, or explicitly by a user activating a widget that re-enables movement of the tablet frustum (210).
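The trigger conditions just described can be summarised in a small state sketch: contact or a close hover freezes the tablet-frustum mapping, and lifting past the threshold or an explicit widget re-enables motion. The class name, method, and 10 mm threshold are assumptions for illustration only.

```python
class FrustumFreezeController:
    """Sketch of when the tablet-frustum-to-input-plane mapping is frozen."""

    def __init__(self, hover_threshold_mm=10.0):
        self.hover_threshold_mm = hover_threshold_mm
        self.frozen = False

    def update(self, stylus_height_mm, stylus_touching, unfreeze_widget_pressed=False):
        if stylus_touching or stylus_height_mm <= self.hover_threshold_mm:
            # Stylus on, or hovering near, the tablet: lock the current mapping.
            self.frozen = True
        if unfreeze_widget_pressed or (
            not stylus_touching and stylus_height_mm > self.hover_threshold_mm
        ):
            # Stylus lifted past the threshold, or motion explicitly re-enabled.
            self.frozen = False
        return self.frozen
```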
  • the input system (100) implements an augmented reality (AR) headset.
  • AR augmented reality
  • the user is provided with aspects of the real world along with a visual representation of a 3D object being drawn in the AR environment.
  • the user may draw and adjust the 3D object being formed similarly as described above. In an example, a user implementing the AR headset, however, may interact with real surfaces in the AR environment.
  • a user implementing the above described tablet (110) and stylus (105) may draw a 3D object onto a real world surface that has been mapped by, for example, a laser mapping system associated with the input system (100).
  • This mapping of the visible real world surfaces allows a user to virtually draw on the surface of real world surfaces such as walls, ceilings, and floors. Additionally, because all real world surfaces may be mapped this way, the input plane may be these real world surfaces.
  • the input plane (215) may include not only flat surfaces but curved surfaces as well, such as a globe, a pipe, a cone, a square, among any other curved or multiple-surface object. In this example, a user may virtually add objects onto surfaces of any real world object, add color to the surface of any real world object, and incorporate virtual 3D objects into or on real world surfaces and objects, among other actions.
  • any real world surface may be mapped as described above, and extrapolated into a virtual 3D object.
  • This allows a user to "copy” the real world surface and place that copy within the AR environment.
  • This allows the user to manipulate the 3D object as described above and define that newly copied surface as the input plane (215) itself.
  • a number of widgets provided in the AR environment similar to that described in connection with the VR environment may be provided to the user to execute the "copy,” "move,” and input plane (215) designation actions as described herein.
  • Fig. 3 is a diagram showing two resulting view reference diagrams resulting from an unfrozen (305) and frozen view (310), respectively, according to an example of the principles described herein.
  • An originating view reference (300) shows the user view frustum (205) and tablet frustum (210) as well as the input plane (215) described above.
  • a user is currently engaged in drawing an object on the surface of the input plane (215) via the tablet (110) and stylus (105) as described above.
  • the stylus (105) shown on the input plane (215) is by reference only and is not to be understood as the input plane (215) comprising a visual representation of the stylus (105).
  • the unfrozen view (305) is the result. In the unfrozen view (305), a user currently drawing may mistakenly draw in a location on the input plane (215) that was not expected.
  • the unfrozen view (305) shows a situation where a currently drawn object (315) includes an unintentional mark (320).
  • the unintentional mark (320) is the result of the motion of the tablet frustum (210) relative to the 3D workspace (120) being changed due to a user of the VR headset (115) turning his or her head; in this example, to the right. Because the user is not able to maintain the tablet frustum (210) in the same location, the unintentional mark (320) is drawn. However, the user may activate a frozen state such that a frozen view (310) is maintained. In an example, for small movements of the user's head and the VR headset (115), the tablet frustum (210) remains fixed relative to the input plane (215) while the view frustum (205) moves with the user's head.
  • movements of the user's head and the VR headset (115) cause the tablet frustum (210) to pivot around the user's viewpoint independent of the user view frustum (205) so that the current x and y coordinate location of the marking produced by the stylus (105) does not change.
  • a user may, according to one example, engage the frozen view (310) by applying the stylus (105) to the surface of the tablet (110).
  • a user may engage the frozen view (310) by passing an end of the stylus (105) past a threshold distance from the surface of the tablet (110).
  • a user may engage the frozen view (310) by pushing a button on the stylus (105).
  • a user may engage the frozen view (310) by actuating a widget placed within the 3D workspace (120).
  • Figs. 4A and 4B are top view diagrams of a view frustum (205) and tablet frustum (210) before and after a viewpoint motion, respectively, according to an example of the principles described herein.
  • the input plane (215) stays fixed in the 3D workspace (120) as the user moves their viewpoint.
  • the mapping of the current (x, y) position of the stylus (105) on the tablet (110) to the input plane (215) is locked as described above.
  • a currently drawn object (315) will appear where the top right edge of the tablet frustum (210) intersects the input plane (215). If the stylus (105) is held still on the tablet (110) and the user shifts and/or turns his or her head with the VR headset (115) so the currently drawn object (315) is in the center of the user's view frustum (205), the location on the input plane (215) of the stylus (105) doesn't change. Additionally, the currently drawn object (315) should stay a currently drawn object (315) without additional markings on the input plane (215) being made.
  • Fig. 4B shows the tablet frustum (210) and view frustum (205) both pointing back towards the same viewpoint, but now extending off in different directions so that the intersection of the pen location within the tablet frustum (210) with the input plane (215), the point marked by the currently drawn object (315), stays fixed.
  • the stylus (105) is not moved as the user changes their viewpoint, but this is not a requirement.
  • the tablet frustum (210) is recalculated as described above, and then any changes in the stylus (105) position on the tablet (110) are translated into a stroke input on the input plane (215) based on this new mapping.
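A simplified sketch of that recalculation follows: after the viewpoint moves, the tablet frustum is re-aimed from the new viewpoint toward the input-plane point that was anchored under the stylus, so the mark stays fixed (Fig. 4B), and subsequent stylus motion is then mapped through the recalculated frustum. For simplicity the sketch aims the frustum's centre ray at the anchor, which is exact when the stylus sits at the tablet centre and approximate otherwise; the function names and the `build_frustum` callback are assumptions, not the patent's terminology.

```python
import numpy as np

def refreeze_tablet_frustum(anchor_on_plane, new_viewpoint, build_frustum):
    """Re-aim the tablet frustum after a viewpoint move so the pen mark stays put.

    `anchor_on_plane` is the input-plane point under the stylus when the view was
    frozen; `build_frustum(viewpoint, forward)` is assumed to return the frustum's
    far-rectangle corners for a given aim direction.  Later stylus deltas can then
    be mapped through this frustum (e.g. with `tablet_to_input_plane` above).
    """
    anchor = np.asarray(anchor_on_plane, float)
    viewpoint = np.asarray(new_viewpoint, float)
    forward = anchor - viewpoint
    forward /= np.linalg.norm(forward)
    # Aim the tablet frustum from the new viewpoint toward the anchored point,
    # independent of where the user view frustum now points.
    return build_frustum(viewpoint, forward)
```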
  • a line or other indicator may be placed within the 3D workspace (120) indicating the boundaries of the tablet frustum (210).
  • the indicator indicating the boundaries of the tablet frustum (210) may be visible to the user at all times. In another example, the indicator indicating the boundaries of the tablet frustum (210) may become visible when the user's stylus (105) approaches the boundary of the tablet frustum (210).
  • a user may draw a 3D object within the 3D workspace (120) without inadvertently making an unintended mark.
  • a user may not be able to keep his or her head completely still. Instead, there may be inadvertent small shifts in the user's head position, with the impact of these shifts being magnified significantly depending on the distance between the viewpoint and the input point in the 3D workspace (120).
  • the mapping may be frozen as described above for at least the duration of the stylus (105) stroke.
  • Fig. 5 is a flowchart showing a method (500) of applying a two-dimensional (2D) input into a three-dimensional (3D) space according to one example of the principles described herein.
  • the method (500) may begin with receiving (505) input from a first input device at a processor indicating a change in position of an input plane (215) within the 3D space represented on an output device. In an example, the first input device is a stylus (105) and tablet (110). In this example, the stylus (105) may be used to adjust the input plane (215) as described above.
  • the first input device is a mouse. In this example, a user may implement a stylus (105) and tablet (110) along with the mouse to both draw a 3D image in the 3D workspace (120) and adjust the input plane (215).
  • the method (500) may continue with receiving (510) input from a second input device having a 2D surface at the processor indicating a line to be drawn in the 3D space. In an example, the second input device is a stylus (105) and tablet (110). As described above, the stylus (105) and tablet (110) receive (510) input and pass the input on to a processor associated with the input system (100).
  • the method (500) may continue with representing (515) the received input from the second input device as a 3D image within the 3D space displayed on a user interface.
  • the processor converts the input data presented by the stylus (105) and tablet (110) into image data and presents the image data to a user via, for example, the VR headset (115) or AR headset described above.
  • the processor maintains (520) a current mapping of the input from the second input device to the input plane within the VR headset when a stylus interacts with the second input device and as the VR headset is moved.
  • maintaining (520) the current mapping of the input from, for example, the tablet to the input plane allows a user of the VR headset (115) to adjust the position of his or her head while drawing a 3D object in the 3D workspace (120). This prevents unintended and errant drawing strokes by the user.
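A minimal sketch of method (500) as it is described above, with blocks 505 through 520 marked in the comments. The `workspace`, `headset`, and input objects and all method names are illustrative stand-ins, not elements of the patent.

```python
def apply_2d_input_to_3d_space(plane_input, tablet_input, workspace, headset):
    """Illustrative pass through the four blocks of method (500)."""
    # (505) Receive input from the first input device: reposition the input plane.
    workspace.input_plane.apply_pose_change(plane_input)

    # (510) Receive 2D input from the second input device (stylus on tablet).
    stroke_2d = tablet_input.read_stroke()

    # (515) Represent that input as a 3D image by projecting it onto the input plane.
    stroke_3d = [workspace.map_tablet_point_to_plane(u, v) for (u, v) in stroke_2d]
    headset.display(workspace.render_with(stroke_3d))

    # (520) While the stylus interacts with the tablet, keep the current
    # tablet-to-input-plane mapping even if the headset moves.
    if tablet_input.stylus_engaged():
        workspace.freeze_tablet_mapping()
    else:
        workspace.unfreeze_tablet_mapping()
```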
  • Aspects of the present system and method are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to examples of the principles described herein.
  • Each block of the flowchart illustrations and block diagrams, and combinations of blocks in the flowchart illustrations and block diagrams, may be implemented by computer usable program code.
  • the computer usable program code may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the computer usable program code, when executed via, for example, the processor or other programmable data processing apparatus, implements the functions or acts specified in the flowchart and/or block diagram block or blocks.
  • the computer usable program code may be embodied within a computer readable storage medium; the computer readable storage medium being part of the computer program product. In one example, the computer readable storage medium is a non-transitory computer readable medium.
  • the specification and figures describe an input system implementing a virtual reality headset to provide a user with a relatively intuitive way of creating 3D images within a 3D workspace.
  • the input device and the method described herein allow for a user to adjust an input plane such that a 2D input on a tablet is translated into a 3D image within the 3D workspace.
  • a user may use the VR headset while still drawing on the tablet as a result of a freeze feature.
  • the freeze feature freezes the mapping of the tablet to the input plane when a stylus contacts or breaches a threshold distance from a tablet. The result provides for a user to input strokes on the input plane without changes in the user's head position causing unintentional markings when the user turns his or her head.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An input system may include a stylus, a positionable output device, a tablet to receive input via interaction with the stylus, and a three-dimensional (3D) workspace represented on a graphical user interface (GUI) of the positionable output device communicatively coupled to the tablet wherein two-dimensional (2D) input on the tablet translates to a 3D input on the 3D workspace based on the orientation of an input plane represented in the 3D workspace and wherein interface of the stylus with the tablet freezes a view of a tablet-to-input-plane mapping displayed on the positionable output device.

Description

VIRTUAL REALITY INPUT
BACKGROUND
[0001] Computer-aided design (CAD) allows a user to create objects in three-dimensional (3D) space. These objects may be used to create physical 3D objects or better visualize those objects.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] The accompanying drawings illustrate various examples of the principles described herein and are a part of the specification. The illustrated examples are given merely for illustration, and do not limit the scope of the claims.
[0003] Fig. 1 is a diagram of an input system according to an example of the principles described herein.
[0004] Figs. 2A and 2B are user view reference diagrams
implemented in a VR headset according to an example of the principles described herein.
[0005] Fig. 3 is a diagram showing two resulting view reference diagrams resulting from an unfrozen and frozen view, respectively, according to an example of the principles described herein.
[0006] Figs. 4A and 4B are top view diagrams of a view frustum and tablet frustum as before and after a viewpoint motion respectively according to an example of the principles described herein. [0007] Fig. 5 is a flowchart showing a method of applying a two- dimensional (2D) input into a three-dimensional (3D) space according to one example of the principles described herein.
[0008] Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
DETAILED DESCRIPTION
[0009] As described above, CAD drawing is used to create 3D objects within a 3D space. In some instances, a user may execute a CAD program on a computing device that receives input from an input device and translates that input onto a two-dimensional (2D) screen. However, realizing a 3D object within a 3D workspace displayed on a 2D screen is not intuitive.
[0010] In some examples, a stylus is used as an input device. In this example, the stylus may be moved within a 3D space in order to allow the user to create the 3D object. However, this method of input, where the user's arm, hand and stylus are all unsupported, has proven to be imprecise compared to conventional, 2D-constrained input methods. Other types of input devices may be used but generally these types of input devices are not intuitive to a user and an amount of practice and/or mental visualization is implemented in order to create the intended 3D object.
[0011] The present specification therefore describes an input system that may include a stylus, a positionable output device, a tablet to receive input via interaction with the stylus, and a three-dimensional (3D) workspace represented on a graphical user interface (GUI) of the positionable output device communicatively coupled to the tablet wherein two-dimensional (2D) input on the tablet translates to a 3D input on the 3D workspace based on the orientation of an input plane represented in the 3D workspace and wherein interface of the stylus with the tablet freezes a view of a tablet-to-input-plane mapping displayed on the positionable output device.
[0012] The present specification further describes a method of applying a two-dimensional (2D) input into a three-dimensional (3D) space including receiving input from a first input device at a processor indicating a change in position of an input plane within the 3D space represented on an output device, receiving input from a second input device having a 2D surface at the processor indicating a line to be drawn in the 3D space, representing the received input from the second input device as a 3D image on the 3D space displayed on the output device, and maintaining a current mapping of the input from the second input device to the input plane within the 3D space when a stylus interacts with the second input device and as the output device is moved.
[0013] The present specification also describes a computer program product for applying a two-dimensional (2D) input into a three-dimensional (3D) space, the computer program product including a computer readable storage medium including computer usable program code embodied therewith, the computer usable program code to, when executed by a processor, receive input from a first input device at a processor indicating a change in position of an input plane within the 3D space represented on an output device, receive input from a second input device having a 2D surface at the processor indicating a line to be drawn in the 3D space, represent the received input from the second input device as a 3D image on the 3D space displayed on the output device, and maintain a current mapping of the input from the second input device to the input plane within the 3D space when a stylus interacts with the second input device and as the output device is moved.
[0014] As used in the present specification and in the appended claims, the term "frustum" is meant to be understood as any three-dimensional region which is visible on a viewing surface. In some examples, the frustum is called a "view frustum." In an example, the frustum is the 3D shape of a clipped rectangular pyramid.
[0015] Additionally, as used in the present specification and in the appended claims, the term "a number of" or similar language is meant to be understood broadly as any positive number comprising 1 to infinity.
[0016] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough
understanding of the present systems and methods. It will be apparent, however, to one skilled in the art that the present apparatus, systems and methods may be practiced without these specific details. Reference in the specification to "an example" or similar language means that a particular feature, structure, or characteristic described in connection with that example is included as described, but may not be included in other examples.
[0017] Turning now to the figures, Fig. 1 is a diagram of an input system (100) according to an example of the principles described herein. The input system (100) may include a stylus (105) and tablet (110) and a positionable output device (115) having a three-dimensional (3D) workspace (120) displayed therein. Each of these will now be described in more detail.
[0018] The stylus (105) may be any type of elongated device that a user may hold and touch to a surface such as the tablet (110). In an example, the stylus (105) is a passive or capacitive stylus in that the stylus (105) acts just like a human finger when touching a touch sensitive screen, for example. In this example, no electronic communication is present between the passive stylus and a device such as the tablet (110). In an example, the stylus (105) is an active stylus (105) in that the stylus (105) includes electronic components that communicate with the touchscreen controller on a device such as the tablet (110). During use, a user may implement the stylus (105) against or near the tablet (110) in order to have input received and presented on the 3D workspace (120) within the positionable output device (115).
[0019] The positionable output device (115) may be any device that implements a viewing surface to represent a 3D object or image within a 3D workspace (120). In an example and for convenience of description herein, the positionable output device (115) is a virtual reality (VR) headset implementing stereoscopic images called stereograms to represent the 3D object or image within the 3D workspace (120). The images shown by the 3D workspace (120) may be still images or video images based on what the user is to display within the VR headset (115). The VR headset (115) may present the 3D workspace (120) and 3D object or image to a user via a number of ocular screens. In an example, the ocular screens are placed in an eyeglass or goggle system allowing a user to view both ocular screens simultaneously. This creates the illusion of a 3D workspace (120) and 3D objects using two individual ocular screens.
[0020] The VR headset (115) may further include a gyroscopic device and an accelerometer. The gyroscope may be used to detect the orientation of the VR headset (115) in 3D space as the VR headset (115) is on the user's head. The accelerometer may be used to detect the speed and change in speed of the VR headset (115) as it travels from one location in 3D space to another location in 3D space. The gyroscope and accelerometer may provide to the processor this data such that movement of the VR headset (115) as it sits on the user's head is translated into a change in the point of view within the 3D workspace (120).
[0021] Although the present specification describes the user implementing a VR headset (115) in order to be presented with a 3D workspace (120), other types of environments may also be used. In an example, an augmented reality (AR) environment may be used where aspects of the real world are viewable in a visual representation while a 3D object is being drawn within the AR environment. Thus, much like the VR system described herein, an AR system may include a visual presentation provided to a user via a computer screen or a headset including a number of screens, among other types of devices to present a visual representation of the 3D workspace (120). Thus the present description contemplates the use of not only a VR
environment but an AR environment as well.
[0022] During operation, a user may provide input to a processor using the stylus (105) and tablet (110). The input is then processed and presented to the user via the VR headset (115). Thus, the user may, in real time, create and see input created. This allows a user to manipulate the 3D workspace (120) and the 3D objects created therein to create or augment the 3D objects. In an example, the processor may be a built-in component of the tablet (110) and/or the VR headset (115). In an example, the processor may be a component of a computing device separate from the tablet (110) and/or VR headset (115). In this example, the computing device may receive the input from the tablet (110) and stylus (105) and cause the processor to relay the processed data to the VR headset (115) in real time.
[0023] The processor may be implemented in an electronic device. Examples of electronic devices include servers, desktop computers, laptop computers, personal digital assistants (PDAs), mobile devices, smartphones, gaming systems, VR headsets (115), and tablets, among other electronic devices.
[0024] The processor may be utilized in any data processing scenario including stand-alone hardware, mobile applications, through a computing network, or combinations thereof. Further, the processor may be used in a computing network, a public cloud network, a private cloud network, a hybrid cloud network, other forms of networks, or combinations thereof. In one example, the methods provided by the processor are provided as a service over a network by, for example, a third party. In this example, the service may comprise, for example, the following: a Software as a Service (SaaS) hosting a number of applications; a Platform as a Service (PaaS) hosting a computing platform comprising, for example, operating systems, hardware, and storage, among others; an Infrastructure as a Service (IaaS) hosting equipment such as, for example, servers, storage components, network, and components, among others; application program interface (API) as a service (APIaaS), other forms of network services, or combinations thereof. The present input system (100) may be implemented on one or multiple hardware platforms, in which certain modules in the system can be executed on one or across multiple platforms.
[0025] The input system (100) may further include various hardware components. Among these hardware components may be the processor described above, a number of data storage devices, a number of peripheral device adapters, and a number of network adapters. These hardware components may be interconnected through the use of a number of busses and/or network connections. In one example, the processor, data storage device, peripheral device adapters, and a network adapter may be communicatively coupled via a bus.
[0026] The processor may include the hardware architecture to retrieve executable code from the data storage device and execute the executable code. The executable code may, when executed by the processor, cause the processor to implement at least the functionality of receiving input and displaying an image or series of images to a user via the VR headset (115) according to the methods of the present specification described herein. In the course of executing code, the processor may receive input from and provide output to a number of the remaining hardware units.
[0027] The data storage device may store data such as executable program code that is executed by the processor or another processing device. As will be discussed, the data storage device may specifically store computer code representing a number of applications that the processor executes to implement at least the functionality described herein.
[0028] The data storage device may include various types of memory modules, including volatile and nonvolatile memory. For example, the data storage device of the present example includes Random Access Memory (RAM), Read Only Memory (ROM), and Hard Disk Drive (HDD) memory. Many other types of memory may also be utilized, and the present specification contemplates the use of many varying type(s) of memory in the data storage device as may suit a particular application of the principles described herein. In certain examples, different types of memory in the data storage device may be used for different data storage needs. For example, in certain examples the processor may boot from Read Only Memory (ROM), maintain nonvolatile storage in the Hard Disk Drive (HDD) memory, and execute program code stored in Random Access Memory (RAM).
[0029] Generally, the data storage device may comprise a computer readable medium, a computer readable storage medium, or a non-transitory computer readable medium, among others. For example, the data storage device may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the computer readable storage medium may include, for example, the following: an electrical connection having a number of wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store computer usable program code for use by or in connection with an instruction execution system, apparatus, or device. In another example, a computer readable storage medium may be any non-transitory medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0030] The hardware adapters enable the processor to interface with various other hardware elements. For example, the peripheral device adapters may provide an interface to input/output devices such as, for example, a display device, the VR headset (115), a mouse, or a keyboard. The peripheral device adapters may also provide access to other external devices such as an external storage device, a number of network devices such as, for example, servers, switches, and routers, client devices, other types of computing devices, and combinations thereof.
[0031] In an example, the VR headset (115) may be replaced with a display screen associated with a desktop or laptop computing device. In this example, operation of the input system (100) would be similar to the input system (100) implementing the stylus (105) and tablet (110) described above. In an example, the boundaries of the tablet frustum (210) and the boundaries of the user view frustum (205) would be equal. In an example, the boundaries of the tablet frustum (210) and the boundaries of the user view frustum (205) are not equal, with either the tablet frustum (210) or the user view frustum (205) being relatively larger than the other.
[0032] In an example, the VR headset (115) may be replaced with a touch screen display associated with a desktop or laptop computing device. In this example, operation would be similar to the above except that the boundaries of the tablet frustum (210) and the boundaries of the user view frustum (205) are the same. In an example of the display being a stereo 3D display, the input system (100) may allow for the fact that the depth of the stylus (105) in the 3D workspace (120) is visible and fixed at the surface of the touch screen. Consequently, if a line is drawn on the surface of the touch screen sloping into the 3D workspace, the view will follow, zooming in to the workspace to follow the stylus (105) as it draws.
[0033] Figs. 2A and 2B are user view reference diagrams (200) implemented in a VR headset (115) according to an example of the principles described herein. The view presented in the VR headset (115) contains a number of frustums (205, 210) and an input plane (215). The frustums (205, 210) include a user view frustum (205) and a tablet frustum (210). The user view frustum (205) is all of the 3D space or 3D workspace (120) visible to a user implementing the VR headset (115) described above, for a particular location and orientation of that VR headset (115). This volume of virtual space may include a number of widgets that the user can address by implementing an input device such as a mouse or the stylus (105) and tablet (110). By implementing the widgets, a user may adjust the position of the input plane (215) and/or the space within the user view frustum (205) that the tablet frustum (210) occupies. Additionally, the widgets may include other 3D object editing commands including any number of drawing tools such as palettes, fills, line drawing tools, object forming tools, cropping tools, and cutting tools, among others.
[0034] The tablet frustum (210) may be included within the user view frustum (205) at all times. The tablet frustum (210) corresponds to a mapped area on the 2D surface of the tablet (110). In an example, the user view frustum (205) and the tablet frustum (210) share the same viewpoint (220). During operation of the VR headset (115), movement of the VR headset (115) while the user is viewing the 3D workspace (120) changes the view within the 3D workspace (120). As an example, if the user were to look left while wearing the VR headset (115), the ocular screens of the VR headset (115) would display to the user a view that is to the left of the previous view. In some examples, a view of a 3D object within the 3D workspace (120) may go "off screen" when a user looks to the left such that, in order to view the 3D object again, the user could return his or her head to its original position. The same would apply if the user were to look in any other direction: right, up, down, etc. Thus, to the user, the 3D object in the 3D workspace (120) appears to remain in a specific location within the 3D workspace (120) and it is the user that is looking away. As such, a user may use the described input system (100) to draw any number of 3D objects within the 3D workspace (120).
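To make the relationship between the two frustums concrete, the following minimal Python sketch models each frustum as a pinhole-style volume and derives a tablet frustum that shares the view frustum's viewpoint (220) but covers only a central fraction of its field of view. The data structure, the field-of-view values, and the 0.6 scale factor are illustrative assumptions, not details taken from the specification.

```python
from dataclasses import dataclass
import math

@dataclass
class Frustum:
    """A symmetric pinhole-style frustum defined by a viewpoint, a forward
    direction, horizontal/vertical half-angles, and near/far distances."""
    viewpoint: tuple      # (x, y, z) shared by the view and tablet frustums
    forward: tuple        # unit view direction
    half_fov_h: float     # radians
    half_fov_v: float     # radians
    near: float
    far: float

def tablet_frustum(view: Frustum, scale: float = 0.6) -> Frustum:
    """Derive a tablet frustum that shares the view frustum's viewpoint but
    spans only a central fraction of its field of view, so the mapped tablet
    area always fits inside what the user can see."""
    return Frustum(view.viewpoint, view.forward,
                   view.half_fov_h * scale, view.half_fov_v * scale,
                   view.near, view.far)

view = Frustum((0, 0, 0), (0, 0, -1), math.radians(50), math.radians(40), 0.1, 100.0)
tab = tablet_frustum(view)
print(math.degrees(tab.half_fov_h), math.degrees(tab.half_fov_v))
```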
[0035] Within the user view frustum (205) and tablet frustum (210), an input plane (215) may be visible. The input plane (215) may be freely translated and rotated within the 3D workspace (120) as described above using various user widgets. The boundaries of the input plane (215) may extend at least up to the sides of the tablet frustum (210). In an example, the input plane (215) does not extend beyond the near (225) or far (230) planes of the user view frustum (205) in the case of extreme rotations of the input plane (215) within the 3D workspace (120). In an example, the input plane (215) does extend beyond the near (225) or far (230) planes of the user view frustum (205) within the 3D workspace (120). In an example, the processor prevents the input plane (215) from being rotated or moved into a position where it appears edge-on in the image presented in the VR headset (115). In this example, a user viewing an edge-on view of the input plane (215) would see only a line representing the edge of the input plane (215), and it may be difficult to draw on the input plane in such a configuration.
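One way the edge-on restriction described above could be checked is by comparing the current view direction with the input plane's normal and rejecting any rotation that would leave the plane nearly parallel to the line of sight. The helper name and the 10-degree threshold below are assumptions for illustration; the specification does not prescribe a particular test.

```python
import math

def is_nearly_edge_on(view_dir, plane_normal, min_angle_deg=10.0):
    """Return True when the input plane would appear edge-on (or close to it)
    from the current viewpoint, i.e. the view direction is almost parallel
    to the plane itself."""
    dot = sum(v * n for v, n in zip(view_dir, plane_normal))
    norm = (math.sqrt(sum(v * v for v in view_dir)) *
            math.sqrt(sum(n * n for n in plane_normal)))
    # Angle between the view direction and the plane (0 degrees == edge-on).
    angle_to_plane = 90.0 - math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return abs(angle_to_plane) < min_angle_deg

# A rotation request could be rejected (or clamped) when it would make the
# plane edge-on:
print(is_nearly_edge_on((0, 0, -1), (0, 1, 0)))   # True: plane seen edge-on
print(is_nearly_edge_on((0, 0, -1), (0, 0, 1)))   # False: plane faces the viewer
```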
[0036] During operation, a user may engage the tablet (110) with the stylus (105), thus creating a point (235) on the input plane (215). The line projected from the pen input location intersects the input plane (215) within the tablet frustum (210), producing the point (235). Fig. 2B shows a user drawing a line (240). A digital representation of the line (240) is then produced on the input plane (215) within the user view frustum (205) of the 3D workspace (120). After a line has been drawn, the input plane (215) may be repositioned as desired to enable the user to draw a line in a different location in the 3D workspace. In another mode, moving the input plane (215) may also move all the lines associated with that input plane (215). Additionally, any line drawn by the user can be selected such that the input plane (215) onto which the line was drawn can be recalled, thereby displaying that corresponding input plane (215) again. This would enable further lines to be added on that input plane (215), or allow all the lines associated with that plane to be moved together by moving the input plane. Although the description herein references the input from the tablet (110) and stylus (105) as a "line," various other types of input from these devices can result in various other types of markings being mapped onto the input plane (215). Examples of these other types of markings may include dots, 2D shapes, filled 2D shapes, 3D shapes, filled 3D shapes, clip art, and curves, among others. Consequently, the term "line" is not meant to be limiting and instead is meant only as an example of a marking that could result from the input received from the tablet (110) and stylus (105).
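The creation of the point (235) can be sketched as a two-step computation: map the normalized stylus coordinate on the tablet to a direction inside the tablet frustum, then intersect the resulting ray from the shared viewpoint with the input plane. The function names, the view-space convention (viewer looking down -Z), and the numeric tolerance are illustrative assumptions only.

```python
import math

def tablet_to_direction(u, v, half_fov_h, half_fov_v):
    """Map a normalized tablet coordinate (u, v) in [0, 1]^2 to a view-space
    direction inside the tablet frustum (the viewer looks down -Z)."""
    x = math.tan((2 * u - 1) * half_fov_h)
    y = math.tan((2 * v - 1) * half_fov_v)
    return (x, y, -1.0)

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect the ray from the shared viewpoint through the stylus location
    with the input plane; returns the 3D point or None if the ray is parallel
    to the plane or the plane lies behind the viewpoint."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None
    t = sum((p - o) * n for p, o, n in zip(plane_point, origin, plane_normal)) / denom
    return None if t < 0 else tuple(o + t * d for o, d in zip(origin, direction))

# Stylus at the centre of the tablet, input plane 2 m in front of the viewer:
direction = tablet_to_direction(0.5, 0.5, math.radians(30), math.radians(24))
print(ray_plane_intersection((0, 0, 0), direction, (0, 0, -2), (0, 0, 1)))
```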
[0037] In an example, the input plane (215) may be moved or rotated within the 3D workspace (120). As the user draws on the input plane (215), a 3D object is created based on the plane within the 3D workspace (120) that the input plane (215) occupies. As described above, in an example the user can position the input plane (215) using a number of widgets displayed next to the input plane (215). In an example, the user can position the input plane (215) by placing the stylus (105) in a dedicated manipulation mode. In this example, the stylus (105) may include a button or other activation device to switch from a "drawing" state to an "input plane (215) manipulation" state. In an example, the input plane (215) may be positioned using a six degree-of-freedom (DOF) input device. In this example, the six DOF input device may be controlled by the user using the user's non-drawing hand or the hand that is not holding the stylus (105).
[0038] In an example, the input plane (215) may be "snapped" to predetermined distance and angle increments. In this example, a user may input coordinates of the input plane (215) using the above-mentioned widgets. In this example, certain inputs from the user to adjust the location of the input plane (215) may not be accepted until the user provides further instructions indicating that the input plane (215) may be "unsnapped" from the particular position. In an example, the input plane (215) may be "snapped" to a preexisting location in the 3D workspace (120). In this example, the preexisting location in the 3D workspace (120) may include an end of a previously drawn line. Further, a user may specifically position the input plane (215) within the 3D workspace (120) by snapping one axis of the input plane (215) between two existing line end points and then freely rotating the input plane (215) about that axis in order to achieve a desired orientation, potentially snapping the plane to intersect a third point.
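The snapping behaviors described in this paragraph can be illustrated with a short sketch: scalar snapping to distance and angle increments, and pinning one axis of the plane to the segment between two existing line end points. The increment values and helper names are assumptions chosen for illustration.

```python
import math

def snap(value, increment):
    """Snap a scalar to the nearest multiple of the given increment."""
    return round(value / increment) * increment

def snap_plane_pose(position, angles_deg, dist_step=0.05, angle_step=15.0):
    """Snap an input-plane pose to coarse distance and angle increments."""
    return (tuple(snap(c, dist_step) for c in position),
            tuple(snap(a, angle_step) for a in angles_deg))

def snap_axis_between(p0, p1):
    """Pin one axis of the input plane to the segment between two existing
    line end points; the plane can then be rotated freely about this axis."""
    axis = tuple(b - a for a, b in zip(p0, p1))
    length = math.sqrt(sum(c * c for c in axis))
    return p0, tuple(c / length for c in axis)   # anchor point and unit axis

print(snap_plane_pose((0.123, 0.249, -1.01), (12.0, 47.0, 91.0)))
print(snap_axis_between((0.0, 0.0, 0.0), (0.0, 3.0, 4.0)))
```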
[0039] The input plane (215), itself, may also be manipulated. In an example, a widget displayed within the 3D workspace (120) may be actuated by the user to curve the input plane (215). In this example, the curvature of the input plane (215) along its axes could be manipulated using the widget. In this example, the curvature of the input plane may be manipulated in any way by the user, allowing the user to both curve the input plane (215) as well as add edges and/or corners into the input plane (215). In an example, lines or other markings drawn on the input plane (215) may be used to create a new 3D input plane (215) having a user-defined shape. In an example, a curved line drawn on the input plane (215) could be extruded perpendicular to that input plane (215) to create an additional input plane (215) that the user may also implement, as sketched below. As a result, an additional widget may be used to toggle between the original input plane (215) and the additional input plane (215) while the user is drawing the 3D object in the 3D workspace (120). Any number of input planes (215) may be created. In order to prevent a view of the 3D object from being obstructed due to an abundance of created input planes (215), a previously created input plane (215) may be optically removed, faded out, or shadowed as an additional input plane (215) is created. Again, as a drawn line is selected, the input plane (215) associated with that line may be shown while the other input planes (215) are removed from view, faded out, or shadowed.
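The extrusion of a drawn curve into an additional input surface could, as one possibility, be realized by sweeping the curve's sample points along the original plane's normal, producing a ruled surface. The helper below is an illustrative sketch under that assumption; the depth and sampling parameters are not taken from the specification.

```python
def extrude_curve_to_surface(curve_points, plane_normal, depth=0.5, steps=8):
    """Sweep a curve drawn on the input plane along the plane's normal to
    form a new, curved input surface, returned as a grid of sample points."""
    surface = []
    for i in range(steps + 1):
        t = depth * i / steps
        row = [tuple(p + t * n for p, n in zip(pt, plane_normal))
               for pt in curve_points]
        surface.append(row)
    return surface

curve = [(0.0, 0.0, 0.0), (0.5, 0.2, 0.0), (1.0, 0.0, 0.0)]
grid = extrude_curve_to_surface(curve, (0.0, 0.0, 1.0))
print(len(grid), len(grid[0]))   # 9 rows of 3 samples
```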
[0040] Because the input system (100), in an example, implements a VR headset (115), changes in the position of the user's head change the location of the tablet frustum (210). Consequently, while the user is drawing a curve, for example, the current view within the 3D workspace (120) is altered and the position of the input plane (215) in relation to the tablet frustum (210) may be changed. In an example, motion of the tablet frustum (210) relative to the 3D workspace (120) may be frozen by the user. In an example, the freezing of the tablet frustum (210) may be accomplished as the stylus (105) engages the surface of the tablet (110). In an example, the freezing of the tablet frustum (210) may be accomplished as the stylus (105) is within a threshold "hover" distance above the tablet (110). Motion or adjustment of the tablet frustum (210) may be re-enabled when the stylus (105) is lifted off of the surface of the tablet (110), when the threshold distance between the tablet (110) and stylus (105) is exceeded, or explicitly by a user activating a widget that re-enables movement of the tablet frustum (210).
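A minimal sketch of the freeze/unfreeze logic described above follows: the mapping is frozen while the stylus touches the tablet or hovers within a threshold distance, and re-enabled when the stylus is lifted away or a widget explicitly re-enables motion. The hover threshold value and class name are assumptions for illustration.

```python
from dataclasses import dataclass

HOVER_THRESHOLD = 0.01   # assumed hover distance in meters; not from the specification

@dataclass
class FrustumFreeze:
    """Freeze the tablet frustum's mapping while the stylus is on (or
    hovering just above) the tablet, and re-enable it otherwise."""
    frozen: bool = False

    def update(self, stylus_touching: bool, hover_height: float,
               widget_reenable: bool = False) -> bool:
        if widget_reenable:
            self.frozen = False
        elif stylus_touching or hover_height <= HOVER_THRESHOLD:
            self.frozen = True
        else:
            self.frozen = False
        return self.frozen

freeze = FrustumFreeze()
print(freeze.update(stylus_touching=False, hover_height=0.005))  # True: within hover range
print(freeze.update(stylus_touching=False, hover_height=0.05))   # False: stylus lifted away
```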
[0041] In an example, the input system (100) implements an augmented reality (AR) headset. As described above, the user is provided with aspects of the real world along with a visual representation of a 3D object being drawn in the AR environment. During operation, the user may draw and adjust the 3D object being formed similarly as described above. In an example, a user implementing the AR headset, however, may interact with real surfaces in the AR environment. Specifically, a user implementing the above-described tablet (110) and stylus (105) may draw a 3D object onto a real world surface that has been mapped by, for example, a laser mapping system associated with the input system (100). This mapping of the visible real world surfaces allows a user to virtually draw on the surface of real world surfaces such as walls, ceilings, and floors. Additionally, because all real world surfaces may be mapped this way, the input plane may be these real world surfaces.
Consequently, as mentioned above, the input plane (215) may include not only flat surfaces but curved surfaces as well, such as a globe, a pipe, a cone, or a square, among any other curved or multiple-surface object. In this example, a user may virtually add objects onto surfaces of any real world object, add color to the surface of any real world object, and incorporate virtual 3D objects into or on real world surfaces and objects, among other actions.
In an example where the user implements an AR headset, any real world surface may be mapped as described above and extrapolated into a virtual 3D object. This allows a user to "copy" the real world surface and place that copy within the AR environment. This allows the user to manipulate the 3D object as described above and define that newly copied surface as the input plane (215) itself. A number of widgets provided in the AR environment, similar to those described in connection with the VR environment, may be provided to the user to execute the "copy," "move," and input plane (215) designation actions as described herein. Fig. 3 is a diagram showing two resulting view reference diagrams resulting from an unfrozen (305) and frozen view (310), respectively, according to an example of the principles described herein. An originating view reference (300) shows the user view frustum (205) and tablet frustum (210) as well as the input plane (215) described above. In the originating view (300) a user is currently engaged in drawing an object on the surface of the input plane (215) via the tablet (110) and stylus (105) as described above. The stylus (105) shown on the input plane (215) is for reference only and is not to be understood as the input plane (215) comprising a visual representation of the stylus (105).
[0042] As described above, motion of the tablet frustum (210) relative to the 3D workspace (120) and user view frustum (205) may be frozen by the user. Without the freezing of the tablet frustum (210), the unfrozen view (305) is the result. In the unfrozen view (305), a user currently drawing may mistakenly draw in a location on the input plane (215) that was not expected. In the example shown in Fig. 3, the unfrozen view (305) shows a situation where a currently drawn object (315) includes an unintentional mark (320). The
unintentional mark (320) is the result of the position of the tablet frustum (210) relative to the 3D workspace (120) being changed due to a user of the VR headset (115) turning his or her head, in this example to the right. Because the user is not able to maintain the tablet frustum (210) in the same location, the unintentional mark (320) is drawn. However, the user may activate a frozen state such that a frozen view (310) is maintained. In an example, for small movements of the user's head and the VR headset (115), the tablet frustum (210) remains fixed relative to the input plane (215) while the view frustum (205) moves with the user's head. In an example, movements of the user's head and the VR headset (115) cause the tablet frustum (210) to pivot around the user's viewpoint independent of the user view frustum (205) so that the current x and y coordinate location of the marking produced by the stylus (105) does not change. This is seen in Figs. 4A and 4B. As described above, a user may, according to one example, engage the frozen view (310) by applying the stylus (105) to the surface of the tablet (110). In an example, a user may engage the frozen view (310) by bringing an end of the stylus (105) within a threshold distance of the surface of the tablet (110). In still another example, a user may engage the frozen view (310) by pushing a button on the stylus (105). In a further example, a user may engage the frozen view (310) by actuating a widget placed within the 3D workspace (120).
[0043] Figs. 4A and 4B are top view diagrams of a view frustum (205) and tablet frustum (210) before and after a viewpoint motion, respectively, according to an example of the principles described herein. As briefly described above, when the user has finished moving the input plane (215) to a specific location, the input plane (215) stays fixed in the 3D workspace (120) as the user moves their viewpoint. Additionally, the mapping of the current (x, y) position of the stylus (105) on the tablet (110) to the input plane (215) is locked as described above. Consequently, if the stylus (105) is placed onto, for example, the top right corner of the tablet (110), a currently drawn object (315) will appear where the top right edge of the tablet frustum (210) intersects the input plane (215). If the stylus (105) is held still on the tablet (110) and the user shifts and/or turns his or her head with the VR headset (115) so the currently drawn object (315) is in the center of the user's view frustum (205), the location on the input plane (215) of the stylus (105) does not change. Additionally, the currently drawn object (315) should remain a currently drawn object (315) without additional markings on the input plane (215) being made. Fig. 4B shows the tablet frustum (210) and view frustum (205) both pointing back towards the same viewpoint, but now extending off in different directions so that the intersection of the pen location within the tablet frustum (210) with the input plane (215), the point marked by the currently drawn object (315), stays fixed. For simplicity, the above examples assume the stylus (105) is not moved as the user changes their viewpoint, but this is not a requirement. On each display update cycle the tablet frustum (210) is recalculated as described above, and then any changes in the stylus (105) position on the tablet (110) are translated into a stroke input on the input plane (215) based on this new mapping.
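The per-display-update behavior can be approximated with the following sketch. For simplicity it reuses the head pose captured when the freeze was engaged rather than re-aiming the tablet frustum around the viewpoint as described above, which is enough to show the key property: head motion alone adds nothing to the stroke, while stylus motion on the tablet does. All names are illustrative assumptions.

```python
def update_frame(frozen, pose_at_freeze, head_pose, stylus_xy, prev_stylus_xy,
                 stroke, map_tablet_to_plane):
    """One display-update cycle (illustrative sketch only).

    map_tablet_to_plane(pose, xy) stands in for projecting a tablet coordinate
    onto the input plane for a given head pose. While frozen, the pose captured
    at freeze time is reused, so only stylus movement produces new stroke input.
    """
    pose = pose_at_freeze if frozen else head_pose
    if stylus_xy != prev_stylus_xy:
        stroke.append(map_tablet_to_plane(pose, stylus_xy))
    return stroke

# Toy mapping: place the point at the tablet coordinate, offset by the pose.
mapping = lambda pose, xy: (xy[0] + pose, xy[1], -2.0)
stroke = []
update_frame(True, 0.0, 0.3, (0.5, 0.5), (0.4, 0.5), stroke, mapping)  # head moved, stylus moved
update_frame(True, 0.0, 0.6, (0.5, 0.5), (0.5, 0.5), stroke, mapping)  # head moved, stylus still
print(stroke)   # one point only; the head motion did not add a mark
```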
[0044] In an example, a line or other indicator may be placed within the 3D workspace (120) indicating the boundaries of the tablet frustum (210). In one example, the indicator indicating the boundaries of the tablet frustum (210) may be visible to the user at all times. In another example, the indicator indicating the boundaries of the tablet frustum (210) may become visible when the user's stylus (105) approaches the boundary of the tablet frustum (210).
[0045] As a result of the freezing feature described above, a user may draw a 3D object within the 3D workspace (120) without inadvertently making an unintended mark. During the drawing process, a user may not be able to keep his or her head completely still. Instead, there may be inadvertent small shifts in the user's head position, with the impact of these shifts being magnified significantly depending on the distance between the viewpoint and the input point in the 3D workspace (120). In order to support precise input, the mapping may be frozen as described above for at least the duration of the stylus (105) stroke.
[0046] Fig. 5 is a flowchart showing a method (500) of applying a two-dimensional (2D) input into a three-dimensional (3D) space according to one example of the principles described herein. The method (500) may begin with receiving (505) input from a first input device at a processor indicating a change in position of an input plane (215) within the 3D space represented on an output device. In an example, the first input device is a stylus (105) and tablet (110). In this example, the stylus (105) may be used to adjust the input plane (215) as described above. In an example, the first input device is a mouse. In this example, a user may implement a stylus (105) and tablet (110) along with the mouse to both draw a 3D image in the 3D workspace (120) and adjust the input plane (215).
[0047] The method (500) may continue with receiving (510) input from a second input device having a 2D surface at the processor indicating a line to be drawn in the 3D space. In an example, the second input device is a stylus (105) and tablet (110). As described above, the stylus (105) and tablet (110) receive (510) input and pass the input on to a processor associated with the input system (100).
[0048] The method (500) may continue with representing (515) the received input from the second input device as a 3D image within the 3D space displayed on a user interface. The processor converts the input data presented by the stylus (105) and tablet (110) into image data and presents the image data to a user via, for example, the VR headset (115) or AR headset described above. During the representation (515) of the received input from the second input device, the processor maintains (520) a current mapping of the input from the second input device to the input plane within the VR headset when a stylus interacts with the second input device and as the VR headset is moved. As described above, maintaining (520) the current mapping of the input from, for example, the tablet to the input plane allows a user of the VR headset (115) to adjust the position of his or her head while drawing a 3D object in the 3D workspace (120). This prevents unintended and errant drawing strokes by the user.
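The flow of method (500) can be summarized in a short sketch that strings steps (505) through (520) together. The event shapes and the project() stand-in for the tablet-to-input-plane mapping are assumptions made for illustration; they are not part of the claimed method.

```python
def apply_2d_input_to_3d(plane_events, stylus_events, project):
    """Sketch of method (500). plane_events reposition the input plane (505);
    stylus_events are (engaged, x, y, view_pose) samples from the tablet (510);
    project(plane_pose, view_pose, x, y) stands in for the tablet-to-input-plane
    mapping used to build the 3D image (515)."""
    plane_pose = "initial-plane-pose"
    strokes = []
    frozen_view = None
    for event in plane_events:                       # (505) move the input plane
        plane_pose = event
    for engaged, x, y, view_pose in stylus_events:
        if engaged:
            if frozen_view is None:
                frozen_view = view_pose              # (520) lock mapping at pen-down
            strokes.append(project(plane_pose, frozen_view, x, y))   # (510)/(515)
        else:
            frozen_view = None                       # pen lifted: follow headset again
    return strokes

toy_project = lambda plane, view, x, y: (plane, view, x, y)
print(apply_2d_input_to_3d(
    ["plane-A"],
    [(True, 0.1, 0.2, "view-0"), (True, 0.2, 0.2, "view-1"), (False, 0, 0, "view-2")],
    toy_project))
```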
[0049] Aspects of the present system and method are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to examples of the principles described herein. Each block of the flowchart illustrations and block diagrams, and combinations of blocks in the flowchart illustrations and block diagrams, may be implemented by computer usable program code. The computer usable program code may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the computer usable program code, when executed via, for example, the processor or other programmable data processing apparatus, implements the functions or acts specified in the flowchart and/or block diagram block or blocks. In one example, the computer usable program code may be embodied within a computer readable storage medium; the computer readable storage medium being part of the computer program product. In one example, the computer readable storage medium is a non-transitory computer readable medium.
[0050] The specification and figures describe an input system implementing a virtual reality headset to provide a user with a relatively intuitive way of creating 3D images within a 3D workspace. The input device and the method described herein allow a user to adjust an input plane such that a 2D input on a tablet is translated into a 3D image within the 3D workspace. Additionally, a user may use the VR headset while still drawing on the tablet as a result of a freeze feature. The freeze feature freezes the mapping of the tablet to the input plane when a stylus contacts the tablet or comes within a threshold distance of it. As a result, a user may input strokes on the input plane without changes in the user's head position causing unintentional markings when he or she turns his or her head.
[0051] The preceding description has been presented to illustrate and describe examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching.

Claims

WHAT IS CLAIMED IS:
1. An input system, comprising:
a stylus;
a positionable output device;
a tablet to receive input via interaction with the stylus; and
a three-dimensional (3D) workspace represented on a graphical user interface (GUI) of the positionable output device communicatively coupled to the tablet;
wherein two-dimensional (2D) input on the tablet translates to a 3D input on the 3D workspace based on the orientation of an input plane represented in the 3D workspace; and
wherein interface of the stylus with the tablet freezes a view of a tablet-to-input mapping displayed on the positionable output device.
2. The input system of claim 1, wherein a position of the input plane is adjustable via input from the interaction of the stylus on the tablet.
3. The input system of claim 1, further comprising a mouse, wherein a position of the input plane is adjustable via input from the mouse.
4. The input system of claim 1, wherein the input plane is adjustable with reference to a consistent point of reference within the 3D workspace.
5. The input system of claim 1, wherein the freezing of the current view provided by the positionable output device occurs when the stylus is a threshold distance from the surface of the tablet.
6. The input system of claim 4, wherein the input plane fits within a tablet frustum defined within a view frustum within the 3D workspace.
7. The input system of claim 6, wherein boundaries of the input plane extend between the tablet frustum and the view frustum.
8. A method of applying a two-dimensional (2D) input into a three-dimensional (3D) space, comprising:
receiving input from a first input device at a processor indicating a change in position of an input plane within the 3D space represented on an output device;
receiving input from a second input device having a 2D surface at the processor indicating a line to be drawn in the 3D space;
representing the received input from the second input device as a 3D image within the 3D space displayed on the output device; and
maintaining a current mapping of the input from the second input device to the input plane within the 3D space when a stylus interacts with the second input device and as the output device is moved.
9. The method of claim 8, wherein the first input device is a mouse.
10. The method of claim 8, wherein disengagement of a stylus from the second input device changes the input plane based on the position of the output device.
10. The method of claim 8, further comprising receiving input representing how the input plane is to be curved.
11. The method of claim 8, further comprising receiving input representing a position the input plane is to be snapped or glued to.
12. The method of claim 8, wherein receiving input from a first input device at a processor indicating a change in position of an input plane within the 3D space represented on an output device further comprises preventing the input plane from aligning perpendicular to a user viewpoint within the 3D space.
13. A computer program product for applying a two-dimensional (2D) input into a three-dimensional (3D) space, the computer program product comprising: a computer readable storage medium comprising computer usable program code embodied therewith, the computer usable program code to, when executed by a processor:
receive input from a first input device at a processor indicating a change in position of an input plane within the 3D space represented on an output device;
receive input from a second input device having a 2D surface at the processor indicating a line to be drawn in the 3D space;
represent the received input from the second input device as a 3D image within the 3D space displayed on the output device; and
maintain a current mapping of the input from the second input device to the input plane within the 3D space when a stylus interacts with the second input device and as the output device is moved.
14. The computer program product of claim 13, further comprising computer usable program code to, when executed by a processor, receive instructions to map
15. The computer program product of claim 13, further comprising computer usable program code to, when executed by a processor, prevent the input plane from aligning edge-on to a user viewpoint.
PCT/US2016/058022 2016-10-21 2016-10-21 Virtual reality input WO2018075054A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201680090258.2A CN109863467B (en) 2016-10-21 2016-10-21 System, method and storage medium for virtual reality input
EP16919434.7A EP3510474B1 (en) 2016-10-21 2016-10-21 Virtual reality input
PCT/US2016/058022 WO2018075054A1 (en) 2016-10-21 2016-10-21 Virtual reality input
US16/075,610 US10768716B2 (en) 2016-10-21 2016-10-21 Virtual reality input including maintaining a current mapping of the input from an input device to an input plane within the 3D space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2016/058022 WO2018075054A1 (en) 2016-10-21 2016-10-21 Virtual reality input

Publications (1)

Publication Number Publication Date
WO2018075054A1 true WO2018075054A1 (en) 2018-04-26

Family

ID=62019077

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/058022 WO2018075054A1 (en) 2016-10-21 2016-10-21 Virtual reality input

Country Status (4)

Country Link
US (1) US10768716B2 (en)
EP (1) EP3510474B1 (en)
CN (1) CN109863467B (en)
WO (1) WO2018075054A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109308132A (en) * 2018-08-31 2019-02-05 青岛小鸟看看科技有限公司 Implementation method, device, equipment and the system of the handwriting input of virtual reality
WO2020208204A1 (en) * 2019-04-11 2020-10-15 Goggle Collective Ltd. Tool and method for drawing 3-d curves in 2-d
CN112639682A (en) * 2018-08-24 2021-04-09 脸谱公司 Multi-device mapping and collaboration in augmented reality environments
EP4156113A4 (en) * 2020-07-27 2023-11-29 Wacom Co., Ltd. Method executed by computer, computer, and program

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10649550B2 (en) * 2018-06-26 2020-05-12 Intel Corporation Predictive detection of user intent for stylus use

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013084096A (en) * 2011-10-07 2013-05-09 Sharp Corp Information processing apparatus
WO2014107629A1 (en) * 2013-01-04 2014-07-10 Vuzix Corporation Interactive wearable and portable smart devices
US20150316980A1 (en) * 2014-04-18 2015-11-05 Magic Leap, Inc. User interface rendering in augmented or virtual reality systems
US20150363980A1 (en) 2014-06-17 2015-12-17 Valorisation-Recherche, Limited Partnership 3d virtual environment interaction system
JP5876607B1 (en) * 2015-06-12 2016-03-02 株式会社コロプラ Floating graphical user interface
JP2016167219A (en) * 2015-03-10 2016-09-15 株式会社コロプラ Method and program for displaying user interface on head-mounted display

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2397762A1 (en) 2000-01-25 2001-08-02 Autodesk, Inc. Method and apparatus for providing access to and working with architectural drawings on the internet
US6753847B2 (en) 2002-01-25 2004-06-22 Silicon Graphics, Inc. Three dimensional volumetric display input and output configurations
US7190331B2 (en) * 2002-06-06 2007-03-13 Siemens Corporate Research, Inc. System and method for measuring the registration accuracy of an augmented reality system
US8199107B2 (en) 2004-12-22 2012-06-12 University Of Waterloo Input interface device with transformable form factor
JP2009508274A (en) * 2005-09-13 2009-02-26 スペースタイムスリーディー・インコーポレーテッド System and method for providing a three-dimensional graphical user interface
US7701457B2 (en) 2006-02-21 2010-04-20 Chrysler Group Llc Pen-based 3D drawing system with geometric-constraint based 3D cross curve drawing
US8760391B2 (en) * 2009-05-22 2014-06-24 Robert W. Hawkins Input cueing emersion system and method
US10528156B2 (en) * 2009-05-22 2020-01-07 Hawkvision Emmersion Computing, LLC Input cueing emmersion system and method
US8922558B2 (en) 2009-09-25 2014-12-30 Landmark Graphics Corporation Drawing graphical objects in a 3D subsurface environment
US8232990B2 (en) 2010-01-05 2012-07-31 Apple Inc. Working with 3D objects
US8791899B1 (en) * 2010-01-26 2014-07-29 Open Invention Network, Llc Method and apparatus of position tracking and detection of user input information
US9632677B2 (en) * 2011-03-02 2017-04-25 The Boeing Company System and method for navigating a 3-D environment using a multi-input interface
US20130104086A1 (en) * 2011-10-21 2013-04-25 Digital Artforms, Inc. Systems and methods for human-computer interaction using a two handed interface
US9916009B2 (en) * 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
CN106030495B (en) * 2015-01-30 2021-04-13 索尼深度传感解决方案股份有限公司 Multi-modal gesture-based interaction system and method utilizing a single sensing system
US10185464B2 (en) * 2015-05-28 2019-01-22 Microsoft Technology Licensing, Llc Pausing transient user interface elements based on hover information
CN109952552A (en) * 2016-10-11 2019-06-28 惠普发展公司有限责任合伙企业 Visual cues system


Also Published As

Publication number Publication date
EP3510474A1 (en) 2019-07-17
CN109863467B (en) 2022-01-25
EP3510474B1 (en) 2021-12-01
CN109863467A (en) 2019-06-07
US10768716B2 (en) 2020-09-08
US20190042002A1 (en) 2019-02-07
EP3510474A4 (en) 2020-04-22

Similar Documents

Publication Publication Date Title
US10768716B2 (en) Virtual reality input including maintaining a current mapping of the input from an input device to an input plane within the 3D space
US9886102B2 (en) Three dimensional display system and use
US9436369B2 (en) Touch interface for precise rotation of an object
CA2893586C (en) 3d virtual environment interaction system
CN116324680A (en) Method for manipulating objects in an environment
US8643569B2 (en) Tools for use within a three dimensional scene
CN106325835B (en) 3D application icon interaction method applied to touch terminal and touch terminal
CN110141855A (en) Method of controlling viewing angle, device, storage medium and electronic equipment
US20190050132A1 (en) Visual cue system
US20150067603A1 (en) Display control device
US10649615B2 (en) Control interface for a three-dimensional graphical object
US10983661B2 (en) Interface for positioning an object in three-dimensional graphical space
US10359906B2 (en) Haptic interface for population of a three-dimensional virtual environment
CN106204713B (en) Static merging processing method and device
CN103488292A (en) Three-dimensional application icon control method and device
WO2019076084A1 (en) Method and apparatus for displaying with 3d parallax effect
JP2004362218A (en) Three-dimensional object operating method
Lee et al. Tunnelslice: Freehand subspace acquisition using an egocentric tunnel for wearable augmented reality
Simeone et al. Comparing indirect and direct touch in a stereoscopic interaction task
JP5767371B1 (en) Game program for controlling display of objects placed on a virtual space plane
Ohnishi et al. Virtual interaction surface: Decoupling of interaction and view dimensions for flexible indirect 3D interaction
JP2021099800A (en) Method of modifying rendering of 3d scene area in immersive environment
US11087528B1 (en) 3D object generation
Palmerius et al. Interaction design for selection and manipulation on immersive touch table display systems for 3D geographic visualization
CN106296829A (en) The localization method of the physical environment in virtual reality system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16919434

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016919434

Country of ref document: EP

Effective date: 20190411

NENP Non-entry into the national phase

Ref country code: DE