WO2022174192A1 - X-ray sight view and remote anchor for selection tasks in XR environment - Google Patents

X-ray sight view and remote anchor for selection tasks in XR environment

Info

Publication number
WO2022174192A1
Authority
WO
WIPO (PCT)
Prior art keywords
controller
ray
sight view
anchor
reality environment
Prior art date
Application number
PCT/US2022/016463
Other languages
French (fr)
Inventor
Chao MEI
Yifan Yang
Yi Xu
Original Assignee
Innopeak Technology, Inc.
Priority date
Filing date
Publication date
Application filed by Innopeak Technology, Inc.
Priority to CN202280046627.3A (published as CN117597652A)
Publication of WO2022174192A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/06Ray-tracing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Definitions

  • the sight view camera is positioned such that a near clip plane of the sight view camera is at or beyond the collision point on a trajectory along the controller ray traveling in a direction from the controller towards the foreground object.
  • the set of instructions may further be executable by the processor to determine, via the computer system, that an occluded object has been selected by a sight view ray, wherein the sight view ray continues as an extension of the controller ray, in the same direction as the controller ray, and from the collision point, wherein the sight view ray is displayed in the sight view, wherein the controller ray is blocked, in the extended reality environment, by a foreground object from reaching the occluded object.
  • a system for x-ray sight view and remote anchor in an XR environment may include a controller and a user device.
  • the user device may further include a processor, and a non-transitory computer readable medium in communication with the processor, the non-transitory computer readable medium having encoded thereon a set of instructions executable by the processor to generate, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment.
  • the set of instructions may further be executed by the processor to generate, via the computer system, a sight view camera at a first position along a trajectory of the controller ray, and display, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
  • the set of instructions may further be executable by the processor to generate a remote anchor at the collision point, at a second position along the trajectory of the controller ray, wherein the remote anchor is a proxy virtual representation of the controller, wherein an orientation of the remote anchor is configured to be controlled by the controller, generate an anchor ray, wherein the anchor ray is configured to originate from the remote anchor, and extend from the remote anchor in a second direction that is based on the orientation of the controller, and select, via the controller, an object with the anchor ray.
  • the set of instructions may be executed by the processor to determine a collision point of the controller ray with a foreground object in the extended reality environment, wherein at least one of the first position or second position along the trajectory of the controller ray is the collision point.
  • the sight view camera may be positioned such that a near clip plane of the sight view camera is at or beyond the collision point on a trajectory along the controller ray traveling in a direction from the controller towards the foreground object.
  • the set of instructions may further be executable by the processor to determine, via the computer system, that an occluded object has been selected by a sight view ray, wherein the sight view ray continues as an extension of the controller ray, in the same direction as the controller ray, and from the collision point, wherein the sight view ray is displayed in the sight view, wherein the controller ray is blocked, in the extended reality environment, by a foreground object from reaching the occluded object.
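  • As a minimal sketch of this occluded-object selection (an illustration under assumed geometry, not the disclosed implementation: objects are modeled as axis-aligned boxes, and the sight view ray simply ignores the occluder it pierced):

```python
import numpy as np

def ray_aabb(origin, direction, box_min, box_max):
    """Entry distance t where a ray first meets an axis-aligned box, or None."""
    with np.errstate(divide="ignore"):
        inv = 1.0 / direction
    t1 = (box_min - origin) * inv
    t2 = (box_max - origin) * inv
    t_near = float(np.minimum(t1, t2).max())
    t_far = float(np.maximum(t1, t2).min())
    return t_near if t_near <= t_far and t_far >= 0.0 else None

def first_hit(origin, direction, objects, ignore=frozenset()):
    """Nearest object hit by the ray, as (name, t); (None, None) if nothing."""
    hits = [(name, t) for name, (lo, hi) in objects.items()
            if name not in ignore
            and (t := ray_aabb(origin, direction, lo, hi)) is not None]
    return min(hits, key=lambda h: h[1]) if hits else (None, None)

# Hypothetical scene: a cabinet front panel occludes a small object behind it.
objects = {
    "cabinet_front": (np.array([-1.0, -1.0, 2.0]), np.array([1.0, 1.0, 2.1])),
    "hidden_object": (np.array([-0.2, -0.2, 3.0]), np.array([0.2, 0.2, 3.4])),
}
controller_pos = np.array([0.1, 0.1, 0.0])
ray_dir = np.array([0.0, 0.0, 1.0])

# The plain controller ray stops at its first collision (the occluder).
occluder, t = first_hit(controller_pos, ray_dir, objects)
collision = controller_pos + t * ray_dir
print(occluder)                                # cabinet_front

# The sight view ray continues from the collision point, in the same
# direction, through the occluder, and can therefore select the hidden object.
picked, _ = first_hit(collision, ray_dir, objects, ignore={occluder})
print(picked)                                  # hidden_object
```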
  • the various embodiments include, without limitation, methods, systems, apparatuses, and/or software products.
  • a method might comprise one or more procedures, any or all of which may be executed by a computer system.
  • an embodiment might provide a computer system configured with instructions to perform one or more procedures in accordance with methods provided by various other embodiments.
  • a computer program might comprise a set of instructions that are executable by a computer system (and/or a processor therein) to perform such operations.
  • such software programs are encoded on physical, tangible, and/or non-transitory computer readable media (such as, to name but a few examples, optical media, magnetic media, and/or the like).
  • Various embodiments described herein, embodying software products and computer-performed methods represent tangible, concrete improvements to existing technological areas, including, without limitation, XR platforms and environments.
  • implementations of various embodiments provide additional ways for user interaction with XR environments, and specifically for selection tasks within the XR environment.
  • Conventional approaches to selection tasks in an XR environment utilize a virtual "laser pointer,” referred to here as a "plain ray,” for a user to indicate an object for selection.
  • a straight ray may be projected from a controller towards a desired object for selection.
  • an object with which the ray collides may be selected by a user.
  • occluded objects (e.g., objects behind other objects) are unable to be selected with a plain ray.
  • objects may, for example, be located behind walls, behind other objects, or within other objects.
  • high density object placement in the XR environments makes selection between closely placed objects difficult.
  • objects that are far from the user and/or small objects become difficult to precisely select and interact with.
  • the x-ray sight view and remote anchor user functionality allows for a more robust selection task and object interaction solution. Specifically, objects which previously could not be selected, or were difficult to select, become more easily selectable.
  • to the extent any abstract concepts are present in the various embodiments, those concepts can be implemented as described herein by devices, software, systems, and methods that involve functionality (e.g., steps or operations), such as providing and using an x-ray sight view and remote anchor in an XR environment.
  • Fig. 1 is a schematic block diagram of a system 100 providing x-ray sight view and remote anchor in an XR environment.
  • the system 100 includes a user device 105, XR application 110, x-ray sight view logic 115, remote anchor logic 120, and controller 125.
  • the user device 105 may include the XR application 110.
  • the user device 105 may be coupled to the controller 125.
  • a user device 105 may allow a user to interact with the XR application 110 via the controller 125.
  • the XR application 110 may include an application or program configured to generate a virtual environment.
  • extended reality, or "XR," may be an umbrella term covering various types of simulated virtual environments, and combined physical and virtual environments. Accordingly, an XR environment may include virtual reality (VR), augmented reality (AR), and mixed reality (MR) environments.
  • the x-ray sight view logic 115 and remote anchor logic 120 may be implemented as hardware and/or software running on one or more computer systems. In some embodiments, as depicted, the x-ray sight view logic 115 and remote anchor logic 120 may be implemented as software that may be executed on the user device 105, and/or as part of the XR application 110.
  • the x-ray sight view logic 115 and remote anchor logic 120 may be configured to run on one or more remote devices, such as an edge compute device of a cloud platform, to which the user device 105 may be communicatively coupled.
  • the XR application 110 may also be configured to run, at least in part, on a remote computer system.
  • the computer systems implementing the x-ray sight view logic 115 and remote anchor logic 120 may include one or more physical machines or one or more virtual machines (VM).
  • each of the XR application 110, x-ray sight view logic 115, and remote anchor logic 120 may be implemented to run locally on a user device.
  • the one or more computer systems may be arranged in a distributed (or centralized) architecture, such as in a cloud platform, and thus run on one or more remote devices.
  • the controller 125 may include various user input / output devices, including, without limitation, a 6-dof controller (e.g., Bluetooth, IR, RF, wired, etc.), 3-dof controller, smartphone, wearable XR devices (e.g., a VR/AR headset, glasses, etc.), motion trackers, position trackers, joystick, mouse, keyboard, etc.
  • the controller 125 may accept a user's inputs for interaction with the XR environment, and specifically with the XR app 110, x-ray sight view logic 115, and remote anchor logic 120.
  • the controller 125 may be configured to provide information regarding its position and orientation within the XR environment and/or relative to the user within the XR environment.
  • the controller 125 may act as an origination point of a virtual ray (e.g., a virtual laser pointer) in the XR environment.
  • the controller 125 may include additional virtual or physical interfaces, such as a virtual button on a touch screen, or a physical button.
  • the interfaces of the controller 125 may be configured to allow a user to toggle on/off an x-ray sight view and a remote anchor of the x-ray sight view, as described in greater detail below.
  • the controller 125 may include a touchscreen or physical touchpad, and be configured to enable gesture interactions in the XR environment.
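  • As an illustrative sketch of how such a controller pose can drive the virtual ray (the pose values and the -z "forward" convention are assumptions, not anything specified in the disclosure), the controller ray may be derived by rotating a forward vector by the controller's reported orientation:

```python
import numpy as np

def rotate(q, v):
    """Rotate vector v by a unit quaternion q = (w, x, y, z)."""
    w, xyz = q[0], np.asarray(q[1:])
    return v + 2.0 * np.cross(xyz, np.cross(xyz, v) + w * v)

# Hypothetical controller pose, as a tracking runtime might report it.
controller_pos = np.array([0.0, 1.4, 0.0])             # meters
controller_rot = np.array([0.9659, 0.0, 0.2588, 0.0])  # ~30 deg yaw about +y

FORWARD = np.array([0.0, 0.0, -1.0])   # assumed convention: -z is forward
ray_origin = controller_pos
ray_dir = rotate(controller_rot, FORWARD)
# The controller ray is the half-line: ray_origin + t * ray_dir, t >= 0.
print(ray_dir)                          # ~[-0.5, 0.0, -0.866]
```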
  • three-dimensional (3D) virtual objects may block other 3D virtual objects from being selected or otherwise interacted with.
  • a plain ray originating from the controller 125 (or laser pointer in the XR environment) may be blocked by an object in the foreground of (or otherwise covering) the object to be selected and/or interacted with by a user.
  • Fig. 2 is a schematic diagram of an x-ray sight view in an XR environment 200, in accordance with various embodiments.
  • the XR environment 200 may include a controller 205a, 205b, ray 210a, 210b, virtual cabinet 215a, 215b, collision point 220a, 220b, x-ray sight view 225, and virtual object 230.
  • the various objects and tools of the XR environment 200 are schematically illustrated in Fig. 2; modifications to the various components, objects, and other arrangements of the XR environment 200 may be possible, in accordance with the various embodiments.
  • the XR environment 200 may be an environment generated by the XR application 110.
  • a ray 210a may be projected from the controller 205a towards the virtual cabinet 215a.
  • the ray 210a may, thus, be configured to continue projecting forward until collision with an object is made, at collision point 220a.
  • various virtual objects, including object 230 may be held inside the cabinet 215a.
  • the cabinet 215a may occlude the object 230, preventing object 230 from being selected with the ray 210a.
  • a user may provide an input on the controller 125 to invoke an x-ray sight view 225.
  • a user may use a swipe gesture on a touch screen or touchpad of the controller 125 to invoke an x-ray sight view 225.
  • the controller 125 may be a smartphone, and the user may utilize a swiping gesture on the touchscreen of the smartphone to invoke the x-ray sight view 225.
  • although a swiping gesture is described in the above examples as the input to invoke the x-ray sight view 225, it is to be understood that the invocation of the x-ray sight view 225 should not be limited to such gestures, and in other embodiments, other inputs may be used. For example, one or more of button presses, taps, swipes, sequences of inputs, or other types of inputs may be configured to invoke the x-ray sight view 225. In some further examples, a voice input, obtained via a microphone (e.g., of the controller and/or coupled to the user device 105), may be utilized to invoke the x-ray sight view 225.
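  • One possible mapping from such inputs to the sight view and remote anchor, sketched with hypothetical event names and fields (none of these identifiers come from the disclosure):

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    kind: str         # "swipe", "tap", "button", "voice"
    detail: str = ""  # e.g. swipe direction or recognized phrase

@dataclass
class XrState:
    sight_view_on: bool = False
    anchor_placed: bool = False

def handle_input(event: InputEvent, state: XrState) -> XrState:
    # A swipe, button press, or recognized voice phrase toggles the sight view.
    if event.kind in ("swipe", "button") or (
        event.kind == "voice" and event.detail == "x-ray"
    ):
        state.sight_view_on = not state.sight_view_on
    # A tap places the remote anchor where the controller ray points (Fig. 3).
    elif event.kind == "tap":
        state.anchor_placed = True
    return state

state = handle_input(InputEvent("swipe", "up"), XrState())
print(state.sight_view_on)   # True
```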
  • the x-ray sight view 225 may be configured to allow a user to peer into and/or otherwise see through objects in the foreground, as with the cabinet 215b.
  • the x-ray sight view 225 may be created to see through the immediate object with which ray 210a makes contact at collision point 220a.
  • an additional virtual camera may be generated to render a close-up view.
  • the x-ray sight view logic 115 may be configured to generate the additional camera, referred to as the "sight view camera.”
  • the x-ray sight view logic 115 may further be configured to generate and display the close-up view to a user.
  • the close-up view may be rendered on top of all the objects of the XR environment 200 (such as the cabinet 215b, and ray 210b).
  • the x-ray sight view 225 may further be displayed over a specific area of a user's view in the XR environment 200 (e.g., a top right corner, top left corner, center, off-center, etc.).
  • the location at which the x-ray sight view 225 is rendered in the XR environment 200 may be set and/or adjusted by the user.
  • the ray 210b may continue, through the cabinet 215b, to select the object 230.
  • the point of collision 220b of the ray 210b is thus located on the object 230.
  • the x-ray sight view logic 115 may be configured to allow the ray 210b to continue through the cabinet 215b.
  • a new ray, a "sight view ray” may be generated to continue from the point of collision 220a, internally into the cabinet 215b.
  • the sight view ray may be an anchor ray, as described below with respect to Fig. 3.
  • the sight view ray may be projected from the sight view camera.
  • the x-ray sight view 225 may provide the user with see-through capability, with object(s) between the view of the x-ray sight view 225 and the user not being rendered in the x-ray sight view 225. In some examples, this may be similar to a near clip plane of a virtual camera.
  • the sight view camera may be generated, in some examples, such that the near clip plane of the sight view camera is located at (or beyond) the point of collision 220a, such that the sight view camera can see through the occluding object (e.g., cabinet 215a, 215b). Because the front surface of the cabinet 215b is between the view of the sight view camera and the user, it is not rendered.
  • the user may control the depth of the view of the sight view camera along the trajectory of the ray 210b and/or sight view ray.
  • operations such as push forward or pull backward may be implemented as input gestures on the controller 125, such as, without limitation, a swipe up or down gesture on a touchscreen or touchpad.
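  • A minimal sketch of this camera placement and push/pull depth control, assuming the collision point is known as a distance t along the controller ray, and assuming illustrative NEAR_CLIP and sensitivity values:

```python
import numpy as np

NEAR_CLIP = 0.05   # assumed near clip distance of the sight view camera (m)

def place_sight_view_camera(ray_origin, ray_dir, t_collision, depth_offset=0.0):
    """Position the sight view camera on the controller ray so its near clip
    plane lands at (or, with depth_offset > 0, beyond) the collision point,
    letting the camera see past the occluding surface."""
    d = ray_dir / np.linalg.norm(ray_dir)
    cam_pos = ray_origin + (t_collision - NEAR_CLIP + depth_offset) * d
    return cam_pos, d   # the camera looks along the controller ray

def push_pull(depth_offset, swipe_delta, sensitivity=0.5):
    """Map a push forward / pull backward gesture (e.g. swipe up/down) to a
    change in view depth along the ray; sensitivity is an assumed tuning."""
    return depth_offset + sensitivity * swipe_delta

cam_pos, cam_dir = place_sight_view_camera(
    np.array([0.0, 1.4, 0.0]), np.array([0.0, 0.0, 1.0]), t_collision=2.0)
print(cam_pos)   # [0.   1.4  1.95] -> near clip plane sits at z = 2.0
```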
  • the sight view camera may be configured to look through the front surface of the cabinet 215a, 215b, and present the view to the user via the x-ray sight view 225.
  • the view of the sight view camera may be presented as the x-ray sight view 225, over (e.g., on top of) the other objects of a user's view, or in a dedicated area of the user's view (e.g., top right, top left, center, off-center, left, right, bottom left, bottom right, etc.).
  • the location of the x-ray sight view 225 may be user configurable. Accordingly, utilizing the x-ray sight view 225, the user may be able to select the object 230 through the cabinet 215b.
  • Fig. 3 is a schematic diagram of an XR environment 300 providing an x-ray sight view and remote anchor control, in accordance with various embodiments.
  • Fig. 3 depicts the XR environment 300 from a schematic top-down view.
  • the XR environment 300 includes controller 305, remote anchor 310, a first virtual object 315a and second virtual object 315b, sight view 320, controller ray 325a, 325b, and anchor ray 330a, 330b.
  • remote anchor logic 120 may be configured to generate a remote anchor 310 within the XR environment 300 of the XR application 110.
  • a remote anchor 310 may be placed within the XR environment 300, and act as a virtual proxy of the controller 305.
  • the remote anchor 310 may be a remote virtual representation of the controller 305 (which is itself a virtual version of the controller 125).
  • the remote anchor 310 may be configured to behave as if it were the controller 305, and controlled by the physical controller 125, but from a different location (e.g., the location at which the remote anchor 310 is placed).
  • the remote anchor 310 may be placed at a location nearer to an object 315a, 315b.
  • the remote anchor logic 120 may be configured to generate the remote anchor 310 and an anchor ray originating from the remote anchor 310.
  • the remote anchor 310 enables a ray (e.g., the anchor ray) for selection of an object to originate from an arbitrary point in space, instead of originating at the controller 305.
  • the anchor ray 330a may be controlled by rotation and movement of the original controller 125, 305. For example, a movement of the controller 305 to the left, resulting in a movement of the controller ray 325a to the position of controller ray 325b, results in the same movement of anchor ray 330a to the position of anchor ray 330b.
  • the length of the anchor ray 330a, 330b (e.g., the distance from the remote anchor 310 to the object 315a, 315b of interest) is much shorter than the length of the controller ray 325a, 325b (e.g., the distance from the controller 305 to the object 315a, 315b).
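  • A brief sketch of the anchor ray's construction, and of why the shorter ray helps: a small angular jitter dtheta of the hand displaces a ray's endpoint by roughly r * dtheta at distance r, so originating the ray nearer to the target shrinks the miss distance (the quaternion convention and numbers below are illustrative, not from the disclosure):

```python
import numpy as np

def rotate(q, v):
    """Rotate vector v by a unit quaternion q = (w, x, y, z)."""
    w, xyz = q[0], np.asarray(q[1:])
    return v + 2.0 * np.cross(xyz, np.cross(xyz, v) + w * v)

FORWARD = np.array([0.0, 0.0, -1.0])

def anchor_ray(anchor_pos, controller_rot):
    """The anchor ray originates at the remote anchor, but its direction
    follows the physical controller's current orientation."""
    return anchor_pos, rotate(controller_rot, FORWARD)

# Small-angle estimate of selection precision:
dtheta = np.radians(1.0)          # 1 degree of hand jitter
print(10.0 * dtheta)              # ~0.175 m miss at 10 m (long controller ray)
print(0.5 * dtheta)               # ~0.009 m miss at 0.5 m (short anchor ray)
```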
  • the user may further be presented with an x-ray sight view 320 that is closer to the objects 315a, 315b.
  • the x-ray sight view may be generated from the view of the remote anchor 310.
  • the sight view may be generated at the location of the remote anchor.
  • the remote anchor 310 may also be a sight view camera.
  • the remote anchor 310 may generate a sight view camera from locations visible to the remote anchor 310.
  • the sight view camera may be at a different location from the remote anchor 310.
  • the sight view camera may, in some examples, be placed close to the remote anchor 310, or be situated in a co-planar manner to the remote anchor 310.
  • the sight view camera may be positioned to provide a sight view that has the same offset to the remote anchor as a user's location to the controller 305 (e.g., the virtual representation of a physical controller).
  • the perception of the user to the movement of the remote anchor 310 may be similar to the user's perception in movement of the controller 305.
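  • A minimal sketch of this offset-matching camera placement, with hypothetical head, controller, and anchor positions:

```python
import numpy as np

def sight_camera_pose(anchor_pos, head_pos, controller_pos):
    """Offset the sight view camera from the remote anchor by the same vector
    that separates the user's viewpoint from the virtual controller, so the
    anchor's movement is perceived like the controller's own movement."""
    return anchor_pos + (head_pos - controller_pos)

# Hypothetical poses (meters):
head = np.array([0.0, 1.7, 0.0])
controller = np.array([0.2, 1.3, -0.3])
anchor = np.array([0.5, 1.2, -4.0])
print(sight_camera_pose(anchor, head, controller))   # [ 0.3  1.6 -3.7]
```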
  • once a user enables the remote anchor 310, an anchor ray may be generated, via the remote anchor logic 120, originating from the remote anchor 310, which may be controllable by the controller 125, 305.
  • the sight view 320 may provide a magnified view of the object 315a, to which the view of the remote anchor 310 is directed.
  • the sight view may, in some examples, be an x-ray sight view as previously described.
  • the remote anchor 310 and/or sight view camera may be placed such that a near clipping plane of the remote anchor 310 and/or the sight view camera is at or past an occluding object.
  • the remote anchor 310 may be invoked, for example by providing user input via the controller 125.
  • user inputs to invoke the remote anchor 310 may include, without limitation, one or more of button presses, taps, swipes, sequence of inputs, voice inputs, or other types of inputs.
  • remote anchor logic 120 may be configured to generate the remote anchor 310 in response to receiving the user input invoking the remote anchor 310.
  • the remote anchor 310 may be generated at a point in space along the controller ray 325a, such as at a collision point 220a, or in free-space.
  • a user may point at a location on the wall and tap to place the remote anchor 310 at the location to which the controller ray 325a points.
  • the location of the remote anchor 310 may further be user configurable.
  • the remote anchor 310 may be generated in free space, or at a location indicated by the controller ray 325.
  • the 3D position of the remote anchor 310 may be manipulated via user inputs to the controller 125.
  • a sight view 225, 320 may be invoked, and the position of the remote anchor 310 may be determined automatically based on the position and orientation of the sight view 225, 320.
  • the remote anchor 310 may be generated as a sight view camera for the sight view 225, 320.
  • Fig. 4 is a flow diagram of a method for providing an x-ray sight view and remote anchor, in accordance with various embodiments.
  • the method 400 begins, at block 405, by generating a controller ray.
  • a controller ray may be a plain ray used by a user in the XR environment to complete selection and interaction tasks.
  • the controller ray may be emitted from a virtual controller, which may be controllable via a physical controller.
  • the virtual controller in the XR environment may be configured to mimic the orientation and motion of the physical controller.
  • the controller ray may project in a straight direction, according to an alignment / position of the controller (e.g., direction at which the controller is pointed).
  • the controller may act as a virtual laser pointer, and the controller ray may be projected forward by the controller, analogous to a virtual laser beam.
  • the method 400 may continue, at block 410, by obtaining, from the controller, a user input invoking a sight view and/or remote anchor.
  • the user may invoke both a remote anchor and sight view.
  • the user may choose to invoke only a sight view, such as an x-ray sight view.
  • a user may invoke a remote anchor and/or sight view based on user inputs from a controller.
  • User inputs may include one or more of button presses, taps, swipes, voice command, sequence of inputs, or other types of inputs.
  • the method 400 may continue, at block 415, by determining a collision point of the controller ray with an object.
  • the controller ray may continue forward, from the virtual controller of the XR environment, until the controller ray collides with an object.
  • the collision point may be a point on the object at which contact is made between the controller ray and the object.
  • the method 400 may continue, at block 420, by generating a sight view camera.
  • the sight view camera may be generated at the collision point of the controller ray.
  • the sight view camera may be generated such that the near clip plane of the sight view camera is located at (or beyond) the point of collision of the controller ray.
  • the sight view camera may be configured to view through an occluding object blocking the controller ray.
  • the user may control the depth of the view of the sight view camera along the trajectory of the controller ray and/or remote anchor ray. For example, operations such as push forward or pull backward may be implemented as input gestures on the controller.
  • the sight view camera may be configured to look through the front surface of the object with which the controller ray collides, and to present the view of the sight view camera (placed such that a near clipping plane of the virtual camera is at and/or past the collision point).
  • the sight view is displayed to the user.
  • the view of the sight view camera may be presented as an x- ray sight view.
  • the sight view may be rendered from a point in space that is closer in distance to an object of interest.
  • the sight view may be displayed over (e.g., on top of) the other objects in a user's view (e.g., the view is presented on top of other objects, including occluding objects), or in a dedicated area of the user's view (e.g., top right, top left, center, off-center, left, right, bottom left, bottom right, etc.).
  • the location of the x-ray sight view may be user configurable.
  • a remote anchor may be generated at the collision point of the controller ray, or in free space.
  • the positioning of the remote anchor in 3D space may further be user configurable / adjustable. For example, the position of the remote anchor may be moved forward and backward, up, down, left, and right. In some further embodiments, the orientation of the remote anchor may further be user adjusted. For example, the remote anchor may be flipped and/or rotated.
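  • A simple sketch of such position adjustment (the world-axis mapping and step size are assumptions; an implementation might instead move the anchor along the controller ray or the anchor's local axes):

```python
import numpy as np

AXES = {   # assumed mapping from discrete adjustment inputs to world axes
    "forward": np.array([0.0, 0.0, -1.0]), "back":  np.array([0.0, 0.0, 1.0]),
    "up":      np.array([0.0, 1.0, 0.0]),  "down":  np.array([0.0, -1.0, 0.0]),
    "left":    np.array([-1.0, 0.0, 0.0]), "right": np.array([1.0, 0.0, 0.0]),
}

def adjust_anchor(anchor_pos, move, step=0.05):
    """Nudge the remote anchor's 3D position by one discrete input step."""
    return anchor_pos + step * AXES[move]

anchor = np.array([0.5, 1.2, -4.0])
anchor = adjust_anchor(anchor, "forward")   # push the anchor deeper
anchor = adjust_anchor(anchor, "up")
print(anchor)                                # [ 0.5   1.25 -4.05]
```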
  • the remote anchor may be a virtual proxy of the controller, and configured to behave as if it were the controller.
  • the remote anchor may mimic the controller's movements and rotation.
  • the remote anchor may be configured to be controlled by the physical controller, but at a different location from the physical controller.
  • the position of the remote anchor may be determined automatically based on the position and orientation of the sight view / sight view camera.
  • the method 400 may continue by generating a remote anchor ray.
  • the remote anchor logic of a user device may be configured to generate an anchor ray in the XR environment, originating from the remote anchor.
  • the anchor ray may be controlled, via the controller, through the use of the remote anchor.
  • the method may continue, at block 430, by identifying an object selected with the anchor ray.
  • the anchor ray may be rendered from the perspective of the sight view camera in the sight view.
  • the method 400 continues by identifying an object selected, by the user, with the anchor ray.
  • the anchor ray may be a sight view ray originating from the sight view camera.
  • the sight view camera and remote anchor may be disparate components, and thus the sight view camera may render the anchor ray from the perspective of the sight view camera, with the anchor ray originating from the remote anchor.
  • the anchor ray may coincide with the controller ray (e.g., when the remote anchor is aligned with the controller and controller ray).
  • the remote anchor may be placed at the same location as the controller, or no remote anchor may be placed, in which case the anchor ray may be the same as the controller ray.
  • Fig. 5 is a schematic block diagram of a computer system which may provide an x-ray sight view and remote anchor in an XR environment, in accordance with various embodiments.
  • Fig. 5 provides a schematic illustration of one embodiment of a computer system 500, such as the system 100, user device 105, controller 125, x-ray sight view logic 115, remote anchor logic 120, or subsystems thereof, which may perform the methods provided by various other embodiments, as described herein.
  • Fig. 5 only provides a generalized illustration of various components, of which one or more of each may be utilized as appropriate.
  • Fig. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
  • the computer system 500 includes multiple hardware elements that may be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate).
  • the hardware elements may include one or more processors 510, including, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as microprocessors, digital signal processing chips, graphics acceleration processors, and microcontrollers); one or more input devices 515, which include, without limitation, a mouse, a keyboard, one or more sensors, and/or the like; and one or more output devices 520, which can include, without limitation, a display device, and/or the like.
  • the computer system 500 may further include (and/or be in communication with) one or more storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device such as a random-access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable, and/or the like.
  • Such storage devices may be configured to implement any appropriate data stores, including, without limitation, various file systems, database structures, and/or the like.
  • the computer system 500 might also include a communications subsystem 530.
  • the computer system 500 further comprises a working memory 535, which can include a RAM or ROM device, as described above.
  • the computer system 500 also may comprise software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
  • code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
  • a set of these instructions and/or code might be encoded and/or stored on a non-transitory computer readable storage medium, such as the storage device(s) 525 described above.
  • the storage medium might be incorporated within a computer system, such as the system 500.
  • the storage medium might be separate from a computer system (i.e., a removable medium, such as a compact disc, etc.), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
  • These instructions might take the form of executable code, which is executable by the computer system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code.
  • some embodiments may employ a computer or hardware system (such as the computer system 500) to perform methods in accordance with various embodiments of the invention.
  • some or all of the procedures of such methods are performed by the computer system 500 in response to processor 510 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 540 and/or other code, such as an application program 545) contained in the working memory 535.
  • Such instructions may be read into the working memory 535 from another computer readable medium, such as one or more of the storage device(s) 525.
  • execution of the sequences of instructions contained in the working memory 535 might cause the processor(s) 510 to perform one or more procedures of the methods described herein.
  • the terms "machine readable medium" and "computer readable medium," as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion.
  • various computer readable media might be involved in providing instructions/code to processor(s) 510 for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals).
  • a computer readable medium is a non-transitory, physical, and/or tangible storage medium.
  • a computer readable medium may take many forms, including, but not limited to, non-volatile media, volatile media, or the like.
  • Non-volatile media includes, for example, optical and/or magnetic disks, such as the storage device(s) 525.
  • Volatile media includes, without limitation, dynamic memory, such as the working memory 535.
  • a computer readable medium may take the form of transmission media, which includes, without limitation, coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 505, as well as the various components of the communication subsystem 530 (and/or the media by which the communications subsystem 530 provides communication with other devices).
  • transmission media can also take the form of waves (including, without limitation, radio, acoustic, and/or light waves, such as those generated during radio-wave and infra-red data communications).
  • Common forms of physical and/or tangible computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
  • Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 510 for execution.
  • the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
  • a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 500.
  • These signals, which might be in the form of electromagnetic signals, acoustic signals, optical signals, and/or the like, are all examples of carrier waves on which instructions can be encoded, in accordance with various embodiments of the invention.
  • the communications subsystem 530 (and/or components thereof) generally receives the signals, and the bus 505 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions.
  • the instructions received by the working memory 535 may optionally be stored on a storage device 525 either before or after execution by the processor(s) 510.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A system may include a controller and user device comprising a processor, and a non-transitory computer readable medium having encoded thereon a set of instructions executable by the processor to generate a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of the controller and extend from the virtual representation of the controller in a first direction in the extended reality environment, generate a sight view camera at a first position along a trajectory of the controller ray, and display a sight view to a user in the extended reality environment.

Description

X-RAY SIGHT VIEW AND REMOTE ANCHOR FOR SELECTION TASKS IN XR ENVIRONMENT
COPYRIGHT STATEMENT
[0001] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
FIELD
[0002] The present disclosure relates, in general, to methods, systems, and apparatuses for interacting with an extended reality (XR) environment.
BACKGROUND
[0003] Virtual environments have become increasingly integrated into the modern everyday experience. In an XR environment, a virtual pointer is typically used to interact with menus or virtual objects. Operation is analogous to a laser pointer, in which a ray is projected forward, in a straight line, from a controller. The ray is projected forward until there is a collision with a virtual object. Virtual rays associated with laser pointers in an XR environment are typically plain rays controlled by devices, such as a 6 degree-of-freedom (dof) tracked controller (e.g., an Oculus Quest controller) or a 3-dof controller implemented using a smartphone.
[0004] Completing 3D selection tasks with plain rays under traditional laser pointer control is challenging. For example, occluded objects cannot be selected, and objects to which the user does not have line of sight are also difficult to select. Moreover, precise selection of objects placed closely together (e.g., high density object placement), or of objects far from the user, is also challenging.
[0005] Thus, methods, systems, and apparatuses for providing an x-ray sight view and remote anchor control for selection tasks in an XR environment are provided.
SUMMARY
[0006] Novel tools and techniques for providing an x-ray sight view and remote anchor control for selection tasks in an XR environment are provided.
[0007] A method may include generating, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment. The method may further include generating, via the computer system, a sight view camera at a first position along a trajectory of the controller ray, and displaying, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
[0008] An apparatus may include a processor, and a non-transitory computer readable medium in communication with the processor, the non-transitory computer readable medium having encoded thereon a set of instructions executable by the processor to perform various functions. The set of instructions may be executed by the processor to generate, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment. The set of instructions may further be executed by the processor to generate, via the computer system, a sight view camera at a first position along a trajectory of the controller ray, and display, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
[0009] A system may include a controller and a user device. The user device may further include a processor, and a non-transitory computer readable medium in communication with the processor, the non-transitory computer readable medium having encoded thereon a set of instructions executable by the processor to generate, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment. The set of instructions may further be executed by the processor to generate, via the computer system, a sight view camera at a first position along a trajectory of the controller ray, and display, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
[0010] These illustrative embodiments are mentioned not to limit or define the disclosure, but to provide examples to aid understanding thereof. Additional embodiments are discussed in the Detailed Description, and further description is provided therein.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] A further understanding of the nature and advantages of particular embodiments may be realized by reference to the remaining portions of the specification and the drawings, in which like reference numerals are used to refer to similar components. In some instances, a sub-label is associated with a reference numeral to denote one of multiple similar components. When reference is made to a reference numeral without specification to an existing sub-label, it is intended to refer to all such multiple similar components.
[0012] Fig. 1 is a schematic block diagram of a system for an x-ray sight view and remote anchor control for selection tasks in an XR environment, in accordance with various embodiments;
[0013] Fig. 2 is a schematic diagram of an x-ray sight view in an XR environment, in accordance with various embodiments;
[0014] Fig. 3 is a schematic diagram of an XR environment providing an x-ray sight view and remote anchor control, in accordance with various embodiments;
[0015] Fig. 4 is a flow diagram of a method for providing an x-ray sight view in an XR environment, in accordance with various embodiments;
[0016] Fig. 5 is a schematic block diagram of a computer system which may provide an x-ray sight view and remote anchor in an XR environment, in accordance with various embodiments.
DETAILED DESCRIPTION OF EMBODIMENTS
[0017] Various embodiments provide tools and techniques for an x-ray sight view and remote anchor for selection tasks in an XR environment.
[0018] In some examples, a method for an x-ray sight view and remote anchor in an XR environment is provided. A method may include generating, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment. The method may further include generating, via the computer system, a sight view camera at a first position along a trajectory of the controller ray, and displaying, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
[0019] In some examples, the method may further include generating, via the computer system, a remote anchor at a second position along the trajectory of the controller ray, wherein the remote anchor is a proxy virtual representation of the controller, wherein an orientation of the remote anchor is configured to be controlled by the controller, and generating, via the computer system, an anchor ray, wherein the anchor ray is configured to originate from the remote anchor, and extend from the remote anchor in a second direction that is based on the orientation of the controller.
[0020] In some examples, the method may further include determining, via the computer system, a collision point of the controller ray with a foreground object in the extended reality environment, wherein at least one of the first position or second position along the trajectory of the controller ray is the collision point. In further examples, the method may include selecting, via the controller, an object with the anchor ray. In some examples, the method may include adjusting, via the controller, a three dimensional position of the remote anchor in the extended reality environment based on an adjustment input. In further examples, the method may include adjusting, via the controller, a three dimensional position of the sight view camera in the extended reality environment based on an adjustment input.
[0021] In yet further examples, the sight view camera may be positioned such that a near clip plane of the sight view camera is at or beyond the collision point on a trajectory along the controller ray traveling in a direction from the controller towards the foreground object. In some examples, the method may further include determining, via the computer system, that an occluded object has been selected by a sight view ray, wherein the sight view ray continues as an extension of the controller ray, in the same direction as the controller ray, and from the collision point, wherein the sight view ray is displayed in the sight view, wherein the controller ray is blocked, in the extended reality environment, by a foreground object from reaching the occluded object.
[0022] In some embodiments, an apparatus for an x-ray sight view and remote anchor in an XR environment is provided. The apparatus may include a processor, and a non-transitory computer readable medium in communication with the processor, the non-transitory computer readable medium having encoded thereon a set of instructions executable by the processor to perform various functions. The set of instructions may be executed by the processor to generate, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment. The set of instructions may further be executed by the processor to generate, via the computer system, a sight view camera at a first position along a trajectory of the controller ray, and display, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
[0023] In some examples, the set of instructions may further be executed by the processor to generate, via the computer system, a remote anchor at a second position along the trajectory of the controller ray, wherein the remote anchor is a proxy virtual representation of the controller, wherein an orientation of the remote anchor is configured to be controlled by the controller, and generate, via the computer system, an anchor ray, wherein the anchor ray is configured to originate from the remote anchor, and extend from the remote anchor in a second direction that is based on the orientation of the controller.
[0024] In some examples, the set of instructions may be executed by the processor to determine, via the computer system, a collision point of the controller ray with a foreground object in the extended reality environment, wherein at least one of the first position or second position along the trajectory of the controller ray is the collision point. In some examples, the set of instructions may further be executed by the processor to select, via the controller, an object with the anchor ray. In some examples, the set of instructions may further be executed by the processor to adjust, via the controller, a three dimensional position of the remote anchor in the extended reality environment based on an adjustment input, and adjust, via the controller, a three dimensional position of the sight view camera in the extended reality environment based on an adjustment input.
[0025] In some examples, the sight view camera is positioned such that a near clip plane of the sight view camera is at or beyond the collision point on a trajectory along the controller ray traveling in a direction from the controller towards the foreground object. In yet further examples, the set of instructions may further be executable by the processor to determine, via the computer system, that an occluded object has been selected by a sight view ray, wherein the sight view ray continues as an extension of the controller ray, in the same direction as the controller ray, and from the collision point, wherein the sight view ray is displayed in the sight view, wherein the controller ray is blocked, in the extended reality environment, by a foreground object from reaching the occluded object.
[0026] In further embodiments, a system for an x-ray sight view and remote anchor in an XR environment is provided. The system may include a controller and a user device. The user device may further include a processor, and a non-transitory computer readable medium in communication with the processor, the non-transitory computer readable medium having encoded thereon a set of instructions executable by the processor to generate, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment. The set of instructions may further be executed by the processor to generate, via the computer system, a sight view camera at a first position along a trajectory of the controller ray, and display, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
[0027] In some examples, the set of instructions may further be executable by the processor to generate a remote anchor at a second position along the trajectory of the controller ray, wherein the remote anchor is a proxy virtual representation of the controller, wherein an orientation of the remote anchor is configured to be controlled by the controller, generate an anchor ray, wherein the anchor ray is configured to originate from the remote anchor, and extend from the remote anchor in a second direction that is based on the orientation of the controller, and select, via the controller, an object with the anchor ray.
[0028] In some examples, the set of instructions may be executed by the processor to determine a collision point of the controller ray with a foreground object in the extended reality environment, wherein at least one of the first position or second position along the trajectory of the controller ray is the collision point. In some examples, the sight view camera may be positioned such that a near clip plane of the sight view camera is at or beyond the collision point on a trajectory along the controller ray traveling in a direction from the controller towards the foreground object.
[0029] In yet further examples, the set of instructions may further be executable by the processor to determine, via the computer system, that an occluded object has been selected by a sight view ray, wherein the sight view ray continues as an extension of the controller ray, in the same direction as the controller ray, and from the collision point, wherein the sight view ray is displayed in the sight view, wherein the controller ray is blocked, in the extended reality environment, by a foreground object from reaching the occluded object.
[0030] In the following description, for the purposes of explanation, numerous details are set forth to provide a thorough understanding of the described embodiments. It will be apparent to one skilled in the art, however, that other embodiments may be practiced without some of these details. In other instances, structures and devices are shown in block diagram form. Several embodiments are described herein, and while various features are ascribed to different embodiments, it should be appreciated that the features described with respect to one embodiment may be incorporated with other embodiments as well. By the same token, however, no single feature or features of any described embodiment should be considered essential to every embodiment of the invention, as other embodiments of the invention may omit such features.
[0031] Unless otherwise indicated, all numbers used herein to express quantities, dimensions, and so forth should be understood as being modified in all instances by the term "about." In this application, the use of the singular includes the plural unless specifically stated otherwise, and use of the terms "and" and "or" means "and/or" unless otherwise indicated. Moreover, the use of the term "including," as well as other forms, such as "includes" and "included," should be considered non-exclusive. Also, terms such as "element" or "component" encompass both elements and components comprising one unit and elements and components that comprise more than one unit, unless specifically stated otherwise.
[0032] The various embodiments include, without limitation, methods, systems, apparatuses, and/or software products. Merely by way of example, a method might comprise one or more procedures, any or all of which may be executed by a computer system. Correspondingly, an embodiment might provide a computer system configured with instructions to perform one or more procedures in accordance with methods provided by various other embodiments. Similarly, a computer program might comprise a set of instructions that are executable by a computer system (and/or a processor therein) to perform such operations. In many cases, such software programs are encoded on physical, tangible, and/or non-transitory computer readable media (such as, to name but a few examples, optical media, magnetic media, and/or the like).
[0033] Various embodiments described herein, embodying software products and computer-performed methods, represent tangible, concrete improvements to existing technological areas, including, without limitation, XR platforms and environments. Specifically, implementations of various embodiments provide additional ways for user interaction with XR environments, and specifically for selection tasks within the XR environment. Conventional approaches to selection tasks in an XR environment utilize a virtual "laser pointer," referred to here as a "plain ray," for a user to indicate an object for selection. Like a laser beam, a straight ray may be projected from a controller towards a desired object for selection. Thus, an object with which the ray collides may be selected by a user. Using plain rays, occluded objects (e.g., objects behind other objects), which the plain ray has no path to reach, cannot be selected. Objects may, for example, be located behind walls, behind other objects, or within other objects. Furthermore, high density object placement in an XR environment makes selection between closely placed objects difficult. Similarly, objects that are far from the user and/or small objects become difficult to precisely select and interact with.
[0034] The x-ray sight view and remote anchor user functionality, set forth below, allows for a more robust selection task and object interaction solution. Specifically, objects which previously could not be selected, or are difficult to select, are more easily selectable.
[0035] To the extent any abstract concepts are present in the various embodiments, those concepts can be implemented as described herein by devices, software, systems, and methods that involve functionality (e.g., steps or operations), such as providing and using an x-ray sight view and remote anchor in an XR environment.
[0036] Fig. 1 is a schematic block diagram of a system 100 for providing an x-ray sight view and remote anchor in an XR environment. The system 100 includes a user device 105, XR application 110, x-ray sight view logic 115, remote anchor logic 120, and controller 125. It should be noted that the various components of the system 100 are schematically illustrated in Fig. 1, and that modifications to the various components and other arrangements of system 100 may be possible and in accordance with the various embodiments.
[0037] In various embodiments, the user device 105 may include the XR application 110, x-ray sight view logic 115, and remote anchor logic 120. The user device 105 may be coupled to the controller 125. Thus, in some examples, a user device 105 may allow a user to interact with the XR application 110 via the controller 125.
[0038] In various embodiments, the XR application 110 may include an application or program configured to generate a virtual environment. As used herein, extended reality or "XR" may be an umbrella term covering various types of simulated virtual environments, and combined physical and virtual environments. Accordingly, an XR environment may include virtual reality (VR), augmented reality (AR), and mixed reality (MR) environments. In various embodiments, the x-ray sight view logic 115 and remote anchor logic 120 may be implemented as hardware and/or software running on one or more computer systems. In some embodiments, as depicted, the x-ray sight view logic 115 and remote anchor logic 120 may be implemented as software that may be executed on the user device 105, and/or as part of the XR application 110.
[0039] In further embodiments, the x-ray sight view logic 115 and remote anchor logic 120 may be configured to run on one or more remote devices, such as an edge compute device of a cloud platform, to which the user device 105 may be communicatively coupled. Similarly, in some examples, the XR application 110 may also be configured to run, at least in part, on a remote computer system. Accordingly, the computer systems implementing the x-ray sight view logic 115 and remote anchor logic 120 may include one or more physical machines or one or more virtual machines (VMs).
[0040] As shown in Fig. 1, in various embodiments, each of the XR application 110, x-ray sight view logic 115, and remote anchor logic 120 may be implemented to run locally on a user device. In some examples, the one or more computer systems may be arranged in a distributed (or centralized) architecture, such as in a cloud platform, and thus run on one or more remote devices.
[0041] In various embodiments, the controller 125 may include various user input / output devices, including, without limitation, a 6-dof controller (e.g., Bluetooth, IR, RF, wired, etc.), 3-dof controller, smartphone, wearable XR devices (e.g., a VR/AR headset, glasses, etc.), motion trackers, position trackers, joystick, mouse, keyboard, etc.
Accordingly, in various embodiments, the controller 125 may accept a user's inputs for interaction with the XR environment, and specifically with the XR application 110, x-ray sight view logic 115, and remote anchor logic 120. In various examples, the controller 125 may be configured to provide information regarding its position and orientation within the XR environment and/or relative to the user within the XR environment. Specifically, in some embodiments, the controller 125 may act as an origination point of a virtual ray (e.g., a virtual laser pointer) in the XR environment. In further examples, the controller 125 may include additional virtual or physical interfaces, such as a virtual button on a touch screen, or a physical button. In some embodiments, the interfaces of the controller 125 may be configured to allow a user to toggle on/off an x-ray sight view and remote anchor of the x-ray sight view as described in greater detail below. In some examples, the controller 125 may include a touchscreen or physical touchpad, and be configured to enable gesture interactions in the XR environment.
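By way of editorial illustration only, and not as part of the original disclosure, the toggle behavior described above might be wired as in the following Python sketch; the event kinds and the session object are hypothetical names chosen for this example.

class Session:
    """Hypothetical per-user XR session state (illustrative only)."""
    def __init__(self):
        self.xray_enabled = False
        self.remote_anchor = None

def on_controller_event(event_kind, session):
    # Any of several input modalities may invoke or dismiss the x-ray sight
    # view: button presses, taps, swipes, input sequences, or voice commands.
    if event_kind in ("button_press", "tap", "swipe", "voice_command"):
        session.xray_enabled = not session.xray_enabled
        if not session.xray_enabled:
            # Assumption of this sketch: dismissing the sight view also
            # clears any remote anchor; the disclosure allows independent toggles.
            session.remote_anchor = None
    return session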
[0042] When interacting in an XR environment with a plain ray, three dimensional (3D) virtual objects may block other 3D virtual objects from being selected or otherwise interacted with. In some examples, a plain ray originating from the controller 125 (or laser pointer in the XR environment) may be blocked by an object in the foreground of, or otherwise covering, the object to be selected and/or interacted with by a user.
[0043] Various embodiments making use of the x-ray sight view are described below with reference to both of Figs. 1 & 2. Fig. 2 is a schematic diagram of an x-ray sight view in an XR environment 200, in accordance with various embodiments. The XR environment 200 may include a controller 205a, 205b, ray 210a, 210b, virtual cabinet 215a, 215b, collision point 220a, 220b, x-ray sight view 225, and virtual object 230. It should be noted that the various objects and tools of the XR environment 200 are schematically illustrated in Fig. 2, and that modifications to the various components, objects, and other arrangements of XR environment 200 may be possible and in accordance with the various embodiments.
[0044] In various embodiments, the XR environment 200 may be an environment generated by the XR application 110. In a first view, a ray 210a may be projected from the controller 205a towards the virtual cabinet 215a. The ray 210a may, thus, be configured to continue projecting forward until collision with an object is made, at collision point 220a.
[0045] In some examples, various virtual objects, including object 230, may be held inside the cabinet 215a. Thus, the cabinet 215a may occlude the object 230, preventing object 230 from being selected with the ray 210a. Instead, the user (via controller 205a) may only be able to select the cabinet 215a when the ray 210a is pointed at it.
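The "project forward until collision" behavior can be sketched as follows. This is a minimal editorial illustration, assuming scene objects are approximated by axis-aligned bounding boxes; the names Ray, intersect_aabb, and first_hit are inventions of this sketch, not of the disclosure.

import numpy as np

class Ray:
    def __init__(self, origin, direction):
        self.origin = np.asarray(origin, dtype=float)
        d = np.asarray(direction, dtype=float)
        self.direction = d / np.linalg.norm(d)   # unit length, so hit distances are in world units

def intersect_aabb(ray, box_min, box_max):
    """Slab-method ray vs. axis-aligned box test; returns hit distance or None."""
    with np.errstate(divide="ignore"):
        inv = 1.0 / ray.direction
    t1 = (np.asarray(box_min, float) - ray.origin) * inv
    t2 = (np.asarray(box_max, float) - ray.origin) * inv
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    if t_near <= t_far and t_far >= 0.0:
        return max(t_near, 0.0)
    return None

def first_hit(ray, objects):
    """Return (object, collision_point) for the nearest object struck, like
    ray 210a terminating at collision point 220a on the cabinet 215a."""
    best_t, best_obj = float("inf"), None
    for obj in objects:                       # each obj: {"min": ..., "max": ...}
        t = intersect_aabb(ray, obj["min"], obj["max"])
        if t is not None and t < best_t:
            best_t, best_obj = t, obj
    if best_obj is None:
        return None, None
    return best_obj, ray.origin + best_t * ray.direction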
[0046] In an example of a conventional approach, to select an object within the cabinet 215a using a conventional plain ray, a user would typically need to open the cabinet 215a and select the object 230 after opening it. This leads to inefficiencies and to poorer user experience perceptions. Various embodiments set forth a proposed solution, in which a user may provide an input on the controller 125 to invoke an x-ray sight view 225. In some examples, a user may use a swipe gesture on a touch screen or touchpad of the controller 125 to invoke the x-ray sight view 225. In further examples, the controller 125 may be a smartphone, and the user may utilize a swiping gesture on the touchscreen of the smartphone to invoke the x-ray sight view 225. Although a swiping gesture is described in the above examples as the input to invoke the x-ray sight view 225, it is to be understood that the invocation of the x-ray sight view 225 should not be limited to such gestures, and in other embodiments, other inputs may be used. For example, one or more of button presses, taps, swipes, sequences of inputs, or other types of inputs may be configured to invoke the x-ray sight view 225. In some further examples, a voice input, obtained via a microphone (e.g., of the controller and/or coupled to the user device 105), may be utilized to invoke the x-ray sight view 225.
[0047] Accordingly, in various embodiments, the x-ray sight view 225 may be configured to allow a user to peer into and/or otherwise see through objects in the foreground, as with the cabinet 215b. In some embodiments, the x-ray sight view 225 may be created to see through the immediate object with which ray 210a makes contact at collision point 220a. Thus, in various embodiments, when the x-ray sight view 225 is invoked, an additional virtual camera may be generated to render a close-up view. In various embodiments, the x-ray sight view logic 115 may be configured to generate the additional camera, referred to as the "sight view camera." The x-ray sight view logic 115 may further be configured to generate and display the close-up view to a user. For example, the close-up view may be rendered on top of all the objects of the XR environment 200 (such as the cabinet 215b, and ray 210b). The x-ray sight view 225 may further be displayed over a specific area of a user's view in the XR environment 200 (e.g., a top right corner, top left corner, center, off-center, etc.). The location at which the x-ray sight view 225 is rendered in the XR environment 200 may be set and/or adjusted by the user.
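A minimal sketch of spawning the additional camera, continuing the Python illustration above (the SightViewCamera type and the normalized viewport rectangle are assumptions of this sketch, not the disclosed implementation):

from dataclasses import dataclass

@dataclass
class SightViewCamera:
    position: tuple      # world-space position, e.g. at collision point 220a
    forward: tuple       # view direction, along the controller ray
    near_clip: float = 0.01
    viewport: tuple = (0.70, 0.70, 0.28, 0.28)   # normalized (x, y, w, h) overlay region

def make_sight_view(collision_point, ray_direction, region=(0.70, 0.70, 0.28, 0.28)):
    """Create the sight view camera at the ray's collision point, looking
    further along the ray; its image is composited on top of the main view
    in a user-adjustable screen region (top-right corner by default)."""
    return SightViewCamera(position=tuple(collision_point),
                           forward=tuple(ray_direction),
                           viewport=region)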
[0048] With the x-ray sight view 225 enabled, the ray 210b may continue, through cabinet 215b, to select the object 230, with the point of collision 220b of the ray 210b thus located on the object 230. In various embodiments, the x-ray sight view logic 115 may be configured to allow the ray 210b to continue through the cabinet 215b. In other embodiments, a new ray, a "sight view ray," may be generated to continue from the point of collision 220a, internally into the cabinet 215b. In some embodiments, the sight view ray may be an anchor ray, as described below with respect to Fig. 3. In yet further embodiments, the sight view ray may be projected from the sight view camera.
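The continuation of the ray past the occluder can be sketched with the first_hit helper defined earlier (the epsilon step past the collision point is an implementation detail assumed for this illustration):

def sight_view_hit(controller_ray, objects, eps=1e-4):
    """Re-cast from just beyond the controller ray's collision point, in the
    same direction, skipping the occluder, so that an object inside or
    behind it (like object 230 inside cabinet 215b) can be selected."""
    occluder, hit = first_hit(controller_ray, objects)
    if occluder is None:
        return None, None
    resumed = Ray(hit + eps * controller_ray.direction, controller_ray.direction)
    return first_hit(resumed, [o for o in objects if o is not occluder])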
[0049] In various embodiments, the x-ray sight view 225 may provide the user with see-through capability, with object(s) between the view of the x-ray sight view 225 and the user not being rendered in the x-ray sight view 225. In some examples, this may be similar to a near clip plane of a virtual camera. The sight view camera may be generated, in some examples, such that the near clip plane of the sight view camera is located at (or beyond) the point of collision 220a, such that the sight view camera can view through the occluding object (e.g., cabinet 215a, 215b). Because the front surface of the cabinet 215b is between the view of the sight view camera and the user, it is not rendered. In some examples, the user may control the depth of the view of the sight view camera along the trajectory of the ray 210b and/or sight view ray. For example, operations such as push forward or pull backward may be implemented as input gestures on the controller 125, such as, without limitation, a swipe up or down gesture on a touchscreen or touchpad.
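The near clip placement and the push/pull depth control might look as follows, continuing the same illustrative sketch (the margin parameter and the gesture gain are assumptions):

import numpy as np

def position_for_near_clip(collision_point, ray_direction, near_clip, margin=0.0):
    """Back the camera off along the ray so that its near clip plane lands at
    (margin = 0) or beyond (margin > 0) the collision point; geometry nearer
    than the near plane, such as the occluder's front surface, is culled,
    producing the see-through effect."""
    d = np.asarray(ray_direction, float)
    d = d / np.linalg.norm(d)
    return np.asarray(collision_point, float) - (near_clip - margin) * d

def push_pull(camera_position, ray_direction, swipe_delta, gain=0.5):
    """Map a swipe up/down on the controller's touch surface to pushing the
    sight view deeper along the ray (positive delta) or pulling it back."""
    d = np.asarray(ray_direction, float)
    d = d / np.linalg.norm(d)
    return np.asarray(camera_position, float) + gain * swipe_delta * d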
[0050] Thus, the sight view camera may be configured to look through the front surface of the cabinet 215a, 215b, and present the view to the user via the x-ray sight view 225. As previously described, in some examples, the view of the sight view camera may be presented as the x-ray sight view 225, over (e.g., on top of) the other objects of a user's view, or in a dedicated area of the user's view (e.g., top right, top left, center, off-center, left, right, bottom left, bottom right, etc.). In yet further examples, the location of the x-ray sight view 225 may be user configurable. Accordingly, utilizing the x-ray sight view 225, the user may be able to select the object 230 through the cabinet 215b.
[0051] In some examples, when an object is far away, small in size, or densely surrounded by other objects, precise selection of an object in the XR environment 200 may be difficult. The farther away an object is from the user, the more a small movement of the controller 125 translates into a large movement at the end of the ray.
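As a rough editorial estimate (not taken from the disclosure), the effect can be quantified: for a small angular perturbation $\Delta\theta$ of the controller, the endpoint of a ray of length $d$ sweeps approximately

\[ \delta \approx d \tan(\Delta\theta) \approx d\,\Delta\theta. \]

A one-degree wobble ($\Delta\theta \approx 0.0175$ rad) thus displaces the end of a 10 m controller ray by roughly 17.5 cm, while the same wobble moves the end of a 0.5 m ray, such as the anchor ray introduced below with reference to Fig. 3, by less than 1 cm.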
[0052] Accordingly, various embodiments making use of the x-ray sight view 225 and remote anchor are described below with reference to Figs. 1 & 3. Fig. 3 is a schematic diagram of an XR environment 300 providing an x-ray sight view and remote anchor control, in accordance with various embodiments. Fig. 3 depicts the XR environment 300 from a schematic top-down view. The XR environment 300 includes controller 305, remote anchor 310, a first virtual object 315a and second virtual object 315b, sight view 320, controller ray 325a, 325b, and anchor ray 330a, 330b. It should be noted that the various objects and other aspects of the XR environment 300 are schematically illustrated in Fig. 3, and that modifications to the various components, objects, and other arrangements of XR environment 300 may be possible and in accordance with the various embodiments.
[0053] In various embodiments, remote anchor logic 120 may be configured to generate a remote anchor 310 within the XR environment 300 of the XR application 110. Specifically, a remote anchor 310 may be placed within the XR environment 300, and act as a virtual proxy of the controller 305. Accordingly, the remote anchor 310 may be a remote virtual representation of the controller 305 (which is itself a virtual version of the controller 125). The remote anchor 310 may be configured to behave as if it were the controller 305, controlled by the physical controller 125, but operating from a different location (e.g., the location at which the remote anchor 310 is placed). In some examples, the remote anchor 310 may be placed at a location nearer to an object 315a, 315b.
[0054] In various embodiments, the remote anchor logic 120 may be configured to generate the remote anchor 310 and an anchor ray originating from the remote anchor 310. Thus, the remote anchor 310 enables a ray (e.g., the anchor ray) for selection of an object to originate from an arbitrary point in space, instead of originating at the controller 305. The anchor ray 330a may be controlled by rotation and movement of the original controller 125, 305. For example, a movement of the controller 305 to the left, resulting in a movement of the controller ray 325a to the position of controller ray 325b, results in the same movement of anchor ray 330a to the position of anchor ray 330b. In such a scenario, the length of the anchor ray 330a, 330b (e.g., distance from the remote anchor 310 to the object 315a, 315b of interest) is much shorter than the length of the controller ray 325a, 325b (e.g., distance from the controller 305 to the object 315a, 315b). Thus, finer control and more precise selection between objects 315a, 315b may be possible.
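A per-frame sketch of this mirroring, reusing the Ray type from the earlier illustration (the minus-Z forward convention and the rotation-matrix representation of orientation are assumptions of this sketch):

import numpy as np

# Assumed convention for this sketch: -Z is the controller's forward axis,
# and orientations are represented as 3x3 rotation matrices.
FORWARD = np.array([0.0, 0.0, -1.0])

def anchor_ray(anchor_position, controller_rotation):
    """The anchor ray originates at the remote anchor 310 but points wherever
    the physical controller points: the controller's rotation is mirrored
    1:1. Because the anchor sits much closer to the targets, the same
    angular motion sweeps a far smaller arc at the objects of interest."""
    direction = controller_rotation @ FORWARD
    return Ray(np.asarray(anchor_position, float), direction)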
[0055] In some embodiments, the user may further be presented with an x-ray sight view 320 that is closer to the objects 315a, 315b. For example, the x-ray sight view may be generated from the view of the remote anchor 310. Accordingly, in some embodiments, the sight view may be generated at the location of the remote anchor. Thus, in some embodiments the remote anchor 310 may also be a sight view camera. In yet further embodiments, the remote anchor 310 may generate a sight view camera from locations visible to the remote anchor 310. In other embodiments, the sight view camera may be at a different location from the remote anchor 310. For example, the sight view camera may, in some examples, be placed close to the remote anchor 310, or be situated in a co-planar manner to the remote anchor 310. In some embodiments, the sight view camera may be positioned to provide a sight view that has the same offset from the remote anchor as the user's location has from the controller 305 (e.g., the virtual representation of a physical controller). Thus, the user's perception of the movement of the remote anchor 310 may be similar to the user's perception of movement of the controller 305.
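One plausible reading of this same-offset placement, again as an editorial sketch (a full implementation might express the offset in the anchor's local frame rather than in world coordinates):

import numpy as np

def sight_camera_position(anchor_position, controller_position, head_position):
    """Offset the sight view camera from the remote anchor by the same vector
    that separates the user's head from the virtual controller, so that
    steering the anchor feels like steering the controller."""
    offset = np.asarray(head_position, float) - np.asarray(controller_position, float)
    return np.asarray(anchor_position, float) + offset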
[0056] In yet further embodiments, a user may enable the remote anchor 310 as an "avatar," or virtual proxy, of the controller 125 and/or virtual controller 305. As previously described, the remote anchor 310 may mimic the controller's movements and rotation. Thus, in some examples, an anchor ray may be generated, via the remote anchor logic 120, originating from the remote anchor 310, which may be controllable by the controller 125, 305.
[0057] In various embodiments, the sight view 320 may provide a magnified view of the object 315a, to which the view of the remote anchor 310 is directed. The sight view may, in some examples, be an x-ray sight view as previously described. For example, the remote anchor 310 and/or sight view camera may be placed such that a near clipping plane of the remote anchor 310 and/or the sight view camera is at or past an occluding object.
[0058] In some embodiments, the remote anchor 310 may be invoked, for example by providing user input via the controller 125. For example, user inputs to invoke the remote anchor 310 may include, without limitation, one or more of button presses, taps, swipes, sequences of inputs, voice inputs, or other types of inputs. In some embodiments, remote anchor logic 120 may be configured to generate the remote anchor 310 in response to receiving the user input invoking the remote anchor 310. In some embodiments, the remote anchor 310 may be generated at a point in space along the controller ray 325a, such as at a collision point 220a, or in free space. For instance, with a plain ray, a user may point at a location on the wall and tap to place the remote anchor 310 at the location to which the controller ray 325a points. In some embodiments, the location of the remote anchor 310 may further be user configurable. For example, the remote anchor 310 may be generated in free space, or at a location indicated by the controller ray 325. Once created, the 3D position of the remote anchor 310 may be manipulated via user inputs to the controller 125. In yet further embodiments, a sight view 225, 320 may be invoked, and the position of the remote anchor 310 may be determined automatically based on the position and orientation of the sight view 225, 320. For example, the remote anchor 310 may be generated as a sight view camera for the sight view 225, 320.
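The placement logic on invocation might be sketched as follows, reusing first_hit from the earlier illustration (the default free-space distance is an assumption of this sketch):

def place_remote_anchor(controller_ray, objects, free_space_distance=2.0):
    """On the invoking input (tap, button press, voice command, ...), drop the
    anchor at the controller ray's collision point when the ray strikes
    something, otherwise at a default distance along the ray in free space;
    the returned position may then be nudged by further adjustment inputs."""
    _, hit_point = first_hit(controller_ray, objects)
    if hit_point is not None:
        return hit_point
    return controller_ray.origin + free_space_distance * controller_ray.direction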
[0059] Fig. 4 is a flow diagram of a method for providing an x-ray sight view and remote anchor, in accordance with various embodiments. The method 400 begins, at block 405, by generating a controller ray. As described in the examples above, a controller ray may be a plain ray used by a user in the XR environment to complete selection and interaction tasks. The controller ray may be emitted from a virtual controller, which may be controllable via a physical controller. Thus, the virtual controller in the XR environment may be configured to mimic the orientation and motion of the physical controller. The controller ray may project in a straight direction, according to the alignment/position of the controller (e.g., the direction at which the controller is pointed). Thus, the controller may act as a virtual laser pointer, and the controller ray may be projected forward by the controller, analogous to a virtual laser beam.
[0060] The method 400 may continue, at block 410, by obtaining, from the controller, a user input invoking a sight view and/or remote anchor. In some examples, the user may invoke both a remote anchor and sight view. In other examples, the user may choose to invoke only a sight view, such as an x-ray sight view. As previously described, a user may invoke a remote anchor and/or sight view based on user inputs from a controller. User inputs may include one or more of button presses, taps, swipes, voice commands, sequences of inputs, or other types of inputs.
[0061] The method 400 may continue, at block 415, by determining a collision point of the controller ray with an object. As previously described, the controller ray may continue forward, from the virtual controller of the XR environment, until the controller ray collides with an object. The collision point may be a point on the object at which contact is made between the controller ray and the object. In response to determining that a sight view has been invoked, the method 400 may continue by generating a sight view camera. As previously described, in some examples, the sight view camera may be generated at the collision point of the controller ray. In some embodiments, the sight view camera may be generated such that the near clip plane of the sight view camera is located at (or beyond) the point of collision of the controller ray. Thus, the sight view camera may be configured to view through an occluding object blocking the controller ray. In some examples, the user may control the depth of the view of the sight view camera along the trajectory of the controller ray and/or remote anchor ray. For example, operations such as push forward or pull backward may be implemented as input gestures on the controller.
[0062] Thus, the sight view camera may be configured to look through the front surface of the object with which the controller ray collides, and to present the view of the sight view camera (placed such that a near clipping plane of the virtual camera is at and/or past the collision point). Thus, at block 420, the sight view is displayed to the user. As previously described, in some examples, the view of the sight view camera may be presented as an x-ray sight view. Alternatively, the sight view may be a point in space that is closer in distance to an object of interest. The sight view may be displayed over (e.g., on top of) the other objects in a user's view (e.g., the view is presented on top of other objects, including occluding objects), or in a dedicated area of the user's view (e.g., top right, top left, center, off-center, left, right, bottom left, bottom right, etc.). In yet further examples, the location of the x-ray sight view may be user configurable.
[0063] In response to a determination that a remote anchor was invoked by the user, the method may continue, at block 430, by generating the remote anchor. As previously described, a remote anchor may be generated at the collision point of the controller ray, or in free space. The positioning of the remote anchor in 3D space may further be user configurable/adjustable. For example, the position of the remote anchor may be moved forward and backward, up, down, left, and right. In some further embodiments, the orientation of the remote anchor may further be user adjusted. For example, the remote anchor may be flipped and/or rotated.
[0064] As previously described, in various embodiments, the remote anchor may be a virtual proxy of the controller, and configured to behave as if it were the controller. For example, the remote anchor may mimic the controller's movements and rotation. Thus, the remote anchor may be configured to be controlled by the physical controller, but at a different location from the physical controller. In yet further embodiments, the position of the remote anchor may be determined automatically based on the position and orientation of the sight view / sight view camera.
[0065] At block 435, the method 400 may continue by generating a remote anchor ray. As previously described, the remote anchor logic of a user device may be configured to generate an anchor ray in the XR environment, originating from the remote anchor. Thus, the anchor ray may be controlled, via the controller, through the use of the remote anchor. In some embodiments, the anchor ray may be rendered from the perspective of the sight view camera in the sight view. At block 440, the method 400 continues by identifying an object selected, by the user, with the anchor ray. As previously described, in some examples where the sight view camera and anchor ray are the same, the anchor ray may be a sight view ray originating from the sight view camera. In other embodiments, the sight view camera and remote anchor may be disparate components, and thus the sight view camera may render the anchor ray from the perspective of the sight view camera, with the anchor ray originating from the remote anchor. In yet further embodiments, the anchor ray may coincide with the controller ray (e.g., the remote anchor is aligned with the controller and controller ray). Alternatively, the remote anchor may be placed at the same location as the controller, or no remote anchor may be placed, in which case the anchor ray may be the same as the controller ray.
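Blocks 435 and 440 together reduce to a short selection routine in the running editorial sketch (reusing anchor_ray and first_hit defined above):

def select_with_anchor(anchor_position, controller_rotation, objects):
    """Cast the anchor ray from the remote anchor and return whichever object
    it strikes first, or None if it strikes nothing."""
    ray = anchor_ray(anchor_position, controller_rotation)
    obj, _ = first_hit(ray, objects)
    return obj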
[0066] The techniques and processes described above with respect to various embodiments may be performed by one or more computer systems. Fig. 5 is a schematic block diagram of a computer system for providing an x-ray sight view and remote anchor, in accordance with various embodiments. Fig. 5 provides a schematic illustration of one embodiment of a computer system 500, such as the system 100, user device 105, controller 125, x-ray sight view logic 115, remote anchor logic 120, or subsystems thereof, which may perform the methods provided by various other embodiments, as described herein. It should be noted that Fig. 5 only provides a generalized illustration of various components, of which one or more of each may be utilized as appropriate. Fig. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
[0067] The computer system 500 includes multiple hardware elements that may be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 510, including, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as microprocessors, digital signal processing chips, graphics acceleration processors, and microcontrollers); one or more input devices 515, which include, without limitation, a mouse, a keyboard, one or more sensors, and/or the like; and one or more output devices 520, which can include, without limitation, a display device, and/or the like.
[0068] The computer system 500 may further include (and/or be in communication with) one or more storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, solid-state storage device such as a random-access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including, without limitation, various file systems, database structures, and/or the like.
[0069] The computer system 500 might also include a communications subsystem 530, which may include, without limitation, a modem, a network card (wireless or wired), an IR communication device, a wireless communication device and/or chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a WWAN device, a Z-Wave device, a ZigBee device, cellular communication facilities, etc.), and/or a low-power wireless device. The communications subsystem 530 may permit data to be exchanged with a network (such as the network described below, to name one example), with other computer or hardware systems, between data centers or different cloud platforms, and/or with any other devices described herein. In many embodiments, the computer system 500 further comprises a working memory 535, which can include a RAM or ROM device, as described above.
[0070] The computer system 500 also may comprise software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
[0071] A set of these instructions and/or code might be encoded and/or stored on a non-transitory computer readable storage medium, such as the storage device(s) 525 described above. In some cases, the storage medium might be incorporated within a computer system, such as the system 500. In other embodiments, the storage medium might be separate from a computer system (i.e., a removable medium, such as a compact disc, etc.), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code.
[0072] It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware (such as programmable logic controllers, single board computers, FPGAs, ASICs, and SoCs) might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
[0073] As mentioned above, in one aspect, some embodiments may employ a computer or hardware system (such as the computer system 500) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 500 in response to processor 510 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 540 and/or other code, such as an application program 545) contained in the working memory 535. Such instructions may be read into the working memory 535 from another computer readable medium, such as one or more of the storage device(s) 525. Merely by way of example, execution of the sequences of instructions contained in the working memory 535 might cause the processor(s) 510 to perform one or more procedures of the methods described herein.
[0074] The terms "machine readable medium" and "computer readable medium," as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. In an embodiment implemented using the computer system 500, various computer readable media might be involved in providing instructions/code to processor(s) 510 for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals). In many implementations, a computer readable medium is a non-transitory, physical, and/or tangible storage medium. In some embodiments, a computer readable medium may take many forms, including, but not limited to, non-volatile media, volatile media, or the like. Non-volatile media includes, for example, optical and/or magnetic disks, such as the storage device(s) 525. Volatile media includes, without limitation, dynamic memory, such as the working memory 535. In some alternative embodiments, a computer readable medium may take the form of transmission media, which includes, without limitation, coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 505, as well as the various components of the communication subsystem 530 (and/or the media by which the communications subsystem 530 provides communication with other devices). In an alternative set of embodiments, transmission media can also take the form of waves (including, without limitation, radio, acoustic, and/or light waves, such as those generated during radio-wave and infra-red data communications).
[0075] Common forms of physical and/or tangible computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
[0076] Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 510 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 500. These signals, which might be in the form of electromagnetic signals, acoustic signals, optical signals, and/or the like, are all examples of carrier waves on which instructions can be encoded, in accordance with various embodiments of the invention.
[0077] The communications subsystem 530 (and/or components thereof) generally receives the signals, and the bus 505 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions. The instructions received by the working memory 535 may optionally be stored on a storage device 525 either before or after execution by the processor(s) 510.
[0078] While some features and aspects have been described with respect to the embodiments, one skilled in the art will recognize that numerous modifications are possible. For example, the methods and processes described herein may be implemented using hardware components, software components, and/or any combination thereof. Further, while various methods and processes described herein may be described with respect to particular structural and/or functional components for ease of description, methods provided by various embodiments are not limited to any particular structural and/or functional architecture but instead can be implemented on any suitable hardware, firmware and/or software configuration. Similarly, while some functionality is ascribed to one or more system components, unless the context dictates otherwise, this functionality can be distributed among various other system components in accordance with the several embodiments.
[0079] Moreover, while the procedures of the methods and processes described herein are described in a particular order for ease of description, unless the context dictates otherwise, various procedures may be reordered, added, and/or omitted in accordance with various embodiments. Moreover, the procedures described with respect to one method or process may be incorporated within other described methods or processes; likewise, system components described according to a particular structural architecture and/or with respect to one system may be organized in alternative structural architectures and/or incorporated within other described systems. Hence, while various embodiments are described with or without some features for ease of description and to illustrate aspects of those embodiments, the various components and/or features described herein with respect to a particular embodiment can be substituted, added and/or subtracted from among other described embodiments, unless the context dictates otherwise. Consequently, although several embodiments are described above, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A method comprising: generating, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment; generating, via the computer system, a sight view camera at a first position along a trajectory of the controller ray; and displaying, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
2. The method of claim 1, further comprising: generating, via the computer system, a remote anchor at a second position along the trajectory of the controller ray, wherein the remote anchor is a proxy virtual representation of the controller, wherein an orientation of the remote anchor is configured to be controlled by the controller; and generating, via the computer system, an anchor ray, wherein the anchor ray is configured to originate from the remote anchor, and extend from the remote anchor in a second direction that is based on the orientation of the controller.
3. The method of claim 2, further comprising: determining, via the computer system, a collision point of the controller ray with a foreground object in the extended reality environment; wherein at least one of the first position or second position along the trajectory of the controller ray is the collision point.
4. The method of claim 2, further comprising: selecting, via the controller, an object with the anchor ray.
5. The method of claim 2, further comprising: adjusting, via the controller, a three dimensional position of the remote anchor in the extended reality environment based on an adjustment input.
6. The method of claim 1, further comprising: adjusting, via the controller, a three dimensional position of the sight view camera in the extended reality environment based on an adjustment input.
7. The method of claim 1, wherein the sight view camera is positioned such that a near clip plane of the sight view camera is at or beyond a collision point on a trajectory along the controller ray traveling in a direction from the controller towards a foreground object, wherein the controller ray makes contact with the foreground object at the collision point.
8. The method of claim 1, further comprising: determining, via the computer system, that an occluded object has been selected by a sight view ray, wherein the sight view ray continues as an extension of the controller ray, in the same direction as the controller ray, and from the collision point; wherein the sight view ray is displayed in the sight view, wherein the controller ray is blocked, in the extended reality environment, by a foreground object from reaching the occluded object.
9. An apparatus, comprising: a processor; and a non-transitory computer readable medium in communication with the processor, the non-transitory computer readable medium having encoded thereon a set of instructions executable by the processor to: generate, via a computer system, a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of a controller and extend from the virtual representation of the controller in a first direction in the extended reality environment; generate, via the computer system, a sight view camera at a first position along a trajectory of the controller ray; and display, via the computer system, a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
10. The apparatus of claim 9, wherein the set of instructions is further executable by the processor to: generate, via the computer system, a remote anchor at a second position along the trajectory of the controller ray, wherein the remote anchor is a proxy virtual representation of the controller, wherein an orientation of the remote anchor is configured to be controlled by the controller; and generate, via the computer system, an anchor ray, wherein the anchor ray is configured to originate from the remote anchor, and extend from the remote anchor in a second direction that is based on the orientation of the controller.
11. The apparatus of claim 10, wherein the set of instructions is further executable by the processor to: determine, via the computer system, a collision point of the controller ray with a foreground object in the extended reality environment; wherein at least one of the first position or second position along the trajectory of the controller ray is the collision point.
12. The apparatus of claim 10, wherein the set of instructions is further executable by the processor to: select, via the controller, an object with the anchor ray.
13. The apparatus of claim 10, wherein the set of instructions is further executable by the processor to: adjust, via the controller, a three dimensional position of the remote anchor in the extended reality environment based on an adjustment input; and adjust, via the controller, a three dimensional position of the sight view camera in the extended reality environment based on an adjustment input.
14. The apparatus of claim 9, wherein the sight view camera is positioned such that a near clip plane of the sight view camera is at or beyond a collision point on a trajectory along the controller ray traveling in a direction from the controller towards a foreground object, wherein the controller ray makes contact with the foreground object at the collision point.
15. The apparatus of claim 9, wherein the set of instructions is further executable by the processor to: determine, via the computer system, that an occluded object has been selected by a sight view ray, wherein the sight view ray continues as an extension of the controller ray, in the same direction as the controller ray, and from the collision point; wherein the sight view ray is displayed in the sight view, wherein the controller ray is blocked, in the extended reality environment, by a foreground object from reaching the occluded object.
16. A system for providing a sight view and remote anchor, the system comprising: a controller; a user device coupled to the controller, the user device comprising: a processor; and a non-transitory computer readable medium in communication with the processor, the non-transitory computer readable medium having encoded thereon a set of instructions executable by the processor to: generate a controller ray in an extended reality environment, wherein the controller ray is configured to originate from a virtual representation of the controller and extend from the virtual representation of the controller in a first direction in the extended reality environment; generate a sight view camera at a first position along a trajectory of the controller ray; and display a sight view to a user in the extended reality environment, the sight view displaying a view of the sight view camera at the first position along the trajectory of the controller ray.
17. The system of claim 16, wherein the set of instructions is further executable by the processor to: generate a remote anchor at a second position along the trajectory of the controller ray, wherein the remote anchor is a proxy virtual representation of the controller, wherein an orientation of the remote anchor is configured to be controlled by the controller; generate an anchor ray, wherein the anchor ray is configured to originate from the remote anchor, and extend from the remote anchor in a second direction that is based on the orientation of the controller; and select, via the controller, an object with the anchor ray.
18. The system of claim 16, wherein the set of instructions is further executable by the processor to: determine a collision point of the controller ray with a foreground object in the extended reality environment; wherein at least one of the first position or second position along the trajectory of the controller ray is the collision point.
19. The system of claim 16, wherein the sight view camera is positioned such that a near clip plane of the sight view camera is at or beyond a collision point on a trajectory along the controller ray traveling in a direction from the controller towards a foreground object, wherein the controller ray makes contact with the foreground object at the collision point.
20. The system of claim 16, wherein the set of instructions is further executable by the processor to: determine, via the computer system, that an occluded object has been selected by a sight view ray, wherein the sight view ray continues as an extension of the controller ray, in the same direction as the controller ray, and from the collision point; wherein the sight view ray is displayed in the sight view, wherein the controller ray is blocked, in the extended reality environment, by a foreground object from reaching the occluded object.
PCT/US2022/016463 2021-08-18 2022-02-15 X-ray sight view and remote anchor for selection tasks in xr environment WO2022174192A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280046627.3A CN117597652A (en) 2021-08-18 2022-02-15 X-ray view and remote anchor for selection tasks in an XR environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163234341P 2021-08-18 2021-08-18
US63/234,341 2021-08-18

Publications (1)

Publication Number Publication Date
WO2022174192A1 true WO2022174192A1 (en) 2022-08-18

Family

ID=82837259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/016463 WO2022174192A1 (en) 2021-08-18 2022-02-15 X-ray sight view and remote anchor for selection tasks in xr environment

Country Status (2)

Country Link
CN (1) CN117597652A (en)
WO (1) WO2022174192A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130038707A1 (en) * 2011-08-09 2013-02-14 Tyco Healthcare Group Lp Apparatus and Method for Using Augmented Reality Vision System in Surgical Procedures
US9123155B2 (en) * 2011-08-09 2015-09-01 Covidien Lp Apparatus and method for using augmented reality vision system in surgical procedures
CN110647238A (en) * 2018-06-08 2020-01-03 脸谱科技有限责任公司 Artificial reality interaction plane

Also Published As

Publication number Publication date
CN117597652A (en) 2024-02-23

Similar Documents

Publication Publication Date Title
US10657716B2 (en) Collaborative augmented reality system
US20200371665A1 (en) Collaborative augmented reality system
US10296186B2 (en) Displaying a user control for a targeted graphical object
EP3811183B1 (en) Methods and apparatuses for providing input for head-worn image display devices
WO2016109409A1 (en) Virtual lasers for interacting with augmented reality environments
JP5986261B1 (en) System and method for providing an efficient interface for screen control
US10466960B2 (en) Augmented reality audio mixing
Medeiros et al. A tablet-based 3d interaction tool for virtual engineering environments
US10754524B2 (en) Resizing of images with respect to a single point of convergence or divergence during zooming operations in a user interface
US11301124B2 (en) User interface modification using preview panel
CN112965773A (en) Method, apparatus, device and storage medium for information display
WO2022174192A1 (en) X-ray sight view and remote anchor for selection tasks in xr environment
US9927892B2 (en) Multiple touch selection control
CN113457144B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
KR20180058097A (en) Electronic device for displaying image and method for controlling thereof
JP2021005873A (en) Method and system for positioning and controlling sound image in three-dimensional space
CN108499102B (en) Information interface display method and device, storage medium and electronic equipment
US11796959B2 (en) Augmented image viewing with three dimensional objects
KR20150094483A (en) Electro device configured to display three dimensional virtual space and method for controlling thereof
KR102392675B1 (en) Interfacing method for 3d sketch and apparatus thereof
CN116755563B (en) Interactive control method and device for head-mounted display equipment
US20240087255A1 (en) Information processing apparatus, system, control method, and non-transitory computer storage medium
JP2024018907A (en) XR multi-window control
CN115485734A (en) Interface method and device for drawing three-dimensional sketch
JP2019124994A (en) Image measuring instrument and program

Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 22753527
Country of ref document: EP
Kind code of ref document: A1

WWE Wipo information: entry into national phase
Ref document number: 202280046627.3
Country of ref document: CN

NENP Non-entry into the national phase
Ref country code: DE