GB2574899A - Mixed reality handsfree motion

Mixed reality handsfree motion

Info

Publication number
GB2574899A
Authority
GB
United Kingdom
Prior art keywords
user
head
mixed reality
sub
earth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB201810349A
Other versions
GB201810349D0 (en)
Inventor
Hussenot-Desenonges Alexandra
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to GB201810349A
Publication of GB201810349D0
Publication of GB2574899A

Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06T19/006 Mixed reality
    • G02B27/017 Head-up displays, head mounted
    • G02B27/0093 Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Mixed reality software that enables handsfree object motion, such as rotation or scrolling, and dive-in (or zoom-in). The software includes a motion controller 2, a master object 1 and at least one sub object 3 for the immersion to be possible. When the user's gaze is on the edge of the controller, the image moves or rotates. When the user focuses on a data pin 4, the image of that object is enlarged. The system may be used with 3D images such as models of the Earth. The user must also be equipped with a hardware device, such as a head mount combined with a smartphone, or a head-mounted display, that enables the environment to be controlled by focusing the head.

Description

Handsfree Earth diving in Mixed Reality
This invention relates to the ability for a user to travel the earth, totally handsfree, in Mixed Reality software and be offered relevant information about the selected destination.
Mixed Reality (XR) software enables users to overlay digital information onto reality (Augmented Reality, or AR) or to immerse themselves in fully man-made digital environments (Virtual Reality, or VR). VR environments may themselves incorporate a number of real or virtual assets such as 360° videos, games and data. The aim is to revolutionize the way we think about and create digital user experiences. Today, AR experiences are mainly brought to consumers by using a smartphone to augment the real environment. In Virtual Reality, dedicated VR head-mounted displays, or headmounts combined with a smartphone, are necessary to enter the experience. Standalone Mixed Reality headsets are also emerging. Most high-quality interactive experiences require an expensive VR head-mounted display and VR controllers or gamepads. We believe this is one of the reasons why the VR market is taking time to take off. By offering end users the ability to use a simple headmount, without a controller, for a high-quality interactive experience, we can make VR more affordable, easier to use and mobile. This is why we have created an experience in which users immerse themselves in an interactive VR or XR world to travel the earth and dive into geolocated information using only their head as the controller.
The main object of the experience is a 3D earth object. The core of the invention is the ability to manage, on the one hand, the motion and, on the other hand, the zooming-in of the 3D earth object using the head alone as the controller. In the specific application we developed the earth is the object, but managing the motion and zooming of any object handsfree is a user-interface function that could have numerous applications in XR in the near future, as it allows for multitasking and mobility.
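The filing does not specify any implementation, but head-as-pointer control of this kind is commonly realised by casting a ray from the centre of the head-mounted camera each frame and testing which functional object it hits. The sketch below is a minimal illustration in TypeScript with Three.js; the library choice and every name in it are assumptions of this description, not part of the patent.

    import * as THREE from 'three';

    // Cast a ray from the centre of the user's view (the head pose) and
    // return the first functional object it hits: the motion controller
    // ring, a data pin, or nothing.
    const raycaster = new THREE.Raycaster();
    const screenCentre = new THREE.Vector2(0, 0); // NDC centre = where the head points

    function focusedObject(
      camera: THREE.Camera,
      targets: THREE.Object3D[],
    ): THREE.Object3D | null {
      raycaster.setFromCamera(screenCentre, camera);
      const hits = raycaster.intersectObjects(targets, true);
      return hits.length > 0 ? hits[0].object : null;
    }

Calling focusedObject every frame, with the controller ring, the data pins and the master object as targets, yields the focus state that drives the behaviours described below.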
The invention will now be described, solely by way of example, with reference to the accompanying drawings, in which:
Figure 1 shows the initial field of view and the position of the different objects, with no particular focus point from the user
Figure 2 shows the point on which the user is focusing with his head
Figure 3 shows the effect of figure 2
Figure 4 shows the point on which the user is focusing with his head
Figure 5 shows the effect of figure 4
In figure 1 we see the master object (1) surrounded by a handsfree motion controller (2). Inside the master object (1) there is a sub object (3), which is a first-level dependant of the master object. Within sub object (3) there is a data pin (4), which indicates the availability of data or content related to that part of the master object. We see the user's point of focus (5), which is not pointing at any specific functional area (the hard-lined black areas of the drawing). Then (6) represents an example of an imaginary axis; there is an infinite number of such axes passing through the centre of the master object. The total field of view (FoV) is outlined by the dotted area (7).
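As a concrete illustration of this hierarchy, the objects of figure 1 could be parented so that the sub object and its pin automatically follow the master's motion. This is again a hypothetical Three.js sketch; all geometries, sizes and colours are assumed.

    import * as THREE from 'three';

    const master = new THREE.Mesh(                  // master object (1): the 3D earth
      new THREE.SphereGeometry(1, 64, 64),
      new THREE.MeshStandardMaterial({ color: 0x2266cc }),
    );
    const controller = new THREE.Mesh(              // handsfree motion controller (2)
      new THREE.TorusGeometry(1.4, 0.05, 16, 100),
      new THREE.MeshBasicMaterial({ color: 0x888888 }),
    );
    const subObject = new THREE.Mesh(               // sub object (3), e.g. a region
      new THREE.SphereGeometry(0.1, 32, 32),
      new THREE.MeshStandardMaterial({ color: 0x22cc66 }),
    );
    const dataPin = new THREE.Mesh(                 // data pin (4) on the sub object
      new THREE.ConeGeometry(0.03, 0.08, 8),
      new THREE.MeshBasicMaterial({ color: 0xcc2222 }),
    );
    subObject.position.set(0.6, 0.6, 0.5).normalize(); // place on the unit sphere's surface
    master.add(subObject);   // sub object follows the master's rotation
    subObject.add(dataPin);  // the pin follows its sub object

    const scene = new THREE.Scene();
    scene.add(master, controller);

Parenting the pin to the sub object, and the sub object to the master, means a single rotation of the master carries both along, which is exactly the behaviour figures 2 and 3 describe.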
In figure 2, when the user focuses on point (5), on an area of the handsfree motion controller (2), the master object rotates backwards, in the direction of the focus point, about the imaginary axis (6). The result of this action is represented in figure 3.
In figure 3, the sub object (3) and pin (4) have moved from their initial position in figure 2 to a new position. If the user were to keep focusing on that same area of the handsfree motion controller, sub object (3) and data pin (4) would disappear around the back of the visible area of the master object. The software displays to the user information that matches what is in his field of view.
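One way to realise this rotation is to spin the master about an axis perpendicular to both the view direction and the offset of the focus point from the master's centre, so the surface rolls towards or away from the gaze. A hedged sketch; the angular speed and the sign convention are assumptions.

    import * as THREE from 'three';

    const ROTATION_SPEED = 0.6; // radians per second, an assumed tuning value

    // Called each frame while the head ray hits the controller ring at hitPoint.
    function rotateTowardsFocus(
      master: THREE.Object3D,
      hitPoint: THREE.Vector3,
      camera: THREE.Camera,
      dt: number, // frame time in seconds
    ): void {
      const viewDir = camera.getWorldDirection(new THREE.Vector3());
      const offset = hitPoint.clone().sub(master.getWorldPosition(new THREE.Vector3()));
      // Imaginary axis (6): perpendicular to the gaze and to the focus offset.
      const axis = new THREE.Vector3().crossVectors(viewDir, offset).normalize();
      master.rotateOnWorldAxis(axis, ROTATION_SPEED * dt); // flip the sign to reverse the roll
    }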
In figure 4, when the user focuses (5) on the data pin (4), the software zooms into the selected area and the sub object (3) becomes the main object in the field of view (7), as per figure 5.
Sub object (3) in figure 5 has the same functionalities as master object (1): rotation and display of information.
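The dive-in of figures 4 and 5 can then be sketched as promoting the focused sub object to fill the field of view once the gaze has dwelt on its pin long enough. The dwell threshold, the camera easing and the stand-off distance are all assumptions of this illustration.

    import * as THREE from 'three';

    const DWELL_SECONDS = 1.5; // assumed time the gaze must rest on the pin
    let dwell = 0;

    function updateDiveIn(
      camera: THREE.PerspectiveCamera,
      focused: THREE.Object3D | null, // result of focusedObject() this frame
      pin: THREE.Object3D,            // data pin (4)
      subObject: THREE.Object3D,      // sub object (3) to promote
      dt: number,
    ): void {
      dwell = focused === pin ? dwell + dt : 0;
      if (dwell < DWELL_SECONDS) return;

      // Ease the camera towards a point just short of the sub object, so
      // that it dominates the field of view (7) and becomes the new master.
      const target = subObject.getWorldPosition(new THREE.Vector3());
      const viewDir = camera.getWorldDirection(new THREE.Vector3());
      const goal = target.clone().addScaledVector(viewDir, -0.3); // assumed stand-off
      camera.position.lerp(goal, Math.min(1, 3 * dt)); // exponential ease-in
      camera.lookAt(target);
    }

Once the sub object fills the view it is given its own motion controller and pins, so rotation and further dives work recursively, as the description notes for figure 5.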

Claims (3)

Claims
1. The ability for a mixed reality software user to move or rotate a master object or its sub objects by directing his head focus at the part of the motion controller surrounding the object that corresponds to a specific direction. For example, the user could rotate an earth object to the left by turning his head in that direction, so as to select the motion controller in that part of the environment and thereby indicate the direction of rotation. By proceeding in this way the user can rotate the object in any preferred direction, totally handsfree, and be offered relevant information concerning the selected destination in return.
2. The ability for a mixed reality software user to zoom into an object, totally handsfree, by focusing one's head on a sub object. For example, the user could dive from an earth object to a specific region sub object, and further on to a specific country sub object (and so on), and be offered relevant information concerning the selected destination in return.
3. The ability to perform any of the actions according to any of the preceding claims with the use of a high-resolution 3D earth object at the heart of the mixed reality software experience.

Priority Applications (1)

Application Number  Priority Date  Filing Date  Title
GB201810349A        2018-06-24     2018-06-24   Mixed reality handsfree motion (GB2574899A)

Applications Claiming Priority (1)

Application Number  Priority Date  Filing Date  Title
GB201810349A        2018-06-24     2018-06-24   Mixed reality handsfree motion (GB2574899A)

Publications (2)

Publication Number Publication Date
GB201810349D0 (en)  2018-08-08
GB2574899A (en)     2019-12-25

Family

ID=63042802

Family Applications (1)

Application Number  Title                           Priority Date  Filing Date
GB201810349A        Mixed reality handsfree motion  2018-06-24     2018-06-24   (GB2574899A, withdrawn)

Country Status (1)

Country Link
GB (1) GB2574899A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150138244A1 (en) * 2013-11-18 2015-05-21 Tobii Technology Ab Component determination and gaze provoked interaction
US20180150132A1 (en) * 2016-11-25 2018-05-31 Samsung Electronics Co., Ltd. Method and device for providing an image
CN107168516A (en) * 2017-03-31 2017-09-15 浙江工业大学 Global climate vector field data method for visualizing based on VR and gesture interaction technology

Also Published As

Publication number Publication date
GB201810349D0 (en) 2018-08-08

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)