CN117234340A - Method and device for displaying user interface of head-mounted XR device
- Publication number: CN117234340A
- Application number: CN202311514304.5A
- Authority: CN (China)
- Legal status: Pending (an assumption, not a legal conclusion)
Abstract
The application discloses a method and device for displaying the user interface of a head-mounted XR device, and relates to the technical field of head-mounted XR devices. When a user wears the head-mounted XR device to navigate, the device displays a corresponding user interface for each orientation of the user's face. By simply turning the head, the user can view navigation guidance, map and route details, travel information, and other information in the virtual space provided by the head-mounted XR device, without looking down at a phone screen and without the display interfering with the view of the real scene, achieving a convenient and safe navigation display and viewing experience.
Description
Technical Field
The application relates to the technical field of head-mounted XR devices, and in particular to a method and device for displaying the user interface of a head-mounted XR device.
Background
Extended reality (XR) is a general term for technologies that combine a computer-generated virtual world with the real world through a wearable device, so that the user can perceive information that does not exist in the real world. XR includes augmented reality (AR), virtual reality (VR), mixed reality (MR), and the like.
A headset that employs XR technology is referred to as a head-mounted XR device. As XR technology matures, head-mounted XR devices are gradually entering everyday life and bringing convenience to many aspects of it. Walking and cycling navigation with a head-mounted XR device is an important scenario in daily life: the user obtains direction or route guidance from the navigation interface displayed by the device, without looking down at a phone screen and without the display blocking the view of the real scene. In such navigation scenarios, how to provide a satisfactory user experience through reasonable display and interaction is a problem to be solved.
Disclosure of Invention
The embodiments of the present application provide a method and device for displaying the user interface of a head-mounted XR device, which can provide a more convenient, comfortable, and efficient navigation experience for the user.
To achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, a method for displaying a user interface of a head-mounted XR device is provided. The method comprises: when the face of a user wearing the head-mounted XR device (a first user) faces a first direction, displaying a first interface in the virtual space of the head-mounted XR device, the first interface including navigation information that, combined with the route in the real scene, guides the first user during travel; when the face of the first user faces a second direction, displaying in the virtual space a second interface that includes a navigation map and detailed information, or first information about the first user's travel (for example, at least one of a clock, travel duration, remaining duration, speed, motion state, and calorie consumption).
In this method, when a user wears the head-mounted XR device to navigate, the device displays a corresponding user interface for each orientation of the user's face. By simply turning the head, the user can view navigation guidance, map and route details, travel information, and other information in the virtual space provided by the head-mounted XR device, without looking down at a phone screen and without the display interfering with the view of the real scene, achieving a convenient and safe navigation display and viewing experience.
With reference to the first aspect, in some embodiments, the method further comprises: when the face of the first user faces a third direction, displaying a third interface in the virtual space; the third interface includes prompt information for reminding the first user of travel safety.
In this method, when the user turns the head toward the third direction (for example, turns the head to the left or right by a large angle), a safety prompt is displayed, reminding the user to stay safe during travel.
With reference to the first aspect, in some embodiments, the face of the first user facing the first direction includes: the first user looking straight ahead.
In this way, the user can travel according to the navigation guidance simply by looking forward normally.
With reference to the first aspect, in some embodiments, the face of the first user facing the second direction includes: the face of the first user is looking up toward the front.
In some embodiments, the virtual space displays first information during travel of the first user when the first user's face is looking upward in a forward direction. In this way, the user can conveniently look up the information such as the clock, the travelling time, the residual time, the speed per hour, the movement state, the calorie consumption and the like by looking up the head.
With reference to the first aspect, in some embodiments, the face of the first user facing the second direction includes: the face of the first user is looking down forward.
In some embodiments, the virtual space displays the navigation map and the detail information when the face of the first user is looking forward at the plane. In this way, the user can look down and look down to conveniently view the navigation map and details.
With reference to the first aspect, in some embodiments, the face of the first user facing the third direction includes: the face of the first user is directed to the left, or the face of the first user is directed to the right.
In some embodiments, when the head-mounted XR device determines that the user's face is oriented to the left and the angle between the face orientation and the direction of travel is less than or equal to a first preset threshold, or that the user's face is oriented to the right and the angle between the face orientation and the direction of travel is less than or equal to a second preset threshold, the head-mounted XR device still displays the first interface.
In some embodiments, the virtual space displays the third interface when the angle between the first user's face orientation and the first user's direction of travel exceeds the preset threshold (that is, the face is oriented to the left and the angle is greater than the first preset threshold, or the face is oriented to the right and the angle is greater than the second preset threshold).
In this way, safety warnings triggered by small-angle, small-range head turns are avoided.
With reference to the first aspect, in some embodiments, the portion of the first interface other than the navigation information is displayed transparently. In this way, the navigation interface does not block the real scene and does not interfere with the user's travel.
With reference to the first aspect, in some embodiments, the navigation information includes at least one of a travel direction and a travel distance.
With reference to the first aspect, in some embodiments, the navigation information is displayed at a position in the virtual space corresponding to the center of the user's travel route.
In this method, even if the user's head moves within a small range and at a small angle, the display position of the navigation interface in the virtual space moves as the head rotates, and the navigation information is always displayed at the position (upper, lower, or center) of the display interface corresponding to the center of the user's travel route, so that normal use of the navigation function is not affected.
With reference to the first aspect, in some embodiments, the head-mounted XR device comprises a sensor comprising a gyroscope sensor, an acceleration sensor, or a geomagnetic sensor, the method further comprising: the head-mounted XR device obtains a facial orientation of the first user via the sensor.
In some implementations, in response to detecting that the face of the first user is facing the second direction, the head-mounted XR device generates a second interface and displays the second interface in the virtual space.
That is, the head-mounted XR device has data processing capabilities and can generate and display a user interface based on the detected orientation of the user's face.
In some implementations, in response to detecting that the face of the first user faces the second direction, the head-mounted XR device sends, to an electronic device, indication information indicating that the face of the first user faces the second direction; the head-mounted XR device receives image data of the second interface from the electronic device, the image data being generated according to the indication information; and the head-mounted XR device displays the second interface in the virtual space based on the image data.
That is, upon detecting the orientation of the user's face, the head-mounted XR device has a connected electronic device generate the user interface displayed in the virtual space; the head-mounted XR device then serves as a display for the electronic device.
In a second aspect, there is provided a head-mounted XR device having the functionality to implement the method of the first aspect described above. The functions may be realized by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In a third aspect, there is provided a head mounted XR device comprising: a processor, a memory, and a sensor; the sensor is for detecting a facial orientation of a user wearing the head mounted XR device, the memory is for storing computer executable instructions, the processor executing the computer executable instructions stored by the memory when the head mounted XR device is in operation, to cause the head mounted XR device to perform the method as described in any one of the first aspects above.
In a fourth aspect, there is provided a computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the method of any of the first aspects above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of any of the first aspects above.
In a sixth aspect, there is provided an apparatus (for example, a system-on-a-chip) comprising a processor for supporting an electronic device in implementing the functions referred to in the first aspect above. In one possible design, the apparatus further includes a memory for holding the program instructions and data necessary for the head-mounted XR device. When the apparatus is a chip system, it may consist of a chip, or may include the chip and other discrete devices.
For the technical effects of any design of the second to sixth aspects, reference may be made to the technical effects of the corresponding designs of the first aspect, which are not repeated here.
Drawings
FIG. 1 is a schematic diagram of a scenario in which a user interface display method of a head-mounted XR device according to an embodiment of the present application is applicable;
FIG. 2 is a schematic diagram of an example of a scenario of a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating a user's face orientation in a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating a user's face orientation in a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating a user's face orientation in a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a method for displaying a user interface of a head-mounted XR device according to an embodiment of the application;
FIG. 7 is a schematic diagram of an example of a scenario of a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an example of a scenario of a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an example of a scenario of a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an example of a scenario of a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an example of a scenario of a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an example of a scenario of a method for displaying a user interface of a head-mounted XR device according to an embodiment of the application;
FIG. 13 is a schematic diagram of a hardware structure of a head-mounted XR device according to an embodiment of the application;
FIG. 14 is a schematic diagram of a system architecture to which a method for displaying a user interface of a head-mounted XR device according to an embodiment of the present application is applicable;
Fig. 15 is a schematic structural diagram of a head-mounted XR device according to an embodiment of the present application.
Detailed Description
In the description of the embodiments of the present application, the terminology used in the embodiments below is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include expressions such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of the present application, "at least one" and "one or more" mean one, two, or more. The term "and/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, A and/or B may represent: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present application should not be construed as preferred or advantageous over other embodiments or designs. Rather, such words are intended to present related concepts in a concrete fashion.
Head-mounted XR devices may include AR glasses, MR virtual reality devices, head-up display (HUD) devices, riding helmet display devices, and the like. Using AR, VR, MR, and similar techniques, a head-mounted XR device presents a virtual picture, also referred to as a virtual space, within which text, images, etc. can be displayed to provide the user with auxiliary information about the real scene. The virtual space may be considered the field of view of the user wearing the head-mounted XR device. Illustratively, referring to fig. 1 (a), the user wears AR glasses 100, and the AR glasses 100 display virtual information (text, images, etc.) in the virtual space 10 using AR technology. In one example, as shown in fig. 1 (b), the virtual space 10 is a rectangular planar space with length X and height Y. It will be appreciated that the virtual space 10 shown in fig. 1 (a) and (b) is planar; in actual use, the virtual space may be a curved surface with a certain curvature.
In a navigation scene, route information, navigation information, and the like can be displayed in the virtual space. A user wearing the head-mounted XR device does not need to check the route on a mobile phone and can obtain the route information, navigation information, etc. directly from the information displayed in the virtual space, conveniently viewing the map route, navigation details, and so on without affecting the view of the real scene. Illustratively, as shown in FIG. 2, a navigation interface 201 is displayed in the virtual space 10 to provide navigation information to the user; the navigation information is auxiliary information for the real scene and, combined with the route in the real scene, navigates for the user. For example, the navigation information may include a traveling direction, a traveling distance, and the like. In one example, the navigation interface 201 includes navigation text 202 for prompting the user's route in text form; the navigation interface 201 also includes a navigation icon 203 for prompting the user's route graphically. The user can conveniently confirm the route according to the real scene and the navigation information in the navigation interface 201.
The black rectangular box in fig. 2 is an example for illustrating the range of the virtual space 10. In practical applications, the black rectangular frame may not be displayed.
The embodiments of the application provide a method for displaying the user interface of a head-mounted XR device: in a navigation scene, the head-mounted XR device displays, for each orientation of the user's face, the user interface corresponding to that orientation, providing the user with a convenient, comfortable, and efficient navigation experience.
Illustratively, as shown in fig. 3, in a navigation scenario the user wears the head-mounted XR device 100 and travels forward. During forward travel, the user's face may be oriented in different directions.
In the horizontal direction, the user's face may be directed forward, to the left, or to the right. In one example, when the user's face is facing forward, virtual space 10 of head-mounted XR device 100 is displayed in front of the user's direction of travel. In one example, when the user's face is facing left, virtual space 10 of head-mounted XR device 100 is displayed to the left of the user's direction of travel. In one example, when the user's face is facing to the right, virtual space 10 of head-mounted XR device 100 is displayed to the right of the user's direction of travel.
In the vertical direction, the user's face may be oriented upward (looking up), toward the horizontal plane (head-up), or downward (looking down). In one example, when the user's face is toward the horizontal plane (head-up), the virtual space 10 of the head-mounted XR device 100 is displayed in front of the user's direction of travel, with the display height of the virtual space 10 coinciding with the user's eye level. In one example, when the user's face is oriented upward (looking up), the virtual space 10 is displayed in front of the user's direction of travel, with the display height of the virtual space 10 above the user's eye level. In one example, when the user's face is oriented downward (looking down), the virtual space 10 is displayed in front of the user's direction of travel, with the display height of the virtual space 10 below the user's eye level.
That is, the virtual space 10 is located directly in front of the user's line of sight.
In the embodiments of the present application, the user's face facing forward is not strictly limited; forward refers to the user's direction of travel. For example, referring to fig. 4, when the user's face is oriented to the left of the direction of travel and the angle between the face orientation and the direction of travel is less than or equal to angle one (e.g., 5 degrees), or the face is oriented to the right of the direction of travel and the angle is less than or equal to angle two (e.g., 5 degrees), the user's face is considered to face forward. When the face is oriented to the left of the direction of travel and the angle is greater than angle one, the face is considered to face left. When the face is oriented to the right of the direction of travel and the angle is greater than angle two, the face is considered to face right. The value of angle two may be the same as or different from that of angle one.
Head-up in the embodiments of the present application likewise does not strictly require the user's face orientation (line of sight) to be parallel to the horizontal plane and perpendicular to the body. Illustratively, referring to fig. 5, when the face is oriented above the horizontal plane and the angle between the face orientation and the horizontal plane is less than or equal to angle three (e.g., 10 degrees), or the face is oriented below the horizontal plane and the angle is less than or equal to angle four (e.g., 15 degrees), the user's face is considered to be toward the horizontal plane (head-up). When the face orientation is above the horizontal plane and the angle is greater than angle three, the face is considered to be oriented upward (looking up). When the face orientation is below the horizontal plane and the angle is greater than angle four, the face is considered to be oriented downward (looking down). The value of angle four may be the same as or different from that of angle three.
Optionally, in one implementation, angle four = angle three = angle two = angle one.
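The orientation classification described above can be summarized in code. The following sketch is illustrative: the function names, the sign conventions, and the concrete values of angles one to four (taken from the examples above) are assumptions, not part of the claims.

```python
from enum import Enum

class Horizontal(Enum):
    FORWARD = "forward"
    LEFT = "left"
    RIGHT = "right"

class Vertical(Enum):
    HEAD_UP = "head-up"
    LOOK_UP = "looking up"
    LOOK_DOWN = "looking down"

# Example values only; the application leaves the concrete angles open.
ANGLE_ONE = 5.0     # degrees to the left of the travel direction
ANGLE_TWO = 5.0     # degrees to the right of the travel direction
ANGLE_THREE = 10.0  # degrees above the horizontal plane
ANGLE_FOUR = 15.0   # degrees below the horizontal plane

def classify_horizontal(yaw_deg: float) -> Horizontal:
    """yaw_deg: signed angle between the face orientation and the
    direction of travel (negative = left, positive = right)."""
    if yaw_deg < -ANGLE_ONE:
        return Horizontal.LEFT
    if yaw_deg > ANGLE_TWO:
        return Horizontal.RIGHT
    return Horizontal.FORWARD

def classify_vertical(pitch_deg: float) -> Vertical:
    """pitch_deg: signed angle between the face orientation and the
    horizontal plane (positive = above, negative = below)."""
    if pitch_deg > ANGLE_THREE:
        return Vertical.LOOK_UP
    if pitch_deg < -ANGLE_FOUR:
        return Vertical.LOOK_DOWN
    return Vertical.HEAD_UP
```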
In one implementation, a gyroscope sensor, an acceleration sensor, or a geomagnetic sensor is provided in the head-mounted XR device. The head-mounted XR device may determine its pose by means of the gyroscope sensor together with the acceleration sensor, or by means of the geomagnetic sensor, etc. The gyroscope sensor measures the angular velocity of the head-mounted XR device, and the acceleration sensor measures its acceleration; from the data acquired by these two sensors, the pose of the head-mounted XR device can be calculated. The geomagnetic sensor measures the earth's magnetic field; from the longitude, latitude, and altitude of the current position and the measured field direction, the magnetic inclination and declination can be calculated, and from these the direction of north and the pose of the head-mounted XR device are obtained. The head-mounted XR device can then determine the user's face orientation from its own pose.
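As one possible illustration of the sensor computation above, the sketch below estimates face pitch and magnetic heading from a single stationary accelerometer and magnetometer reading (ignoring the gyroscope). The device axis convention and all names are assumptions for illustration only.

```python
import numpy as np

def face_orientation(accel: np.ndarray, mag: np.ndarray) -> tuple[float, float]:
    """Estimate (pitch_deg, heading_deg) from one stationary reading.

    Assumed device frame: x to the wearer's right, y out of the face,
    z up; `accel` reads the reaction to gravity, i.e. it points "up"
    in the device frame when the device is at rest."""
    face = np.array([0.0, 1.0, 0.0])    # face direction in device frame
    up = accel / np.linalg.norm(accel)  # gravity "up" in device frame

    # Pitch: elevation of the face direction above the horizontal plane.
    pitch = np.degrees(np.arcsin(np.clip(np.dot(face, up), -1.0, 1.0)))

    # Heading: signed angle between the horizontal projections of the
    # magnetic field and the face direction (tilt-compensated compass).
    face_h = face - np.dot(face, up) * up
    mag_h = mag - np.dot(mag, up) * up
    face_h /= np.linalg.norm(face_h)
    mag_h /= np.linalg.norm(mag_h)
    cos_h = np.clip(np.dot(face_h, mag_h), -1.0, 1.0)
    sign = np.sign(np.dot(np.cross(mag_h, face_h), up)) or 1.0
    heading = float(sign) * float(np.degrees(np.arccos(cos_h)))
    return float(pitch), heading
```

In practice the gyroscope would be fused with these readings (e.g., via a complementary or Kalman filter) to keep the estimate stable while the head is moving.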
According to the user interface display method of the head-mounted XR device provided by the embodiments of the application, the head-mounted XR device displays in the virtual space the user interface corresponding to each orientation of the user's face. Illustratively, referring to FIG. 6: when the user's face is forward and level, interface one is displayed; when the user's face is forward and looking up, interface two is displayed; when the user's face is forward and looking down, interface three is displayed; when the user's face is oriented to the left, interface four is displayed; when the user's face is oriented to the right, interface five is displayed, where interface five may be the same as interface four. As the user's face turns in different directions, the virtual space of the head-mounted XR device displays the corresponding user interface. In this way, the head-mounted XR device can display rich user interfaces in the virtual space, including navigation information, travel information, prompt information, and the like. The user can conveniently view navigation guidance, map and route details, travel information, etc. through the virtual space provided by the head-mounted XR device, without looking down at a mobile phone screen.
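Reusing the classifiers sketched above, the mapping of FIG. 6 can be written as a small dispatch function (an illustrative sketch; the interface contents follow the embodiments described below):

```python
def pick_interface(h: Horizontal, v: Vertical) -> str:
    """Dispatch per FIG. 6, reusing Horizontal/Vertical from the sketch above."""
    if h in (Horizontal.LEFT, Horizontal.RIGHT):
        return "interface four/five"  # may be one and the same interface
    if v is Vertical.LOOK_UP:
        return "interface two"        # e.g. travel information
    if v is Vertical.LOOK_DOWN:
        return "interface three"      # e.g. navigation map and details
    return "interface one"            # simple navigation information
```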
In some embodiments, the user wears the head-mounted XR device and looks straight ahead. If it is determined that the user's face is looking forward and level, the head-mounted XR device displays a first navigation interface containing simple navigation information. Illustratively, as shown in FIG. 2, the virtual space 10 of the head-mounted XR device displays the navigation interface 201. In one example, the navigation information in the navigation interface 201 is displayed at the center of the user's travel route. Alternatively, as shown in fig. 7, the navigation information may be displayed at the position of the virtual space display interface corresponding to the center of the user's travel route, for example at the upper, lower, or middle position of the interface. In one example, the navigation interface 201 is displayed transparently except for the navigation text 202 and the navigation icon 203, so that the navigation interface does not block the real scene from the user.
In some embodiments, the user wears the head-mounted XR device and looks forward and downward. If it is determined that the user's face is looking forward and downward, the head-mounted XR device displays a second navigation interface containing a navigation map and details. Illustratively, as shown in FIG. 8, the virtual space 10 of the head-mounted XR device displays the navigation interface 301, which is used to display a navigation map and details. In one example, the navigation interface 301 fills the entire virtual space 10 display interface and is non-transparent, making it convenient for the user to view the navigation map and details at any time.
In some embodiments, the user wears the head-mounted XR device and looks forward and upward. If it is determined that the user's face is looking forward and upward, the head-mounted XR device displays a travel information interface for showing relevant information during the user's travel; the travel information interface includes first information such as a clock, travel duration, remaining duration, speed, motion state, and calorie consumption. Illustratively, as shown in FIG. 9, the virtual space 10 of the head-mounted XR device displays a travel information interface 401, which includes prompt information 402, where "15:09" is the clock, "35 minutes have elapsed" indicates the travel duration, "18 minutes remain on the journey" indicates the remaining duration, "20 km per hour" indicates the speed, and "78 calories" indicates the calorie consumption. The user can conveniently check this information during travel simply by raising the head. In one example, the travel information interface 401 is displayed transparently except for the prompt information 402, so that the interface does not block the user's view of the surrounding real scene.
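The prompt information 402 can be assembled from basic travel statistics. The sketch below is illustrative; the data structure, field names, and formatting are assumptions that merely reproduce the example values of FIG. 9:

```python
from dataclasses import dataclass

@dataclass
class TravelStats:
    clock: str          # e.g. "15:09"
    elapsed_min: int    # travel duration so far
    remaining_min: int  # estimated remaining duration
    speed_kmh: float    # current speed
    calories: int       # estimated calorie consumption

def format_travel_info(s: TravelStats) -> str:
    """Build the text shown in the travel information interface (FIG. 9)."""
    return (f"{s.clock}  {s.elapsed_min} minutes have elapsed, "
            f"{s.remaining_min} minutes remain on the journey, "
            f"{s.speed_kmh:.0f} km per hour, {s.calories} calories")

# Example matching the figure's values:
print(format_travel_info(TravelStats("15:09", 35, 18, 20.0, 78)))
```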
When the user turns the head to the left or right, the user's face turns to the left or right, respectively.
In some embodiments, when the head-mounted XR device determines that the user's face is oriented to the left and the angle between the face orientation and the direction of travel is less than or equal to a first preset threshold (e.g., 45 degrees), or that the user's face is oriented to the right and the angle is less than or equal to a second preset threshold (which may be the same as or different from the first threshold, e.g., also 45 degrees), the head-mounted XR device still displays the first navigation interface. In one example, the display position of the virtual space moves to the left or right with the user's face, so that the virtual space display interface stays directly in front of the user's line of sight, while the navigation information in the first navigation interface is always displayed at the position (upper, lower, or center) of the virtual space display interface corresponding to the center of the user's travel route. Illustratively, as shown in fig. 10, the user's face turns to the left, and the display position of the virtual space 10 of the head-mounted XR device 100 moves to the left with the face orientation. The head-mounted XR device 100 determines that the face is oriented to the left and the angle between the face orientation and the direction of travel is less than the first threshold (e.g., 45 degrees), so the virtual space 10 displays the navigation interface 201, comprising the navigation text 202 and the navigation icon 203 displayed at the position of the virtual space 10 display interface corresponding to the center of the user's travel route. Illustratively, as shown in fig. 11, the user's face turns to the right, and the display position of the virtual space 10 moves to the right with the face orientation. The head-mounted XR device 100 determines that the face is oriented to the right and the angle is less than the second threshold (e.g., 45 degrees), so the virtual space 10 again displays the navigation interface 201 with the navigation text 202 and the navigation icon 203 at the position corresponding to the center of the user's travel route. In one implementation, the head-mounted XR device obtains its current position (longitude, latitude, altitude) through a positioning device (such as GPS or the BeiDou navigation system), and determines its position on the map and, further, the user's direction of travel from the map information and the position information. The head-mounted XR device can also obtain the user's face orientation from the gyroscope sensor, the acceleration sensor, the geomagnetic sensor, and similar devices, and thereby determine the position of the virtual space display interface.
By combining the position of the virtual space display interface, the position of the head-mounted XR device on the map, and the user's direction of travel, the head-mounted XR device can determine the specific position in the virtual space display interface that corresponds to the center of the user's travel route.
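A sketch of this combination: the bearing difference between the travel direction and the face heading is mapped to a horizontal position in the display interface. The linear mapping, the field-of-view value, and all names are illustrative assumptions:

```python
def route_center_x(travel_bearing_deg: float, face_heading_deg: float,
                   interface_width_px: int, fov_deg: float = 40.0) -> float:
    """Horizontal pixel at which to anchor the travel-route center.

    travel_bearing_deg: direction of travel, from map + positioning data.
    face_heading_deg:   face orientation, from gyro/accel/geomagnetic data.
    fov_deg:            assumed horizontal field of view of the virtual space.
    """
    # Signed offset of the route center relative to the line of sight,
    # wrapped into [-180, 180).
    offset = (travel_bearing_deg - face_heading_deg + 180.0) % 360.0 - 180.0
    # Simple linear mapping of the angular offset onto the interface width.
    return interface_width_px / 2.0 + (offset / fov_deg) * interface_width_px
```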
In some embodiments, when the head-mounted XR device determines that the user's face is oriented to the left and the angle between the face orientation and the direction of travel is greater than the first preset threshold (e.g., 45 degrees), or that the user's face is oriented to the right and the angle is greater than the second preset threshold (which may be the same as or different from the first threshold, e.g., also 45 degrees), the head-mounted XR device displays a prompt interface for reminding the user of travel safety. In one example, the prompt interface includes prompt information reminding the user to travel safely. Illustratively, as shown in fig. 12, the user's face turns to the left, and the display position of the virtual space 10 of the head-mounted XR device 100 moves to the left with the face orientation. The head-mounted XR device 100 determines that the face is oriented to the left and that the angle between the face orientation and the direction of travel is greater than the first threshold (e.g., 45 degrees), so the virtual space 10 displays a prompt interface 501, which includes the prompt information 502: "Line of sight is off the direction of travel, please mind your safety." In one example, if the user's face is detected turning back from the left and the angle between the face orientation and the direction of travel is determined to be less than or equal to the first threshold, or the face is detected turning back from the right and the angle is determined to be less than or equal to the second threshold, the head-mounted XR device switches back to displaying the first navigation interface (e.g., navigation interface 201).
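The threshold and turn-back behavior can be sketched as a single selection function (threshold values and names are illustrative; hysteresis, debouncing, and the like are omitted):

```python
def select_turn_interface(yaw_deg: float, threshold_one: float = 45.0,
                          threshold_two: float = 45.0) -> str:
    """Interface for the current signed yaw (negative = left, positive = right).

    Crossing a threshold shows the safety prompt (FIG. 12); turning back
    within the threshold switches back to the first navigation interface."""
    if yaw_deg < -threshold_one or yaw_deg > threshold_two:
        return "prompt_interface"         # e.g. prompt interface 501
    return "first_navigation_interface"   # e.g. navigation interface 201
```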
According to the user interface display method of the head-mounted XR device provided by the embodiments of the application, when a user wears the head-mounted XR device to navigate, the device displays a corresponding user interface for each scenario, such as looking straight ahead, looking down, looking up, and turning the head left or right. The user can conveniently view navigation guidance, map and route details, travel information, and other information through the virtual space provided by the head-mounted XR device, without looking down at a mobile phone screen and without the display interfering with the view of the real scene, achieving a convenient and safe navigation display and viewing experience.
It will be appreciated that the correspondence described above between looking straight ahead, looking down, looking up, turning the head, etc. and the user interface displayed by the head-mounted XR device is merely an example. In other embodiments, the user interface corresponding to a given face orientation may differ from the examples above. For example, in other embodiments, the second navigation interface is displayed when the user's face is looking forward and upward, and the travel information interface is displayed when the user's face is looking forward and downward. For example, in still other embodiments, the second navigation interface is displayed both when the user's face is looking forward and upward and when it is looking forward and downward. For example, in still other embodiments, a weather information interface is displayed when the user's face is looking forward and upward. The embodiments of the present application do not limit the correspondence between the user's face orientation and the user interface.
The user interface display method of the head-mounted XR device provided by the embodiments of the application can be applied to head-mounted XR devices such as AR glasses, MR virtual reality devices, head-mounted HUD devices, and riding helmet display devices. The embodiments of the present application do not impose any limitation on the specific form of the head-mounted XR device.
Fig. 13 illustrates a hardware architecture of a head-mounted XR device 100 provided by an embodiment of the application. As shown in fig. 13, head-mounted XR device 100 may include: processor 110, memory 120, sensor system 130, communication module 140, camera 150, display device 160, audio device 170, and the like. The above components may be coupled and communicate with each other.
It is understood that the structure shown in fig. 13 does not constitute a specific limitation on the head-mounted XR device 100. In other embodiments of the application, the head-mounted XR device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. For example, the head-mounted XR device 100 may also include physical keys such as an on/off key, volume keys, and screen brightness adjustment keys, as well as various interfaces such as a USB interface. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units; for example, it may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and execution, so that each component performs its corresponding function, such as human-computer interaction, motion tracking/prediction, rendering and display, and audio processing.
The memory 120 may store executable instructions. The memory 120 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required for at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the head-mounted XR device 100 (such as audio data), and so on. In addition, the memory 120 may include high-speed random access memory, and may also include nonvolatile memory, such as at least one magnetic disk storage device, flash memory device, or universal flash storage (UFS). The processor 110 performs the various functional applications and data processing of the head-mounted XR device 100 by executing instructions stored in the memory 120 and/or instructions stored in a memory provided in the processor.
The sensor system 130 may include an acceleration sensor (accelerometer), a gyroscope sensor, a geomagnetic sensor (magnetometer), or other sensors for detecting motion. The sensor system 130 is configured to collect the corresponding data; for example, the acceleration sensor collects the acceleration of the head-mounted XR device 100, the gyroscope sensor collects its angular velocity, and the geomagnetic sensor measures the local magnetic field. The data collected by the sensor system 130 can reflect the movement of the head (face) of the user wearing the head-mounted XR device 100. In some embodiments, the sensor system 130 may be an inertial measurement unit (IMU) disposed within the head-mounted XR device 100. In some embodiments, the head-mounted XR device 100 may send the data acquired by the sensor system to the processor 110 for analysis. The movement of the user's head (face) may include whether it rotates, the direction of rotation, and so on.
The communication module 140 may include a mobile communication module and a wireless communication module. The mobile communication module may provide a solution for wireless communication, including 2G/3G/4G/5G, as applied to the head-mounted XR device 100. The wireless communication module may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wi-Fi network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the head-mounted XR device 100. The wireless communication module may be one or more devices that integrate at least one communication processing module. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS), etc.
The camera 150 may be used to capture still images or video, which may be outward-facing images or video of the user's surroundings, or inward-facing images or video. The camera 150 includes, but is not limited to, a conventional color camera (RGB camera), a depth camera (RGB-D camera), a dynamic vision sensor (DVS) camera, and the like.
The audio device 170 is used to collect and output audio. The audio device 170 may include, but is not limited to: microphones, speakers, headphones, etc.
The head-mounted XR device 100 presents or displays images via the GPU, the display device 160, the application processor, and the like.
The GPU is a microprocessor for image processing, and is connected to the display device 160 and the application processor. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information. The GPU is used to perform mathematical and geometric calculations from data obtained from the processor 110, render images using computer graphics techniques, computer simulation techniques, and the like, to provide content for display on the display device 160. The GPU is also used to add correction or pre-distortion to the rendering process of the image to compensate or correct for distortion caused by optical components in the display device 160. The GPU may also adjust the content provided to the display device 160 based on data from the sensor system 130. For example, the GPU may add depth information to the content provided to the display device 160 based on the 3D position of the user's eyes, pupil distance, etc.
The display device 160 may include one or more display screens and one or more optical components. The display screen may comprise a display panel used to display images, presenting a stereoscopic virtual scene to the user. The display panel may be an LCD, OLED, AMOLED, FLED, Mini-LED, Micro-LED, Micro-OLED, QLED, or the like. The optical component may be used to direct light from the display screen to the exit pupil for perception by the user. In some embodiments, one or more optical elements (e.g., lenses) in the optical component may have one or more coatings, such as an anti-reflective coating. Magnifying the image light allows the display to be physically smaller, lighter, and less power-hungry, and can also increase the field of view of the displayed content. For example, the optical component may extend the displayed content across the user's full field of view.
In an embodiment of the present application, the head-mounted XR device 100 may display images in the virtual space via the display device 160, so that the user perceives a 3D scene, providing an AR/VR/MR experience; for example, it displays a navigation interface, a travel information interface, a prompt interface, etc.
It is appreciated that in other embodiments the head-mounted XR device 100 may include fewer components. For example, referring to fig. 14, the head-mounted XR device 100 may be connected to an electronic device 200 by wire or wirelessly. The wired connection may communicate through a universal serial bus (USB) interface, a high definition multimedia interface (HDMI), or the like. The wireless connection may use one or more of Bluetooth, Wi-Fi Direct (e.g., Wi-Fi P2P), Wi-Fi SoftAP, Wi-Fi LAN, radio frequency, and similar technologies. The embodiments of the application do not limit the connection mode between the two. The electronic device 200 may be a mobile phone, tablet, notebook, laptop, desktop, vehicle-mounted device, or the like. The electronic device 200 may run a particular application, such as a video, game, music, desktop, or screen-mirroring application, to provide content for transmission to and display on the head-mounted XR device 100; or the electronic device 200 may provide map data, computing power, etc. to the head-mounted XR device 100.
In some embodiments, the head-mounted XR device 100 serves as an external (extended) display of the electronic device 200, with the electronic device 200 providing the display data. The electronic device 200 may also act as an input device for the head-mounted XR device 100. When used as an input device, the electronic device 200 may receive user input through its various sensors, such as a touch sensor, an acceleration sensor, a gyroscope sensor, a magnetic sensor, and a pressure sensor. The head-mounted XR device 100 may be configured with some physical keys to receive user input, such as keys for switching screens, adjusting screen brightness, or switching between spatial and mirror modes. These user inputs may be transmitted to the electronic device 200 through the wired or wireless communication connection between the two devices, which in turn triggers the electronic device 200 to respond. After seeing the image displayed by the head-mounted XR device 100 in the virtual space, the user may, through input to the electronic device 200 or to the head-mounted XR device 100, control the display content in the virtual space and the operating state of the head-mounted XR device 100, such as its on/off state and screen brightness.
In one embodiment, after head-mounted XR device 100 collects motion data of the user's head (face) via a gyroscope sensor, an acceleration sensor, a geomagnetic sensor, etc. for detecting motion, the motion data of the user's head (face) is transmitted to electronic device 200 via communication module 140. Electronic device 200 determines the pose of head-mounted XR device 100, i.e., determines the head (face) orientation of the user, from the motion data of the head (face) of the user. Further, electronic device 200 generates image data for a user interface (e.g., a first navigation interface, a second navigation interface, a travel information interface, a reminder interface, etc.) for virtual space display of headset XR device 100 based on a user head (face) orientation. Electronic device 200 transmits the generated image data to head-mounted XR device 100 via a wireless communication module or a wired interface. Head-mounted XR device 100 displays a corresponding user interface based on the received image data.
In another embodiment, after the head-mounted XR device 100 collects motion data of the user's head (face) through a gyroscope sensor, an acceleration sensor, a geomagnetic sensor, or the like for detecting motion, the posture of the head-mounted XR device 100, that is, the orientation of the user's head (face), is determined according to the motion data of the user's head (face). Head-mounted XR device 100 sends information of the head (face) orientation of the user to electronic device 200. Electronic device 200 generates image data for a user interface (e.g., a first navigation interface, a second navigation interface, a travel information interface, a hint interface, etc.) of a virtual space display of head-mounted XR device 100 based on a user head (face) orientation. Electronic device 200 transmits the generated image data to head-mounted XR device 100 via a wireless communication module or a wired interface. Head-mounted XR device 100 displays a corresponding user interface based on the received image data.
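A sketch of this second offloading variant, in which the head-mounted XR device computes the face orientation and the electronic device renders the interface. The message layout, the length-prefixed framing, and all names are illustrative assumptions, not part of the application:

```python
import json
import socket
import struct

def send_orientation(sock: socket.socket, pitch_deg: float,
                     heading_deg: float) -> None:
    """Head-mounted XR device -> electronic device 200: face orientation."""
    payload = json.dumps({"type": "face_orientation",
                          "pitch_deg": pitch_deg,
                          "heading_deg": heading_deg}).encode("utf-8")
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_frame(sock: socket.socket) -> bytes:
    """Electronic device 200 -> head-mounted XR device: rendered interface
    image data (e.g. for interface 201, 301, 401, or 501)."""
    (length,) = struct.unpack("!I", _read_exact(sock, 4))
    return _read_exact(sock, length)

def _read_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed")
        buf += chunk
    return buf
```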
In another embodiment, after the head-mounted XR device 100 collects the motion data of the user's head (face) through the gyroscope sensor, acceleration sensor, geomagnetic sensor, or other sensors for detecting motion, it determines the pose of the head-mounted XR device 100, i.e., the orientation of the user's head (face), from these motion data. The head-mounted XR device 100 also obtains map information from the electronic device 200, and obtains the user's position information through a positioning device or from the electronic device 200. The head-mounted XR device 100 then generates and displays the user interface for the virtual space display (e.g., the first navigation interface, the second navigation interface, the travel information interface, or the prompt interface) according to the user's head (face) orientation, the map information, the user's position information, and so on.
In another embodiment, after the head-mounted XR device 100 collects the motion data of the user's head (face) through the gyroscope sensor, acceleration sensor, geomagnetic sensor, or other sensors for detecting motion, it determines the pose of the head-mounted XR device 100, i.e., the orientation of the user's head (face), from these motion data. The head-mounted XR device 100 also holds map information and obtains the user's position information through its positioning device. The head-mounted XR device 100 then generates and displays the user interface for the virtual space display (e.g., the first navigation interface, the second navigation interface, the travel information interface, or the prompt interface) according to the user's head (face) orientation, the map information, the user's position information, and so on.
It will be appreciated that the head mounted XR device described above incorporates hardware structures and/or software modules for performing the respective functions in order to achieve the functions described above. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The embodiment of the application can divide the functional modules of the head-mounted XR device according to the method example, for example, each functional module can be divided corresponding to each function, or two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
In the case of integrated units, fig. 15 shows a schematic diagram of one possible structure of the head-mounted XR device involved in the above embodiments. The head-mounted XR device 1400 comprises: a processing unit 1401, a display unit 1402, a sensor unit 1403, and a storage unit 1404. The processing unit 1401 is configured to control and manage the operation of the head-mounted XR device 1400, for example generating the corresponding user interface according to the user's face orientation; the sensor unit 1403 is used to collect motion data during the user's travel; the display unit 1402 is used to display the user interface; and the storage unit 1404 is used to hold the instructions and data of the head-mounted XR device 1400, which may be used to perform the various steps of the corresponding embodiments of the application.
Of course, the unit modules in the head-mounted XR device 1400 include, but are not limited to, the processing unit 1401, the display unit 1402, the sensor unit 1403, and the storage unit 1404 described above. For example, the head-mounted XR device 1400 may also include a communication unit for communicating with other electronic devices.
The processing unit 1401 may be a processor or a controller, such as a central processing unit (central processing unit, CPU), a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application-specific integrated circuit, ASIC), a field-programmable gate array (field programmable gate array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The display unit 1402 may be a display screen. The sensor unit 1403 may include a gyroscope sensor, an acceleration sensor, a geomagnetic sensor, and the like. The storage unit 1404 may be a memory. The communication unit may be a transceiver, a transceiver circuit, or the like.
The head-mounted XR device 1400 provided by embodiments of the application may be the head-mounted XR device 100 shown in FIG. 13. The processor, memory, display screen, communication interface, and the like may be coupled together, for example via a bus. The processor invokes the program code stored in the memory to perform the steps of the method embodiments above.
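As a rough illustration of this unit division, the four units of device 1400 could be modeled as below. This is only a sketch under the assumption of a pure-software mock-up; every class, method, and value here is illustrative rather than taken from the patent.

```python
class SensorUnit:
    """Stands in for the gyroscope / acceleration / geomagnetic sensors (1403)."""
    def read_motion(self):
        # A real unit would return fused motion data; fixed sample values here.
        return {"pitch": 0.0, "yaw": 30.0}

class DisplayUnit:
    """Stands in for the display screen (1402)."""
    def show(self, interface_id):
        print(f"virtual space now shows: {interface_id}")

class StorageUnit:
    """Stands in for the memory (1404) holding instructions and data."""
    def __init__(self):
        self.data = {}

class ProcessingUnit:
    """Stands in for the processor (1401): turns sensor data into a UI choice."""
    def __init__(self, sensors, display, storage, policy):
        self.sensors, self.display, self.storage, self.policy = (
            sensors, display, storage, policy)

    def tick(self):
        motion = self.sensors.read_motion()
        self.display.show(self.policy(motion))

# Wire the units together with a trivial selection policy.
device = ProcessingUnit(
    SensorUnit(), DisplayUnit(), StorageUnit(),
    policy=lambda m: ("second_interface" if abs(m["pitch"]) > 20.0
                      else "first_navigation_interface"))
device.tick()  # -> virtual space now shows: first_navigation_interface
```

The design point is simply that the processing unit orchestrates the other three, matching the division described for FIG. 15.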
Embodiments of the present application further provide a computer-readable storage medium storing computer program code which, when executed by the above processor, causes the head-mounted XR device to perform the methods of the above embodiments.
Embodiments of the present application further provide a computer program product which, when run on a computer, causes the computer to perform the methods of the above embodiments.
The head-mounted XR device 1400, the computer-readable storage medium, and the computer program product provided by the embodiments of the present application each perform the corresponding methods provided above; for the advantages they achieve, reference may be made to the advantages of the corresponding methods, which are not repeated here.
From the foregoing description of the embodiments, those skilled in the art will clearly understand that, for convenience and brevity of description, only the division of functional modules described above is illustrated. In practical applications, the functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division of modules or units is merely a logical functional division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed between components may be indirect couplings or communication connections via interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units described above may be implemented either in hardware or in software functional units.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a magnetic disk, or an optical disk.
The foregoing is merely a specific embodiment of the present application, and the protection scope of the present application is not limited thereto; any change or substitution within the technical scope disclosed by the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (15)
1. A method of displaying a user interface of a head-mounted XR device, comprising:
displaying a first interface in a virtual space of the head-mounted XR device, wherein the first interface comprises navigation information, and the navigation information is used for navigating the travel process of a first user in combination with a route in a real scene; the first user is a user wearing the head-mounted XR device; when the virtual space displays the first interface, the face of the first user faces a first direction;
when the face of the first user faces a second direction, the virtual space displays a second interface; the second interface comprises a navigation map and detailed information, or first information of the first user's travel process; the first information includes at least one of a clock, a travel duration, a remaining duration, a speed, a movement status, and a calorie consumption.
2. The method according to claim 1, wherein the method further comprises:
when the face of the first user faces a third direction, the virtual space displays a third interface; the third interface includes prompt information for prompting the first user for travel safety.
3. The method of claim 1, wherein the first user's face facing in a first direction comprises:
the face of the first user faces forward at eye level (head-up).
4. The method of claim 1, wherein the first user's face facing in the second direction comprises:
the face of the first user faces forward and upward (looking up).
5. The method of claim 1, wherein the first user's face facing in the second direction comprises:
the face of the first user faces forward and downward (looking down).
6. The method of claim 2, wherein the first user's face facing in a third direction comprises:
the face of the first user is directed to the left, or the face of the first user is directed to the right.
7. The method of claim 6, wherein the virtual space displaying a third interface when the face of the first user is oriented in a third direction comprises:
when an included angle between the face orientation of the first user and the travel direction of the first user is greater than a preset threshold, displaying the third interface in the virtual space.
8. The method of any of claims 1-7, wherein the portion of the first interface other than the navigation information is displayed transparently.
9. The method of any of claims 1-7, wherein the navigation information includes at least one of a travel direction and a travel distance.
10. The method of any of claims 1-7, wherein the navigation information is displayed in the virtual space at a position corresponding to the center of the user's travel route.
11. The method of any one of claims 1-7, wherein the head-mounted XR device comprises a sensor comprising a gyroscope sensor, an acceleration sensor, or a geomagnetic sensor, the method further comprising:
the head-mounted XR device obtains a facial orientation of the first user via the sensor.
12. The method of claim 11, wherein the virtual space displaying a second interface when the face of the first user is facing a second direction comprises:
In response to detecting that the face of the first user is facing a second direction, the head-mounted XR device generates the second interface and displays the second interface in the virtual space.
13. The method of claim 11, wherein the virtual space displaying a second interface when the face of the first user is facing a second direction comprises:
in response to detecting that the face of the first user is facing a second direction, the head-mounted XR device sends first information to an electronic device; the first information is used for indicating that the face of the first user faces a second direction;
the head-mounted XR device receiving image data of a second interface from the electronic device; the image data of the second interface is generated according to the first information;
the head-mounted XR device displays the second interface in the virtual space according to the image data of the second interface.
14. A head-mounted XR device, comprising: a processor, a memory, and a sensor; wherein the sensor is configured to detect a face orientation of a user wearing the head-mounted XR device, and the memory stores one or more computer programs comprising instructions which, when executed by the head-mounted XR device, cause the head-mounted XR device to perform the method of any one of claims 1-13.
15. A computer-readable storage medium comprising computer instructions; the computer instructions, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-13.
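Claims 12 and 13 above define two alternative paths for producing the second interface: generating it on the head-mounted XR device itself, or sending the face-orientation event to a companion electronic device and displaying the image data it returns. Below is a minimal sketch of that contrast; the message format and the `device` and `phone_link` objects are assumptions introduced purely for illustration.

```python
import json

def show_second_interface_locally(device):
    """Claim 12 path: the XR device generates and displays the interface itself."""
    ui = device.generate_interface("second_interface")
    device.display_in_virtual_space(ui)

def show_second_interface_remotely(device, phone_link):
    """Claim 13 path: the companion electronic device renders the interface."""
    # 1. Report that the user's face now points in the second direction.
    phone_link.send(json.dumps({"event": "face_orientation",
                                "direction": "second"}))
    # 2. Receive image data the companion device generated from that report.
    image_data = phone_link.receive()
    # 3. Display the received image data in the virtual space.
    device.display_image_in_virtual_space(image_data)
```

Offloading trades link latency for the companion device's rendering resources; the claims leave that choice open.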
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311514304.5A | 2023-11-14 | 2023-11-14 | Method and device for displaying user interface of head-mounted XR device
Publications (1)
Publication Number | Publication Date |
---|---|
CN117234340A (en) | 2023-12-15
Family
ID=89086462
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311514304.5A | Method and device for displaying user interface of head-mounted XR device | 2023-11-14 | 2023-11-14
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117234340A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120188148A1 (en) * | 2011-01-24 | 2012-07-26 | Microvision, Inc. | Head Mounted Meta-Display System |
CN105209959A (en) * | 2013-03-14 | 2015-12-30 | 高通股份有限公司 | User interface for a head mounted display |
US20170092002A1 (en) * | 2015-09-30 | 2017-03-30 | Daqri, Llc | User interface for augmented reality system |
CN107167147A (en) * | 2017-05-02 | 2017-09-15 | 深圳市元征科技股份有限公司 | Air navigation aid, glasses and readable storage medium storing program for executing based on arrowband Internet of Things |
CN108062159A (en) * | 2016-11-07 | 2018-05-22 | 宏达国际电子股份有限公司 | Media can be read in the method, apparatus and non-transient computer of virtual reality or augmented reality |
KR20190016264A (en) * | 2017-08-08 | 2019-02-18 | 한국과학기술연구원 | Interraction device and method for navigating in virtual reality using walking in place |
US20200233487A1 (en) * | 2019-01-23 | 2020-07-23 | Samsung Electronics Co., Ltd. | Method of controlling device and electronic device |
CN113253843A (en) * | 2021-05-24 | 2021-08-13 | 哈尔滨工业大学 | Indoor virtual roaming implementation method and system based on panorama |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11691079B2 (en) | Virtual vehicle control method in virtual scene, computer device, and storage medium | |
JP7268692B2 (en) | Information processing device, control method and program | |
EP3486707B1 (en) | Perception based predictive tracking for head mounted displays | |
US10482662B2 (en) | Systems and methods for mixed reality transitions | |
CA2913650C (en) | Virtual object orientation and visualization | |
EP3097552B1 (en) | Environmental interrupt in a head-mounted display and utilization of non field of view real estate | |
US20180018792A1 (en) | Method and system for representing and interacting with augmented reality content | |
EP3258698A1 (en) | Server, user terminal device, and control method therefor | |
EP3667622B1 (en) | Information processing device, information processing method, and program | |
WO2016031358A1 (en) | Display control device, display control method, and program | |
US20130265331A1 (en) | Virtual Reality Telescopic Observation System of Intelligent Electronic Device and Method Thereof | |
CN108351736B (en) | Wearable display, image display device, and image display system | |
CN111630852A (en) | Information processing apparatus, information processing method, and program | |
JP2019125278A (en) | Information processing device, information processing method, and recording medium | |
KR102578119B1 (en) | Smart glasses operation method interworking to mobile device | |
CN117234340A (en) | Method and device for displaying user interface of head-mounted XR device | |
CN111344776B (en) | Information processing device, information processing method, and program | |
KR20180055637A (en) | Electronic apparatus and method for controlling thereof | |
EP4325476A1 (en) | Video display system, information processing method, and program | |
US12061737B2 (en) | Image processing apparatus, image processing method, and storage device | |
WO2024057783A1 (en) | Information processing device provided with 360-degree image viewpoint position identification unit | |
CN118474442A (en) | Method, device, equipment and medium for adjusting interaction area | |
KR20220104976A (en) | Half-mirror virtual reality headset platform | |
CN113597632A (en) | Information processing apparatus, information processing method, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||