CN111813230A - Interaction method and device on AR glasses - Google Patents

Interaction method and device on AR glasses Download PDF

Info

Publication number
CN111813230A
CN111813230A (application CN202010957714.7A; granted publication CN111813230B)
Authority
CN
China
Prior art keywords
canvas
area
interaction
glasses
anchor point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010957714.7A
Other languages
Chinese (zh)
Other versions
CN111813230B (en)
Inventor
欧阳磊
孙超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yutou Technology Hangzhou Co Ltd
Original Assignee
Yutou Technology Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yutou Technology Hangzhou Co Ltd
Priority to CN202010957714.7A
Publication of CN111813230A
Application granted
Publication of CN111813230B
Status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01 - Indexing scheme relating to G06F 3/01
    • G06F 2203/012 - Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to an interaction method and device on AR glasses, wherein the method comprises the following steps: acquiring and recording the real-time rotation angle of the AR glasses; dividing a canvas into a display area, an interaction area, and a shortcut operation area, wherein the canvas moves in response to the AR glasses rotation angle; performing an interaction according to a location of an anchor point on a display screen on the canvas.

Description

Interaction method and device on AR glasses
Technical Field
The present disclosure relates to the field of augmented reality technologies, and in particular, to an interaction method and apparatus on AR glasses.
Background
With the development of human-computer interaction technology, AR glasses are increasingly used in daily life, for example for watching movies, playing games, and navigating road conditions. In use, a user wears the AR glasses and browses and interacts with interface content by rotating the head vertically and horizontally within a software interface laid out in free space. All interface content is fixed relative to the head's position, and the elements preset at a given angle become visible when the head rotates to that angle.
The existing interaction method on AR glasses is entry-based: the interface has no functional partitions or shortcuts, and all functions and content are displayed in a single canvas. This is inconvenient when a user needs certain shortcut functions at high frequency: finding a function is not quick, all interaction entries are fixed at particular positions, the user must recall a function's position from memory, and triggering an operation requires searching back and forth through the space for its entry.
Disclosure of Invention
The purpose of the present disclosure is to provide an interaction method and apparatus on AR glasses.
The purpose of the present disclosure is achieved by the following technical means. The interaction method on the AR glasses provided by the present disclosure includes the following steps: acquiring and recording the real-time rotation angle of the AR glasses; dividing a canvas into a display area, an interaction area, and a shortcut operation area, wherein the canvas moves in response to the AR glasses rotation angle; performing an interaction according to a location of an anchor point on a display screen on the canvas.
The object of the present disclosure can be further achieved by the following technical measures.
The above-mentioned interaction method on the AR glasses, wherein the real-time rotation angle of the AR glasses includes a horizontal rotation angle and a vertical rotation angle.
The above-mentioned interaction method on the AR glasses, wherein the canvas rotates in a direction opposite to the AR glasses rotation direction.
In the above interaction method on the AR glasses, the canvas has a coordinate system, and each piece of content on the canvas has its own coordinate area.
The above-mentioned interaction method on the AR glasses, wherein the anchor point is a center point of the display screen.
In the above interaction method on the AR glasses, performing an interaction according to the partition of the canvas in which the anchor point on the display screen falls includes: if the anchor point is in the display area, limiting the anchor point to the upper boundary of the interaction area.
In the above interaction method on the AR glasses, while the canvas moves, the display area and the shortcut operation area always remain horizontally centered on the display screen and move with the canvas in the vertical direction; or the interaction area and the display area always remain horizontally centered on the display screen and move with the canvas in the vertical direction.
In the above interaction method on the AR glasses, performing the interaction according to the partition in which the anchor point falls comprises: if the anchor point is in the interaction area, moving the canvas and comparing the anchor point with the coordinate area of the control to be operated in the interaction area, while moving the shortcut operation area and the display area horizontally, in the direction opposite to the canvas displacement, by a distance equal to the canvas displacement, and moving them vertically along with the canvas.
In the above interaction method on the AR glasses, performing the interaction according to the partition in which the anchor point falls comprises: if the anchor point is in the shortcut operation area, moving the canvas and comparing the anchor point with the coordinate area of the button to be operated in the shortcut operation area, while moving the interaction area and the display area horizontally, in the direction opposite to the canvas displacement, by a distance equal to the canvas displacement, and moving them vertically along with the canvas.
In the above interaction method, if the dwell time of the anchor point in a control area or a button area exceeds a specified duration, the corresponding control or button is executed.
The purpose of the present disclosure is also achieved by the following technical solutions. An apparatus for interaction on AR glasses according to the present disclosure includes: a three-axis gyroscope sensor configured to acquire and record the real-time rotation angle of the AR glasses; a partition module configured to divide the canvas into a display area, an interaction area, and a shortcut operation area; a displacement calculation module configured to calculate the lateral displacement S and the longitudinal displacement T of the canvas from the differences between the current horizontal and vertical rotation angles and the last recorded horizontal and vertical rotation angles; a canvas moving module configured to move the canvas laterally or longitudinally according to the result of the displacement calculation module; a display area moving module configured to move the display area laterally, in the direction opposite to the canvas displacement, by a distance equal to the canvas displacement, and to move the display area longitudinally along with the canvas; an interaction area moving module configured to, when the anchor point is in the shortcut operation area, move the interaction area laterally, in the direction opposite to the canvas displacement, by a distance equal to the canvas displacement, and move the interaction area longitudinally along with the canvas; a shortcut operation area moving module configured to, when the anchor point is in the interaction area, move the shortcut operation area laterally, in the direction opposite to the canvas displacement, by a distance equal to the canvas displacement, and move the shortcut operation area longitudinally along with the canvas; and an anchor point limiting module configured to limit the anchor point to the upper boundary of the interaction area when the anchor point is in the display area.
The aforementioned apparatus for interaction on AR glasses further comprises: the filling module is configured to fill the display area, the interaction area and the shortcut operation area on the canvas; the comparison module is configured to compare the anchor point with a coordinate area of a control or a button; and the timing module is configured to calculate the time of the anchor point staying in the control area or the button area.
The object of the present disclosure can be further achieved by the following technical measures.
The above-mentioned device for interaction on AR glasses further comprises a processor module for executing any one of the above-mentioned interaction methods on AR glasses.
The purpose of the present disclosure is also achieved by the following technical solutions. AR glasses according to the present disclosure comprise any one of the above-mentioned devices for interaction on AR glasses.
The method has the following advantages: 1. when the AR glasses interaction interface exceeds the display screen, the user can switch from the interaction area to the display area or the shortcut operation area at any time, improving the interaction experience on an oversized interaction interface; 2. the user's hands remain free when using the AR glasses, and all interface operations can be completed by head movement alone.
The foregoing is a summary of the present disclosure, and for the purposes of promoting a clear understanding of the technical means of the present disclosure, the present disclosure may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
Drawings
Fig. 1 is a schematic flow chart of an interaction method on AR glasses according to an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of a canvas and a display screen on AR glasses according to an embodiment of the present disclosure.
Fig. 3 is a block diagram of an interaction device on AR glasses according to an embodiment of the present disclosure.
Detailed Description
To further illustrate the technical means and effects of the present disclosure adopted to achieve the predetermined objects, the following detailed description will be given to specific embodiments, structures, features and effects of the interaction method and device on AR glasses according to the present disclosure with reference to the accompanying drawings and preferred embodiments.
Fig. 1 is a schematic flow chart diagram of one embodiment of an interaction method on AR glasses of the present disclosure. Referring to fig. 1, an interaction method on AR glasses according to an example of the present disclosure mainly includes the following steps:
Step S11, acquiring and recording the real-time rotation angle of the AR glasses.
Specifically, a three-axis gyroscope sensor continuously records the current motion data of the AR glasses wearer's head. A three-axis gyroscope, also called a micromechanical gyroscope, can simultaneously measure position along six directions and determine the movement trajectory and acceleration in each. Thereafter, the process proceeds to step S12.
Step S12, dividing a canvas into a display area, an interaction area, and a shortcut operation area, wherein the canvas moves in response to the AR glasses rotation angle.
As shown in fig. 2, the canvas is a virtual display area formed on the AR glasses, and the dashed box represents the display screen of the AR glasses. After putting on the AR glasses, a user views content on the canvas through the display screen projected by the glasses. The size of the canvas can be calculated from the content filling the three partitions, the spacing between the content within each partition, and the reserved boundary margins; apart from the three partitions, the canvas contains no other areas. Specifically, the canvas is divided into a display area, an interaction area, and a shortcut operation area, each filled with corresponding content and located at the top, middle, and bottom of the canvas respectively. The content of the display area generally includes status information such as battery level, time, WIFI, and Bluetooth; the content of the interaction area typically includes various application controls such as camera, scan, flashlight, and address book; and the bottom shortcut operation area typically includes various setting buttons such as volume, display, and power-saving mode. Thereafter, the process proceeds to step S13. It will be appreciated that in one embodiment the canvas may instead be partitioned from left to right, for example with the left portion as the display area, the middle portion as the interaction area, and the right portion as the shortcut operation area. In another embodiment, multiple display areas, shortcut operation areas, or interaction areas may be provided. The interaction area is usually placed in the middle of the canvas so that the user can interact directly and conveniently; the number and arrangement of the partitions can be chosen as the case requires.
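As an illustrative sketch (not the patent's implementation), the three-partition layout described above can be modeled as axis-aligned rectangles in the canvas coordinate system. The names, the height ratios, and the gap value below are all hypothetical; the patent only specifies a top / middle / bottom arrangement with reserved spacing between partitions.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    # Axis-aligned rectangle in canvas coordinates; (x, y) is the top-left
    # corner and y grows downward, as is conventional for screen coordinates.
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def partition_canvas(width: float, height: float, gap: float = 20.0) -> dict:
    """Split the canvas top-to-bottom into display / interaction / shortcut
    areas with a reserved gap between adjacent partitions.

    The 15% / 60% ratios are illustrative assumptions.
    """
    h_display = 0.15 * height
    h_interact = 0.60 * height
    h_shortcut = height - h_display - h_interact - 2 * gap
    return {
        "display": Rect(0, 0, width, h_display),
        "interaction": Rect(0, h_display + gap, width, h_interact),
        "shortcut": Rect(0, h_display + h_interact + 2 * gap, width, h_shortcut),
    }
```

The `contains` check is what later steps would use to decide which partition the anchor point currently falls in.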
In an embodiment of the present disclosure, the canvas moves in response to the rotation angle of the AR glasses. Specifically, a three-axis gyroscope sensor collects the real-time horizontal and vertical rotation angles of the AR glasses; the displacement calculation module calculates the lateral displacement S and the longitudinal displacement T of the canvas from the differences between the current horizontal and vertical rotation angles and the last recorded ones; and the canvas moving module moves the canvas by the lateral displacement S and the longitudinal displacement T. It should be noted that the canvas moves with the head while the display screen stays static. Following common user habits, content above is viewed by raising the head, content below by lowering the head, content on the left by turning the head left, and content on the right by turning it right. Accordingly, in response to a head-raising angle collected by the three-axis gyroscope sensor, the canvas moving module moves the canvas downward; in response to a left-turn angle, it moves the canvas to the right; lowering the head and turning right follow by analogy and are not repeated here.
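The displacement step above can be sketched as follows. The function name, the sign conventions, and the pixels-per-degree scale factor are assumptions for illustration; the patent only states that the canvas moves opposite to the head rotation (head up moves the canvas down, head left moves it right).

```python
def canvas_displacement(yaw_deg: float, pitch_deg: float,
                        last_yaw_deg: float, last_pitch_deg: float,
                        px_per_deg: float = 10.0):
    """Return (S, T): the lateral and longitudinal canvas displacement in
    pixels, computed from the difference between the current gyroscope
    angles and the last recorded ones.

    Conventions (assumed): positive yaw delta = head turned right; positive
    pitch delta = head raised; screen y grows downward, so a positive T
    moves the canvas down.
    """
    d_yaw = yaw_deg - last_yaw_deg
    d_pitch = pitch_deg - last_pitch_deg
    S = -d_yaw * px_per_deg   # canvas moves left when the head turns right
    T = d_pitch * px_per_deg  # canvas moves down when the head is raised
    return S, T
```

A real device would feed this with filtered gyroscope readings each frame and accumulate the offsets into the canvas position.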
Step S13, interaction is performed according to the partition of the anchor point on the display screen on the canvas.
Specifically, the anchor point is the center point of the display screen. After the user puts on the AR glasses, the display area and the bottom shortcut operation area are horizontally centered on the display screen, and the user rotates the head to move the canvas and find the control or button to operate in the interaction area or the shortcut operation area.
Specifically, if during movement the anchor point enters the display area: since the display area is a status display area, no user operation is needed there, so the anchor point is limited to the upper boundary of the interaction area. The user therefore cannot continue looking upward, which improves browsing efficiency.
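One way to read this anchor-limiting step, as a minimal sketch under an assumed coordinate convention (canvas y grows downward, so the display area lies at smaller y than the interaction area; names are hypothetical):

```python
def clamp_anchor_y(anchor_canvas_y: float, interaction_top_y: float) -> float:
    """Keep the anchor point out of the display area.

    The anchor entering the display area means its canvas-space y has risen
    above (become smaller than) the interaction area's upper boundary; in
    that case it is clamped back to that boundary, so looking further up
    has no effect.
    """
    return max(anchor_canvas_y, interaction_top_y)
```

In practice the clamp would be applied to the canvas offset each frame, before the hit-test against controls.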
Specifically, if during movement the anchor point enters the interaction area and the user needs to find a control to operate there, the user rotates the head to move the canvas so that the control coincides with the anchor point. Meanwhile, the shortcut operation area and the display area move horizontally, in the direction opposite to the canvas displacement, by a distance equal to the canvas displacement, so that they always remain horizontally centered on the display screen, while vertically they follow the canvas. In this way, wherever the user browses in the interaction area, raising the head always shows the display area and lowering the head always shows the shortcut operation area.
Specifically, if during movement the anchor point enters the shortcut operation area and the user needs to find a button to operate there, the user rotates the head to move the canvas so that the button coincides with the anchor point. Meanwhile, the interaction area and the display area move horizontally, in the direction opposite to the canvas displacement, by a distance equal to the canvas displacement, and follow the canvas vertically, so that after operating a button in the shortcut operation area the user can quickly return to the interaction area and continue interacting there.
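The counter-movement described in the last two paragraphs can be sketched in one small function. This is an illustrative reading, not the patent's code: the regions that are not under the anchor are shifted by the opposite of the canvas's lateral displacement S, cancelling it so they stay horizontally centered on the screen, while no vertical correction is applied because they already follow the canvas vertically.

```python
def counter_move_regions(regions, S: float, T: float):
    """Given a list of (x, y) top-left positions in canvas coordinates,
    shift each region horizontally by -S to cancel the canvas's lateral
    displacement. The longitudinal displacement T is intentionally not
    compensated: the regions follow the canvas vertically by default.
    """
    return [(x - S, y) for (x, y) in regions]
```

With a canvas that just moved 40 px right and 10 px down, the display and shortcut areas would be shifted 40 px left in canvas space and left to ride along vertically.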
Specifically, the canvas has a coordinate system, and the content of each partition has its own coordinate area. When the user interacts with a control or button in the interaction area or the shortcut operation area, if the anchor point stays within that control's or button's coordinate area for a set time (for example, 3 s), the control or button is automatically executed, so the user can complete every interaction through head control alone.
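The dwell-to-execute behaviour above can be sketched as a small state machine. The class and method names are hypothetical; the 3 s default matches the example in the text.

```python
import time

class DwellTrigger:
    """Fire a control/button once the anchor has stayed inside its
    coordinate area for `dwell_s` seconds (the text's example uses 3 s)."""

    def __init__(self, dwell_s: float = 3.0):
        self.dwell_s = dwell_s
        self._target = None  # id of the region currently under the anchor
        self._since = 0.0    # timestamp when the anchor entered it

    def update(self, region_id, now: float = None):
        """Call once per frame with the region under the anchor (or None).
        Returns the region_id to execute, or None."""
        now = time.monotonic() if now is None else now
        if region_id != self._target:
            # Anchor moved to a different region (or left all regions):
            # restart the dwell timer.
            self._target, self._since = region_id, now
            return None
        if region_id is not None and now - self._since >= self.dwell_s:
            self._since = now  # re-arm so the action does not fire every frame
            return region_id
        return None
```

Combined with a per-frame hit-test of the anchor against the partitions' coordinate areas, this is enough to drive the whole interaction without hands.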
Fig. 3 is a schematic block diagram of one embodiment of an interaction device on AR glasses of the present disclosure. Referring to fig. 3, an interaction device 100 on AR glasses according to an example of the present disclosure mainly includes: a three-axis gyroscope sensor 101, a partition module 102, a filling module 103, a displacement calculation module 104, a canvas moving module 105, a display area moving module 106, an interaction area moving module 107, a shortcut operation area moving module 108, an anchor point limiting module 109, a comparison module 110, and a timing module 111.
The three-axis gyroscope sensor 101 is used for acquiring and recording the real-time rotation angle of the AR glasses.
Specifically, the three-axis gyroscope sensor 101 collects and records the horizontal rotation angle and the vertical rotation angle of the AR glasses in real time.
The partition module 102 is configured to divide the canvas into a display area, an interaction area, and a shortcut operation area.
Specifically, the partition module 102 places the display area, the interaction area, and the shortcut operation area at the top, middle, and bottom of the canvas respectively, with a spacing reserved between the partitions.
The filling module 103 is configured to fill the display area, the interaction area, and the shortcut operation area on the canvas.
The displacement calculation module 104 is configured to calculate the horizontal displacement S and the vertical displacement T of the canvas in real time according to the difference between the current horizontal rotation angle and the current vertical rotation angle of the AR glasses and the last recorded horizontal rotation angle and vertical rotation angle.
The canvas moving module 105 is configured to move the canvas laterally or longitudinally according to the calculation result of the displacement calculation module.
The display area movement module 106 is configured to move the display area laterally in a direction opposite to the canvas displacement by a distance equal to the canvas displacement, and to move the display area longitudinally following the canvas.
The interaction area moving module 107 is configured to, when the anchor point is in the shortcut operation area, move the interaction area laterally, in the direction opposite to the canvas displacement, by a distance equal to the canvas displacement, and move the interaction area longitudinally along with the canvas.
The shortcut operation area moving module 108 is configured to, when the anchor point is at the interaction area, move the shortcut operation area in a direction opposite to the canvas displacement in a horizontal direction by a distance equal to the canvas displacement, and move the shortcut operation area along the canvas in a vertical direction.
The anchor point limiting module 109 is configured to limit the anchor point to an upper boundary of the interactive area when the anchor point is in the display area.
The comparing module 110 is configured to compare the position of the anchor point with a control coordinate area or a button coordinate area.
The timing module 111 is configured to calculate the time the anchor point dwells in a control area or a button area.
In summary, according to the interaction method on AR glasses of the embodiments of the present disclosure, the canvas follows head movement while the display screen stays static, and the display area, interaction area, and shortcut operation area are repositioned as the user moves the canvas by head control to find a control or button. The user thus sees the top status display area simply by raising the head and the shortcut operation area simply by lowering it, which makes browsing convenient, simplifies interaction, and lets all interactions be completed by head control alone.
The foregoing describes the general principles of the present disclosure in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present disclosure are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present disclosure. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the disclosure is not intended to be limited to the specific details so described.
The block diagrams of devices and apparatuses referred to in this disclosure are only illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses, and systems may be connected, arranged, or configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," and "having" are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, "and/or," unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, "such as but not limited to."
Also, as used herein, "or" in a list of items beginning with "at least one of" indicates a disjunctive list, so that, for example, "at least one of A, B, or C" means A or B or C, as well as AB, AC, BC, or ABC (i.e., A and B and C). Furthermore, the word "exemplary" does not mean that the described example is preferred or better than other examples.
It is also noted that in the systems and methods of the present disclosure, components or steps may be decomposed and/or re-combined. These decompositions and/or recombinations are to be considered equivalents of the present disclosure.
Various changes, substitutions and alterations to the techniques described herein may be made without departing from the techniques of the teachings as defined by the appended claims. Moreover, the scope of the claims of the present disclosure is not limited to the particular aspects of the process, machine, manufacture, composition of matter, means, methods and acts described above. Processes, machines, manufacture, compositions of matter, means, methods, or acts, presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding aspects described herein may be utilized. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or acts.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the disclosure to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (15)

1. A method of interaction on AR glasses, the method comprising:
acquiring and recording the real-time rotation angle of the AR glasses;
dividing a canvas into a display area, an interaction area, and a shortcut operation area, wherein the canvas moves in response to the AR glasses rotation angle;
performing an interaction according to a location of an anchor point on a display screen on the canvas.
2. The interactive method on AR glasses according to claim 1, wherein the real-time rotation angle of the AR glasses comprises a horizontal rotation angle and a vertical rotation angle.
3. The interaction method on AR glasses according to claim 1, wherein the canvas rotation direction is opposite to the AR glasses rotation direction.
4. The interaction method on AR glasses according to claim 1, wherein the canvas has a coordinate system, the content on the canvas each having a respective coordinate region.
5. The interactive method on AR glasses according to claim 1, wherein the anchor point is a center point of the display screen.
6. The interaction method on AR glasses according to claim 5, wherein said performing an interaction according to a partition of an anchor point on a display screen on the canvas comprises:
and if the anchor point is in the display area, limiting the anchor point to the upper boundary of the interactive area.
7. The interaction method on the AR glasses according to claim 5, wherein, while the canvas moves, the display area and the shortcut operation area are always horizontally centered on the display screen and move along with the canvas in the longitudinal direction; or the interaction area and the display area are always horizontally centered on the display screen and move along with the canvas in the longitudinal direction.
8. The interaction method on AR glasses according to claim 7, wherein if the anchor point is in the interaction area, the canvas is moved and the anchor point is compared with the coordinate area of a control to be interacted with in the interaction area; meanwhile, the shortcut operation area and the display area are moved in the transverse direction, opposite to the canvas displacement and by a distance equal to it, and move with the canvas in the longitudinal direction.
9. The interaction method on AR glasses according to claim 7, wherein if the anchor point is in the shortcut operation area, the canvas is moved and the anchor point is compared with the coordinate area of a button to be interacted with in the shortcut operation area; meanwhile, the interaction area and the display area are moved in the transverse direction, opposite to the canvas displacement and by a distance equal to it, and move with the canvas in the longitudinal direction.
10. The interaction method on AR glasses according to claim 8, wherein the control is executed if the dwell time of the anchor point on the coordinate area of the control exceeds a specified time.
11. The interaction method on AR glasses according to claim 9, wherein the button is executed if the dwell time of the anchor point on the coordinate area of the button exceeds a specified time.
12. An apparatus for interaction on AR glasses, comprising:
a three-axis gyroscope sensor configured to acquire and record the real-time rotation angle of the AR glasses;
a partition module configured to divide a canvas into a display area, an interaction area and a shortcut operation area;
a displacement calculation module configured to calculate the transverse displacement S and the longitudinal displacement T of the canvas from the differences between the current horizontal and vertical rotation angles and the last recorded horizontal and vertical rotation angles;
a canvas moving module configured to move the canvas in the transverse or longitudinal direction according to the result of the displacement calculation module;
a display area moving module configured to move the display area in the transverse direction, opposite to the canvas displacement and by a distance equal to it, so that the display area moves with the canvas in the longitudinal direction;
an interaction area moving module configured to, when an anchor point on a display screen is in the shortcut operation area, move the interaction area in the transverse direction, opposite to the canvas displacement and by a distance equal to it, so that the interaction area moves with the canvas in the longitudinal direction;
a shortcut operation area moving module configured to, when the anchor point is in the interaction area, move the shortcut operation area in the transverse direction, opposite to the canvas displacement and by a distance equal to it, so that the shortcut operation area moves with the canvas in the longitudinal direction; and
an anchor point limiting module configured to, when the anchor point is in the display area, limit the anchor point to the upper boundary of the interaction area.
13. The apparatus for interaction on AR glasses according to claim 12, further comprising:
a filling module configured to fill the display area, the interaction area and the shortcut operation area onto the canvas;
a comparison module configured to compare the anchor point with the coordinate area of a control or a button; and
a timing module configured to calculate the dwell time of the anchor point in the control area or the button area.
14. The apparatus for interaction on AR glasses according to claim 12, further comprising modules performing the interaction method of any one of claims 2 to 11.
15. AR glasses comprising an apparatus for interaction on AR glasses according to claim 12.
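The method claims above describe a head-rotation-driven interaction loop: the canvas shifts opposite to the rotation of the glasses by a displacement derived from the difference between the current and last recorded angles, the screen-center anchor is tested against the coordinate areas of the partitioned canvas, and a control fires after a dwell time. A minimal sketch of that loop under stated assumptions — the linear angle-to-pixel scale, the class and function names, and the example area coordinates are all illustrative, not taken from the patent:

```python
# Hypothetical sketch of the claimed interaction loop; names, the linear
# angle-to-pixel mapping, and area layout are illustrative assumptions.

PIXELS_PER_DEGREE = 10.0  # assumed scale from rotation angle to canvas pixels


class Canvas:
    """Canvas partitioned into display, interaction and shortcut areas (claim 1)."""

    def __init__(self, areas):
        # areas: name -> (x_min, y_min, x_max, y_max) in canvas coordinates
        self.areas = areas
        self.offset = [0.0, 0.0]       # transverse (S) and longitudinal (T) shift
        self.last_angles = (0.0, 0.0)  # last recorded horizontal/vertical angle

    def update(self, h_angle, v_angle):
        # Displacement from the difference between the current and last
        # recorded rotation angles (claim 12); the canvas moves opposite
        # to the rotation direction of the glasses (claim 3).
        dh = h_angle - self.last_angles[0]
        dv = v_angle - self.last_angles[1]
        self.offset[0] -= dh * PIXELS_PER_DEGREE
        self.offset[1] -= dv * PIXELS_PER_DEGREE
        self.last_angles = (h_angle, v_angle)

    def area_at(self, x, y):
        # Map a fixed screen point (the anchor, claim 5) back into canvas
        # coordinates and find which partition it falls in (claim 4).
        cx, cy = x - self.offset[0], y - self.offset[1]
        for name, (x0, y0, x1, y1) in self.areas.items():
            if x0 <= cx < x1 and y0 <= cy < y1:
                return name
        return None


def dwell_triggered(enter_time, now, threshold=1.5):
    # Claims 10-11: a control or button is executed once the anchor has
    # stayed on its coordinate area longer than a specified time
    # (the 1.5 s threshold is an assumed value).
    return now - enter_time > threshold
```

In this sketch, turning the head down moves the canvas up, so the fixed screen-center anchor effectively travels downward through the display, interaction, and shortcut operation areas in turn, matching the claimed partition-based dispatch.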
CN202010957714.7A 2020-09-14 2020-09-14 Interaction method and device on AR glasses Active CN111813230B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010957714.7A CN111813230B (en) 2020-09-14 2020-09-14 Interaction method and device on AR glasses


Publications (2)

Publication Number Publication Date
CN111813230A true CN111813230A (en) 2020-10-23
CN111813230B CN111813230B (en) 2021-03-19

Family

ID=72860133

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010957714.7A Active CN111813230B (en) 2020-09-14 2020-09-14 Interaction method and device on AR glasses

Country Status (1)

Country Link
CN (1) CN111813230B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955461A (en) * 2016-04-25 2016-09-21 乐视控股(北京)有限公司 Interactive interface management method and system
CN106998409A (en) * 2017-03-21 2017-08-01 华为技术有限公司 A kind of image processing method, head-mounted display and rendering apparatus
CN107957774A (en) * 2016-10-18 2018-04-24 阿里巴巴集团控股有限公司 Exchange method and device in virtual reality space environment
CN108073432A (en) * 2016-11-07 2018-05-25 亮风台(上海)信息科技有限公司 A kind of method for displaying user interface of head-mounted display apparatus


Also Published As

Publication number Publication date
CN111813230B (en) 2021-03-19

Similar Documents

Publication Publication Date Title
US11531791B2 (en) Virtual reality immersion with an architectural design software application
US8194073B2 (en) Image generation apparatus, image generation program, medium that records the program, and image generation method
JP5087532B2 (en) Terminal device, display control method, and display control program
EP3217258B1 (en) Method and system for editing scene in three-dimensional space
US10290155B2 (en) 3D virtual environment interaction system
EP3629133B1 (en) Interface interaction apparatus and method
CN109564495A (en) Display device, program, display methods and control device
CN102929556B (en) Method and equipment for interaction control based on touch screen
US20200294320A1 (en) Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method
CN111803945B (en) Interface rendering method and device, electronic equipment and storage medium
CN104238887B (en) The icon lookup method and device of conventional application program
CN106200899A (en) The method and system that virtual reality is mutual are controlled according to user's headwork
CN103426202A (en) Display system and display method for three-dimensional panoramic interactive mobile terminal
KR20150023702A (en) User interface interaction for transparent head-mounted displays
CN108830918A (en) For land, aerial and/or the visual manifold of crowdsourcing image zooming-out and based on the rendering of image
KR101745332B1 (en) Apparatus and method for controlling 3d image
JP6871880B2 (en) Information processing programs, information processing devices, information processing systems, and information processing methods
JP5445191B2 (en) Robot trajectory display device
CN110162258A (en) The processing method and processing device of individual scene image
CN104869317B (en) Smart machine image pickup method and device
CN111813230B (en) Interaction method and device on AR glasses
CN106844521A (en) Cross-terminal three-dimensional digital earth exchange method based on B/S framework
CN110458943A (en) Mobile object spinning solution and device, control equipment and storage medium
CN103927093B (en) A kind of information processing method and electronic equipment
CN106970734A (en) A kind of task start method and apparatus of display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant