CN109933195B - Interface three-dimensional display method and interaction system based on MR mixed reality technology - Google Patents

Interface three-dimensional display method and interaction system based on MR mixed reality technology

Info

Publication number
CN109933195B
CN109933195B (Application CN201910168623.2A)
Authority
CN
China
Prior art keywords
circular
interface
spherical model
user
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910168623.2A
Other languages
Chinese (zh)
Other versions
CN109933195A (en
Inventor
Gao Jie (高杰)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Sufen Digital Technology Co ltd
Original Assignee
Guangzhou Sufen Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Sufen Digital Technology Co ltd filed Critical Guangzhou Sufen Digital Technology Co ltd
Priority to CN201910168623.2A priority Critical patent/CN109933195B/en
Publication of CN109933195A publication Critical patent/CN109933195A/en
Application granted granted Critical
Publication of CN109933195B publication Critical patent/CN109933195B/en
Legal status: Active


Abstract

The invention discloses an interface three-dimensional display method based on MR (mixed reality) technology, comprising the following steps: establishing, on the unity3D software platform, a spherical model and a plurality of circular patches as display objects, wherein the diameter of the spherical model is larger than that of the circular patches, the circular patches are hidden inside the spherical model, and the menu interface in its initial state presents only the spherical model; receiving and recognizing a user command that triggers the interface, and controlling the circular patches to fly out of the spherical model along fixed paths and come to rest at fixed spatial positions, while the position of the spherical model remains unchanged; and displaying the content to be shown on the circular patches. By building on MR mixed reality, the invention strengthens the integration of the three-dimensional menu interface with the real scene and improves the interface's sense of depth; the layered arrangement of display objects lets the user see the displayed content intuitively and clearly; and the user can interact with the interface naturally, for a better interface experience.

Description

Interface three-dimensional display method and interaction system based on MR mixed reality technology
Technical Field
The invention relates to the technical field of mixed reality, and in particular to an interface three-dimensional display method and interaction system based on MR (mixed reality) technology.
Background
In recent years, with the rapid development of smart wearable devices, new products typified by smart glasses have gradually entered a wide range of industries. Built around the assistive idea of "freeing both hands for efficient interaction", smart wearable devices improve the working efficiency of industry personnel, are increasingly favored in industrial applications, and are driving the development of novel, practical applications. Wearable devices based on mixed reality technology have already been introduced into field service, equipment maintenance, medical care, manufacturing, and logistics.
Mixed reality (MR) technology is a further development of virtual reality: by presenting virtual scene information within the real scene, it builds an interactive feedback loop between the real world, the virtual world, and the user, enhancing the realism of the user experience. However, software systems developed for MR so far lack a sense of depth in their menu interfaces: the hierarchy is flat, the design remains stuck in the mindset of traditional two-dimensional interfaces, and the virtual-real fusion that characterizes mixed reality is not truly exploited. Moreover, the interaction logic of an interface is critical to the user experience, so an interaction system should be designed around users' habits.
Disclosure of Invention
The invention provides an interface three-dimensional display method and interaction system based on MR technology, aiming to solve the technical problems that existing menu interfaces lack a sense of depth, have a flat hierarchy, and fail to exploit the virtual-real fusion of mixed reality. Using mixed reality together with graphics processing and related techniques, a three-dimensional display method is established so that, through a mixed reality device, the user sees a menu interface that is three-dimensional, clearly layered, and dynamic.
In order to solve the above technical problem, an embodiment of the present invention provides an interface three-dimensional display method based on MR mixed reality technology, including:
establishing, on the unity3D software platform, a spherical model and a plurality of circular patches as display objects, wherein the diameter of the spherical model is larger than that of the circular patches, the circular patches are hidden inside the spherical model, and the menu interface in its initial state presents only the spherical model;
receiving and recognizing a user command that triggers the interface, and controlling the circular patches to fly out of the spherical model along fixed paths and come to rest at fixed spatial positions, while the position of the spherical model remains unchanged;
and displaying the content information to be shown on the circular patches.
Preferably, the method further comprises: arranging the circular patches directly above the spherical model in a three-layer spatial layout of upper, middle, and lower layers;
the circular patches of each layer are arranged in a circular queue around a fixed reference center, with equal spatial distance between adjacent patches;
among the three circular queues formed by the patches, the first and third layers have the same diameter, which is smaller than the diameter of the second layer.
Preferably, the method further comprises: orienting each circular patch perpendicular to the ground, namely perpendicular to the user's line-of-sight plane, so the user can view the patches conveniently.
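The hide-and-reveal mechanic above relies only on the sphere's diameter exceeding the patches': while sphere and patches share the same coordinates, every patch disc lies entirely inside the sphere and is invisible. A minimal geometric sketch in Python; the radii here are illustrative values, not taken from the patent:

```python
import math

def patch_hidden(patch_center, patch_radius, sphere_radius):
    """True when the whole patch disc lies inside a sphere centred at the
    origin, so the menu shows only the sphere. Simplified containment test:
    the farthest point of the disc is at most `patch_radius` beyond its centre."""
    return math.dist((0, 0, 0), patch_center) + patch_radius < sphere_radius

# Initial state: sphere and patches share (0, 0, 0) with r_sphere > r_patch,
# so every patch is hidden; once flown out to e.g. (5, 5, 5) it becomes visible.
assert patch_hidden((0, 0, 0), 0.5, 2.0)
assert not patch_hidden((5, 5, 5), 0.5, 2.0)
```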
The embodiment of the invention also provides an interface three-dimensional display interaction system based on MR mixed reality technology, comprising a display module, a content module, and an interaction module;
the display module is used for displaying the spherical model and the circular patch in space and displaying the content information on the circular patch after receiving the content information sent by the content module;
the content module is used for transmitting content information to be displayed to the display module so as to map the content information onto the circular patches;
the interaction module is used for recognizing the user's interface operation instructions and sending corresponding commands to the display module, where the interface operation instructions include the movement, click, and drag actions of the cursor in the MR device;
the display module is further configured to generate three-dimensional space coordinates for the spherical model and the circular patches, and to arrange them in space according to their respective coordinates to form a three-dimensional circle.
Preferably, the display module displays only the spherical model in the initial state; after the interaction module acquires and recognizes a command in which the user's cursor clicks the menu interface, it sends a corresponding command to the display module to open the menu interface and trigger the circular patches to unfold.
Preferably, by recognizing an operation in which the user's cursor moves onto a circular patch, the interaction module sends a corresponding command to the display module according to preset rules to select or rotate the circular patch.
Preferably, the interaction module is further configured to, after acquiring and recognizing an action in which the user clicks a selected circular patch, send a corresponding command to the display module to move the selected patch forward and enlarge it, so that its content is displayed to the user more clearly.
Preferably, the interaction module is further configured to, after acquiring and recognizing the user's drag-rotate action on a circular patch, send a corresponding command to the display module to rotate the three-dimensional circle, so that the user can view circular patches outside the current field of view.
Preferably, the interaction module is further configured to, when the circular patches are outside the spherical model, acquire and recognize an action in which the user moves the cursor and clicks the spherical model, and send a corresponding command to the display module to move all circular patches back inside the spherical model, closing the menu interface and restoring the initial state.
Preferably, the content information transmitted by the content module includes pictures, words or icons.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
Based on MR mixed reality, a novel three-dimensional menu interface display method and interaction system are designed, strengthening the integration of the three-dimensional menu interface with the real scene and improving the interface's sense of depth. In addition, the layered arrangement of display objects lets the user see the displayed content intuitively and clearly. Through the interaction module, the user can interact with the interface naturally, for a better interface experience.
Drawings
Fig. 1 is a menu interface display effect diagram based on MR mixed reality technology according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an interactive interface system according to an embodiment of the present invention;
fig. 3 is a specific flowchart of an interactive system according to a preferred embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 illustrates the menu interface effect based on MR mixed reality technology according to an embodiment of the present invention. In a preferred embodiment, as shown in fig. 1, the interface is displayed in three-dimensional space: it occupies the same spatial dimension as the user, yet the two remain independent bodies. By integrating MR technology, the interface is projected into the real scene through the MR device, and the user can view it from all around, which enhances the sense of reality of the virtual interface. In the figure, 1 denotes the spherical model and 2 denotes one of the circular patches. In this embodiment, the initial coordinates of the spherical model and all circular patches are (0, 0, 0), and the radius of the spherical model is larger than that of the circular patches, i.e., r_sphere > r_circle. When the menu interface is unfolded, all circular patches move to their preset spatial positions above the spherical model 1 and form a three-layer spatial layout of upper, middle, and lower layers; in this example, the coordinates of one circular patch change from (0, 0, 0) to (5, 5, 5). As shown, the circular patches form circular queues at fixed spatial coordinates around the reference centers (0, 0, 5), (0, 0, 10), and (0, 0, 15), with equal spatial distance between adjacent patches. The radii of the three circular queues satisfy r1 = r3 < r2: the first and third layers are equal and smaller than the second. In this embodiment, each circular patch is perpendicular to the ground, i.e., perpendicular to the user's line-of-sight plane, which makes it easier for the user to view the content on the patches.
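The three-layer ring layout described above can be sketched numerically. The ring centers (0, 0, 5), (0, 0, 10), (0, 0, 15) come from the embodiment; the ring radii and patch counts below are illustrative assumptions, since the patent only requires r1 = r3 < r2 and equal spacing between adjacent patches:

```python
import math

def ring_positions(center, radius, count):
    """Evenly place `count` patch centres on a horizontal circle around `center`,
    which makes the spatial distance between adjacent patches equal."""
    cx, cy, cz = center
    return [(cx + radius * math.cos(2 * math.pi * i / count),
             cy + radius * math.sin(2 * math.pi * i / count),
             cz)
            for i in range(count)]

# Ring centres are from the embodiment; radii (r1 == r3 < r2) and patch
# counts are assumptions for illustration.
layers = [((0, 0, 5), 2.0, 6),
          ((0, 0, 10), 3.0, 8),
          ((0, 0, 15), 2.0, 6)]
layout = [ring_positions(c, r, n) for c, r, n in layers]
```

Because the patches sit at equal angular steps, the chord distance between every pair of adjacent patches in a ring is identical, satisfying the equal-spacing requirement.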
The menu interface layout method provided by the invention can be widely applied to computers, mobile terminals, and virtual reality, augmented reality, and mixed reality devices. Applied to an MR device, for example, the device projects the interface into the real scene while tracking the user's spatial position, so the user can freely walk around the interface in the real environment to view it and interact with it. Such a three-dimensional menu interface presents its objects clearly, is simple to operate, and offers a stronger sense of reality.
Fig. 2 is a schematic structural diagram of an interactive interface system according to an embodiment of the present invention. As shown in fig. 2, the interactive system structure of the present invention includes a presentation module 110, a content module 120, and an interactive module 130.
First, the display module 110 displays the spherical model and all circular patches in space, presenting the three-dimensional, layered menu interface to the user. The display module 110 mainly assigns three-dimensional space coordinates to the spherical model and the circular patches and lays them out in space accordingly. The circular patches form the upper-middle-lower three-layer layout at fixed positions, with the front of each patch facing outward so that the user can see the content mapped onto it.
The content module 120 mainly stores the related system content, which may be icons or software entries; this content is presented in the display module 110 by establishing a mapping relationship with the circular patches.
The interaction module 130 mainly recognizes the user's interface operations and realizes the corresponding functions by identifying the movement, click, and drag actions of the cursor in the MR device, giving the user a better experience.
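The cooperation of the three modules can be sketched as a toy object model; the class and method names below are invented for illustration and do not come from the patent:

```python
class ContentModule:
    """Stores the system content (pictures, words, icons) mapped onto patches."""
    def __init__(self, items):
        self.items = items                    # patch_id -> content

    def content_for(self, patch_id):
        return self.items.get(patch_id, "")


class DisplayModule:
    """Holds the scene state: the sphere plus patches with 3D coordinates,
    and shows the content supplied by the content module."""
    def __init__(self, content_module):
        self.content_module = content_module
        self.patches = {}                     # patch_id -> (x, y, z)
        self.menu_open = False

    def label(self, patch_id):
        return self.content_module.content_for(patch_id)


class InteractionModule:
    """Turns recognised cursor actions (move, click, drag) into display commands."""
    def __init__(self, display):
        self.display = display

    def click_sphere(self, target_positions):
        if not self.display.menu_open:        # open: patches fly out to preset spots
            self.display.patches = dict(target_positions)
            self.display.menu_open = True
        else:                                 # close: patches return inside the sphere
            self.display.patches = {}
            self.display.menu_open = False
```

Clicking the sphere toggles the menu: the first click expands the patches to their preset coordinates, a second click hides them again, mirroring steps S110 and S170 below.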
Fig. 3 is a specific flowchart of an interactive system according to a preferred embodiment of the present invention. As shown in fig. 3, the interface interaction method provided for the preferred embodiment includes:
step S110: only the spherical model exists in the initial state, and when a user cursor moves to the spherical model and clicks, the menu interface can be triggered to be opened.
Step S120: all circular patches fly out of the spherical model and form three layers of circular queues in three-dimensional space according to each patch's three-dimensional coordinates.
Step S130: when the user moves the cursor onto a circular patch, judge whether a single click is performed. If so, go to step S140; otherwise, go to step S150.
Step S140: a single click moves the selected circular patch forward, mainly by increasing its X- and Y-axis coordinates; in this embodiment the patch's coordinates change from (5, 5, 5) to (8, 8, 5). In addition, the radius of the patch is increased so that the user can clearly see the content displayed on it.
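Step S140 amounts to a coordinate shift plus a scale-up. In this sketch, the +3 offset on X and Y reproduces the embodiment's (5, 5, 5) to (8, 8, 5) move, while the 1.5 scale factor is an assumption, since the patent only says the radius grows:

```python
def bring_forward(position, radius, offset=3.0, scale=1.5):
    """Single-click response of step S140: shift the selected patch toward
    the user by increasing its X and Y coordinates, and enlarge its radius.
    The offset matches the embodiment; the scale factor is assumed."""
    x, y, z = position
    return (x + offset, y + offset, z), radius * scale

new_pos, new_r = bring_forward((5, 5, 5), 1.0)
assert new_pos == (8, 8, 5) and new_r > 1.0
```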
Step S150: with the cursor resting on a circular patch, judge whether a rotation operation is performed. If so, go to step S160; otherwise, return to step S120.
Step S160: when the user performs the rotation gesture, all circular patches in the same circular queue rotate together in the same direction, so patches at other positions can be viewed by rotating the queue. When the user gestures to the left, the circular queue rotates clockwise; when the user gestures to the right, it rotates counterclockwise.
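The queue rotation of step S160 can be modeled as rotating every patch position about the ring's vertical axis through its center; the sign convention below is an assumption made for this sketch:

```python
import math

def rotate_ring(positions, center, angle):
    """Step S160: rotate every patch of one circular queue together about the
    queue's vertical axis through `center`. In this sketch a positive angle
    is counter-clockwise seen from above (a rightward gesture) and a negative
    angle is clockwise (a leftward gesture) -- an assumed convention."""
    cx, cy, _ = center
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    return [(cx + (x - cx) * cos_a - (y - cy) * sin_a,
             cy + (x - cx) * sin_a + (y - cy) * cos_a,
             z)
            for x, y, z in positions]
```

Because every position in the queue is transformed by the same rotation, the equal spacing between adjacent patches is preserved while the whole ring turns.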
Step S170: when the user wants to close the menu interface and restore the initial state, they simply move the cursor to the spherical model and click; all circular patches then move back inside the spherical model and are hidden.
In conclusion, the invention's beneficial effects are as follows: based on MR mixed reality technology, a novel three-dimensional menu interface display method and interaction system are designed, strengthening the integration of the three-dimensional menu interface with the real scene and improving the interface's sense of depth. In addition, the layered arrangement of display objects lets the user see the displayed content intuitively and clearly. In the preferred embodiment, the interaction module lets the user interact with the interface naturally, for a better interface experience.
The interface display method and interaction system based on MR mixed reality technology provided by the present application have been described in detail above with specific examples. The description of the steps is intended only to help in understanding the method and its core idea, and the embodiments above further detail the purpose, technical solution, and beneficial effects of the invention. It should be understood that the above is only a specific example of the invention and is not intended to limit its scope. Any modifications, equivalents, and improvements made within the spirit and principle of the invention are intended to fall within its protection scope.

Claims (9)

1. An interface three-dimensional display method based on an MR mixed reality technology is characterized by comprising the following steps:
establishing, on the unity3D software platform, a spherical model and a plurality of circular patches as display objects, wherein the diameter of the spherical model is larger than that of the circular patches, the circular patches are hidden inside the spherical model, and the menu interface in its initial state presents only the spherical model;
receiving and recognizing a user command that triggers the interface, and controlling the circular patches to fly out of the spherical model along fixed paths and come to rest at fixed spatial positions, while the position of the spherical model remains unchanged;
displaying the content information to be shown on the circular patches;
the circular patches are arranged directly above the spherical model and form a three-layer spatial layout of upper, middle, and lower layers;
the circular patches of each layer are arranged in a circular queue around a fixed reference center, with equal spatial distance between adjacent patches;
among the three circular queues formed by the plurality of circular patches, the first and third layers have the same diameter, which is smaller than the diameter of the second layer.
2. The MR mixed reality technology-based interface three-dimensional display method according to claim 1, further comprising: orienting each circular patch perpendicular to the ground, namely perpendicular to the user's line-of-sight plane, so the user can view the patches conveniently.
3. An interactive system based on the MR mixed reality technology-based interface stereoscopic display method of claim 1, which is characterized by comprising a display module, a content module and an interactive module;
the display module is used for displaying the spherical model and the circular patch in space and displaying the content information on the circular patch after receiving the content information sent by the content module;
the content module is used for transmitting content information to be displayed to the display module so as to map the content information onto the circular patches;
the interaction module is used for identifying an interface operation instruction of a user and sending a corresponding instruction to the display module, wherein the interface operation instruction of the user comprises the actions of identifying the movement, the click and the drag of a cursor in the MR equipment;
the display module is further configured to generate three-dimensional space coordinates of the spherical model and the circular patch, and arrange the spherical model and the circular patch in space according to the respective three-dimensional space coordinates to form a solid circle.
4. The MR mixed reality technology-based interface stereoscopic display interaction system according to claim 3, wherein the display module displays only the spherical model in an initial state, and after the interaction module obtains and recognizes a menu interface command clicked by a user cursor, sends a corresponding command to the display module to execute a menu interface opening action and trigger the circular patch to unfold.
5. The interface stereoscopic display interaction system based on the MR mixed reality technology as claimed in claim 4, wherein the interaction module sends a corresponding instruction to the display module according to a preset rule to perform the selection or rotation operation of the circular patch by recognizing an operation instruction that a user cursor moves to the circular patch.
6. The interface stereoscopic display interactive system based on the MR mixed reality technology as claimed in claim 5, wherein the interactive module is further configured to, after acquiring and recognizing an action instruction that the user selects to click on the circular patch, send a corresponding instruction to the display module to control the selected circular patch to move forward and become larger, so that the content on the circular patch is displayed to the user more clearly.
7. The MR mixed reality technology-based interface stereoscopic display interaction system according to claim 5, wherein the interaction module is further configured to, after acquiring and recognizing a dragging and rotating instruction of the user on the circular patch, send a corresponding instruction to the display module to execute an action of rotating the stereoscopic circle, so that the user views the circular patch outside the field of view.
8. The MR mixed reality technology-based interface stereoscopic display interaction system according to claim 3, wherein the interaction module is further configured to, when the circular patches are outside the spherical model, obtain and recognize an action instruction in which the user moves the cursor and clicks the spherical model, and send a corresponding instruction to the display module to control all the circular patches to move back inside the spherical model, so that the menu interface is closed and the initial state is restored.
9. The MR mixed reality technology-based interface stereoscopic presentation interactive system according to claim 3, wherein the content information transmitted by the content module includes pictures, words or icons.
CN201910168623.2A 2019-03-06 2019-03-06 Interface three-dimensional display method and interaction system based on MR mixed reality technology Active CN109933195B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910168623.2A CN109933195B (en) 2019-03-06 2019-03-06 Interface three-dimensional display method and interaction system based on MR mixed reality technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910168623.2A CN109933195B (en) 2019-03-06 2019-03-06 Interface three-dimensional display method and interaction system based on MR mixed reality technology

Publications (2)

Publication Number Publication Date
CN109933195A CN109933195A (en) 2019-06-25
CN109933195B true CN109933195B (en) 2022-04-22

Family

ID=66986459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910168623.2A Active CN109933195B (en) 2019-03-06 2019-03-06 Interface three-dimensional display method and interaction system based on MR mixed reality technology

Country Status (1)

Country Link
CN (1) CN109933195B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256185B (en) * 2020-10-23 2022-09-30 广东智源机器人科技有限公司 Display method and device of 3D menu, processor and display equipment

Citations (5)

Publication number Priority date Publication date Assignee Title
CN103324400A (en) * 2013-07-15 2013-09-25 天脉聚源(北京)传媒科技有限公司 Method and device for displaying menus in 3D model
CN103503030A (en) * 2012-03-23 2014-01-08 松下电器产业株式会社 Image processing device for specifying depth of object present in real space by performing image processing, stereoscopic viewing device, integrated circuit, and program
CN105677275A (en) * 2015-12-31 2016-06-15 北京小鸟看看科技有限公司 Interface layout method and wraparound interface system
CN106940477A (en) * 2017-03-14 2017-07-11 联想(北京)有限公司 A kind of control method and electronic equipment
CN108803876A (en) * 2018-06-08 2018-11-13 华北水利水电大学 Hydraulic engineering displaying exchange method based on augmented reality and system

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN102819315B (en) * 2012-07-23 2016-04-13 中兴通讯股份有限公司 A kind of 3D man-machine interaction method and system
CN103218125A (en) * 2013-04-18 2013-07-24 广东欧珀移动通信有限公司 Operation method and system for sliding menu and mobile terminal


Also Published As

Publication number Publication date
CN109933195A (en) 2019-06-25

Similar Documents

Publication Publication Date Title
Grossman et al. Multi-finger gestural interaction with 3d volumetric displays
US10890983B2 (en) Artificial reality system having a sliding menu
CN103955308B (en) Touch interaction with a curved display
US10417812B2 (en) Systems and methods for data visualization using three-dimensional displays
Deering The HoloSketch VR sketching system
US10192363B2 (en) Math operations in mixed or virtual reality
US11551403B2 (en) Artificial reality system architecture for concurrent application execution and collaborative 3D scene rendering
US20160350972A1 (en) Multidimensional graphical method for entering and exiting applications and activities in immersive media
US20150067603A1 (en) Display control device
US11893702B2 (en) Virtual object processing method and apparatus, and storage medium and electronic device
US20140075370A1 (en) Dockable Tool Framework for Interaction with Large Scale Wall Displays
CN109725956B (en) Scene rendering method and related device
CN105808071A (en) Display control method and device and electronic equipment
US20080252661A1 (en) Interface for Computer Controllers
Thomas et al. Spatial augmented reality—A tool for 3D data visualization
CN114089784B (en) Unmanned aerial vehicle control method and system based on MR glasses
CN109933195B (en) Interface three-dimensional display method and interaction system based on MR mixed reality technology
Andujar et al. A cost-effective approach for developing application-control GUIs for virtual environments
CN117130518A (en) Control display method, head display device, electronic device and readable storage medium
CN109375866B (en) Screen touch click response method and system for realizing same
US20230147561A1 (en) Metaverse Content Modality Mapping
US20240143126A1 (en) Display method, apparatus, and electronic device
Xiao et al. Design of Hololens-based Scene System for Spacecraft Simulation
Bauer Large Display Interaction Using Mobile Devices
Selvi et al. GAMIFIED MOBILE HANDHELD DEVICE USING AUGMENTED REALITY

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant