CN116520997B - Mixed reality enhanced display and interaction system - Google Patents

Mixed reality enhanced display and interaction system

Info

Publication number
CN116520997B
CN116520997B (application CN202310815517.5A)
Authority
CN
China
Prior art keywords
display
layer
interface
interaction
functional domain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310815517.5A
Other languages
Chinese (zh)
Other versions
CN116520997A (en)
Inventor
戴健
吴锐
刘歆浏
祝本明
任珍文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China South Industries Group Automation Research Institute
Original Assignee
China South Industries Group Automation Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China South Industries Group Automation Research Institute filed Critical China South Industries Group Automation Research Institute
Priority to CN202310815517.5A priority Critical patent/CN116520997B/en
Publication of CN116520997A publication Critical patent/CN116520997A/en
Application granted granted Critical
Publication of CN116520997B publication Critical patent/CN116520997B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06F: Electric digital data processing
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: Physics
    • G06: Computing; calculating or counting
    • G06F: Electric digital data processing
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G: Physics
    • G06: Computing; calculating or counting
    • G06F: Electric digital data processing
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01: Indexing scheme relating to G06F 3/01
    • G06F 2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02D: Climate change mitigation technologies in information and communication technologies [ICT], i.e. information and communication technologies aiming at the reduction of their own energy use
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a mixed reality enhanced display and interaction system comprising a physical environment layer, a system driver layer, a data storage layer, an AI framework layer, a service component layer, and an application layer, all of which can be arranged on the same wearable device; the application belongs to the technical field of mixed reality devices. The system improves and extends the virtual-real interactive display capability, the see-through near-eye display capability, and the virtual-real fused augmented display capability of AR/MR devices, and substantially advances the application of augmented reality and mixed reality technologies to operation guidance and control on industrial production lines, to maintenance and support of special equipment, and to a variety of natural-environment scenarios. In addition, the system offers low cost, low power consumption, and strong real-time performance.

Description

Mixed reality enhanced display and interaction system
Technical Field
The application relates to the technical field of mixed reality, in particular to a mixed reality enhanced display and interaction system.
Background
Judging from current global development trends, the future world will become increasingly intelligent, networked, unmanned, and diversified. Mixed reality (MR) technology introduces real-scene information into the virtual environment, building a bridge of interactive feedback among the virtual world, the real world, and the user, thereby strengthening the realism of the user experience. Mixed reality is a deepening and extension of virtual reality and augmented reality technologies.
Mixed reality technology can overlay the real environment and virtual information in the same picture and space and realize fusion and interaction between them, giving the user a sensory experience that goes beyond reality. When a user wears a head-mounted mixed reality display device, the user's head position must be tracked in real time and the model rendering adjusted accordingly, so that the three-dimensional model can be viewed from multiple angles. The user can issue simple commands to the system through gestures, such as selection, movement, and deletion, and can also express more complex intents, such as switching the current interaction scene, controlling virtual objects, or performing virtual actions.
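To make the distinction between simple gesture commands and complex intents concrete, here is a minimal dispatch sketch in Python. The gesture names and the mapping are purely illustrative assumptions; the application does not specify a gesture vocabulary.

```python
# Hypothetical mapping from recognized gestures to commands/intents.
# Both dictionaries are illustrative; they are not taken from the patent.
SIMPLE_COMMANDS = {"pinch": "select", "drag": "move", "swipe_away": "delete"}
COMPLEX_INTENTS = {"palm_open_hold": "switch_scene", "grab_rotate": "control_object"}

def dispatch_gesture(gesture: str) -> str:
    """Return the command or intent that a recognized gesture maps to."""
    if gesture in SIMPLE_COMMANDS:
        return SIMPLE_COMMANDS[gesture]
    if gesture in COMPLEX_INTENTS:
        return COMPLEX_INTENTS[gesture]
    return "ignore"  # unrecognized gestures are dropped
```

In a real system the keys would come from a gesture recognizer (e.g. the TOF camera pipeline described later), and the values would be routed to the interaction scene manager.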
Virtual-real three-dimensional scene matching is an emerging display technology whose application demand has grown with the development of computer software and hardware. Virtual-real fusion blends the virtual environment into the real scene around the user, providing an intuitive, augmented experience, while three-dimensional matching offers greater freedom of operation in three-dimensional space and thus a more intuitive and realistic feel.
Low-level mixed reality interaction depends on the hardware device: the device's performance and sensor types determine the basic interaction modes available to it. However, existing augmented/mixed reality (AR/MR) devices integrate products and functions poorly, so they are often used for technical and functional demonstrations, with few engineered, practical products. As mixed reality augmented display technology develops, its application is no longer limited to a specific object or a specific environment; it must be able to handle a variety of complex and diverse environments.
Therefore, how to provide an augmented/mixed reality (AR/MR) device with a high degree of functional integration is a technical problem that needs to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the present application provides a mixed reality enhanced display and interaction system for overcoming, or at least partially solving, the above problems.
The application provides the following scheme:
a mixed reality enhanced display and interaction system, comprising:
a physical environment layer configured to provide system hardware platform support;
a system driver layer configured to provide whole-machine power management and basic file system services, and to provide the system running environment and software/hardware resource management support for the upper database, AI framework, service component, and application layers;
a data storage layer configured to support database creation, query, deletion, and modification functions for the AI framework layer, the service components, and the applications;
an AI framework layer configured to provide core algorithm support for the system's intelligent service components and applications;
a service component layer configured to connect the AI core algorithms, data storage, network switching, and sensing devices, while providing the system with interoperable, reusable, and composable service platforms;
and an application layer configured for mixed-reality-based augmented display and interaction.
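The six-layer stack enumerated above can be sketched as follows. The layer names follow the text; the one-line responsibility strings are paraphrases, and the helper function is an illustrative addition.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Layer:
    name: str
    responsibility: str

# Bottom-to-top stack, as described in the disclosure.
STACK = [
    Layer("physical environment", "system hardware platform support"),
    Layer("system driver", "power management, file system, resource management"),
    Layer("data storage", "database create/query/delete/modify support"),
    Layer("AI framework", "core algorithm support"),
    Layer("service component", "interoperable, reusable, composable services"),
    Layer("application", "mixed-reality augmented display and interaction"),
]

def layer_above(name: str) -> Optional[str]:
    """Return the name of the layer directly above `name`, or None at the top."""
    names = [layer.name for layer in STACK]
    i = names.index(name)
    return names[i + 1] if i + 1 < len(names) else None
```

Each layer serves the one above it, e.g. `layer_above("AI framework")` yields `"service component"`.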
Preferably: the physical environment layer comprises an information processing functional domain, a perception functional domain, a human-computer interaction functional domain and an integrated matching functional domain, wherein the information processing functional domain is connected with the perception functional domain through a scout image interface and a frame synchronization control interface, and the information processing functional domain is connected with the human-computer interaction functional domain through a display interface, an audio input/output interface, a gesture image interface and a functional key interface.
Preferably: the information processing functional domain comprises an information processing unit formed by constructing an ARM framework core processor, a RAM memory, a ROM memory and a power management IC.
Preferably: the ARM framework core processor expands internal sensing components such as a geomagnetic sensor, a gyroscope, an accelerometer, an ambient light sensor and the like and a Beidou positioning module through an internal interface, and is used for acquiring ambient light, gesture, position, direction and ambient state information; the ARM architecture core processor expands the Wifi/BT wireless communication module, and is externally connected with a Wifi/BT antenna through a radio frequency interface, so that the wireless networking communication function with other informationized modules is realized.
Preferably: the sensing functional domain comprises two sensor modules, the two sensor modules are connected to the ARM framework core processor through the MIPI_CSI interface, and the ARM framework core processor expands a frame synchronization control interface through the GPIO interface to realize time synchronization of signals acquired by the two sensor modules.
Preferably: the man-machine interaction functional domain comprises a functional key, a TOF camera sensing module and a binocular display information functional module, wherein the binocular display information functional module comprises an earphone, a microphone and two optical waveguide modules.
Preferably: the ARM framework core processor realizes the earphone and microphone analog audio expansion through an audio input/output interface, realizes function key expansion through a GPIO interface, realizes TOF camera sensing module expansion through an MIPI_CSI interface and an SPI depth sensing information interface, is connected with a binocular display driving assembly through an MIPI_DSI display interface, performs binocular display distribution of MIPI signals through the display driving assembly, and drives two optical waveguide module binocular imaging.
Preferably: the optical waveguide module comprises a tripolar photoelectric holographic optical waveguide display module.
Preferably: the system driving layer comprises a mobile terminal operating system kernel and equipment driving software; the data storage layer comprises a database system, a management tool and various forms and records; the AI framework layer comprises an algorithm framework, a network model and an image, voice, text and situation type algorithm; the service component layer comprises a system service component, a device service component and an application service component; the application layer comprises mixed reality-based enhanced display and interaction system diversified application software.
According to the specific embodiment provided by the application, the application discloses the following technical effects:
The mixed reality enhanced display and interaction system provided by the embodiments of the application combines mixed reality, three-dimensional reconstruction, gesture recognition, binocular imaging, human-computer interaction, and optical waveguide display technologies to form a system that integrates informatized functions such as information processing, fused display, human-machine control, and information communication, together with an eye-protection function. It improves and extends the virtual-real interactive display capability, the see-through near-eye display capability, and the virtual-real fused augmented display capability of AR/MR devices, and substantially advances the application of augmented reality and mixed reality technologies to operation guidance and control on industrial production lines, to maintenance and support of special equipment, and to a variety of natural-environment scenarios. In addition, the system offers low cost, low power consumption, and strong real-time performance.
Of course, it is not necessary for any one product to practice the application to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required to be used in the embodiments will be briefly described below. It is evident that the drawings in the following description are only some embodiments of the present application and that other drawings may be obtained from these drawings by those of ordinary skill in the art without inventive effort.
FIG. 1 is a block diagram of a mixed reality enhanced display and interaction system provided by an embodiment of the present application;
FIG. 2 is a schematic block diagram of the system according to an embodiment of the present application;
fig. 3 is a diagram of a system internal interface relationship provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which are derived by a person skilled in the art based on the embodiments of the application, fall within the scope of protection of the application.
Referring to fig. 1, fig. 2, and fig. 3, a mixed reality enhanced display and interaction system provided in an embodiment of the present application may include:
a physical environment layer configured to provide system hardware platform support. Further, the physical environment layer comprises an information processing functional domain, a perception functional domain, a human-computer interaction functional domain, and an integration-and-matching functional domain. The information processing functional domain is connected with the perception functional domain through a scout image interface and a frame synchronization control interface, and with the human-computer interaction functional domain through a display interface, an audio input/output interface, a gesture image interface, and a function key interface. The information processing functional domain comprises an information processing unit built from an ARM-architecture core processor, RAM, ROM, and a power management IC. The core processor extends, through internal interfaces, internal sensing components such as a geomagnetic sensor, a gyroscope, an accelerometer, and an ambient light sensor, together with a BeiDou positioning module, to acquire ambient light, attitude, position, orientation, and environment-state information; it also extends a WiFi/BT wireless communication module and connects to an external WiFi/BT antenna through a radio-frequency interface, realizing wireless networking with other informatized modules. The perception functional domain comprises two sensor modules connected to the core processor through MIPI_CSI interfaces; the core processor extends a frame synchronization control interface through a GPIO interface to time-synchronize the signals acquired by the two sensor modules.
The human-computer interaction functional domain comprises function keys, a TOF camera sensing module, and a binocular display information functional module, the latter comprising an earphone, a microphone, and two optical waveguide modules. The core processor extends earphone and microphone analog audio through an audio input/output interface, extends the function keys through a GPIO interface, extends the TOF camera sensing module through a MIPI_CSI interface and an SPI depth-sensing information interface, and connects to a binocular display driving assembly through a MIPI_DSI display interface; the display driving assembly distributes the MIPI signals for binocular display and drives binocular imaging in the two optical waveguide modules. The optical waveguide module comprises a tripolar photoelectric holographic optical waveguide display module.
a system driver layer configured to provide whole-machine power management and basic file system services, and to provide the system running environment and software/hardware resource management support for the upper database, AI framework, service component, and application layers;
a data storage layer configured to support database creation, query, deletion, and modification functions for the AI framework layer, the service components, and the applications;
an AI framework layer configured to provide core algorithm support for the system's intelligent service components and applications;
a service component layer configured to connect the AI core algorithms, data storage, network switching, and sensing devices, while providing the system with interoperable, reusable, and composable service platforms;
and an application layer configured for mixed-reality-based augmented display and interaction.
Further, the system driver layer comprises a mobile-terminal operating system kernel and device driver software; the data storage layer comprises a database system, management tools, and various forms and records; the AI framework layer comprises an algorithm framework, network models, and image, voice, text, and situation algorithms; the service component layer comprises system service, device service, and application service components; and the application layer comprises diversified application software for mixed-reality-based enhanced display and interaction.
The system provided by the application can be used in civilian settings. It aims to help the user obtain an excellent visual experience, builds an open AI application computing service architecture, and realizes the extension of various AI application functions by means of advanced computing architectures such as big data and edge computing, thereby improving efficiency of use.
The mixed reality enhanced display and interaction system provided by the embodiments of the application adopts an integrated design: the wearable device follows modular, generalized, software/hardware-decoupled design principles, forming a horizontally partitioned, vertically decoupled software and hardware architecture that is portable, scalable, and interoperable. The various tiers may be arranged on the same wearable device. From bottom to top, the system consists of six levels: physical environment, system driver, data storage, AI framework, service component, and application.
First, the physical environment layer.
The physical environment layer consists mainly of the perception, information control, and human-computer interaction functional components that make up the system, and provides the hardware platform support for the whole system.
It comprises four functional domains: perception, display and interaction, information control, and integration and matching.
The information processing functional domain is the information processing core functional module of the mixed reality enhanced display and interaction system. It adopts an ARM-architecture core processor and, together with RAM, ROM, and a power management IC, constitutes a minimal information processing system. The core processor extends, through internal interfaces, internal sensing components such as a geomagnetic sensor, a gyroscope, an accelerometer, and an ambient light sensor, together with a BeiDou positioning module, to acquire ambient light, attitude, position, orientation, and other self and environment state information. It also extends a WiFi/BT wireless communication module and connects to an external WiFi/BT antenna through a radio-frequency interface, realizing wireless networking with other informatized modules.
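The attitude information mentioned above is typically derived by fusing gyroscope and accelerometer readings. The application does not specify a fusion algorithm; the complementary filter below is one common approach, shown purely as an assumed illustration (the blend factor and axis convention are hypothetical).

```python
import math

def complementary_pitch(prev_pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Estimate pitch (radians) by blending two sources:
    - short-term: integrated gyroscope angular rate (low drift over dt),
    - long-term: gravity direction from the accelerometer (noisy but unbiased).
    alpha close to 1 trusts the gyro; (1 - alpha) corrects drift with gravity."""
    gyro_pitch = prev_pitch + gyro_rate * dt
    accel_pitch = math.atan2(accel_x, accel_z)
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

In practice this would run per IMU sample, with roll and yaw handled analogously (yaw additionally corrected by the geomagnetic sensor).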
The perception functional domain consists of two sensor modules, which are connected to the core processor in the information processing functional domain through MIPI_CSI interfaces; the core processor extends a frame synchronization control interface through a GPIO (general-purpose input/output) port to time-synchronize the signals acquired by the two sensor modules.
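A software-side analogue of this frame synchronization is pairing the two modules' frames by timestamp. The tolerance value and the frame representation below are assumptions for illustration only.

```python
def pair_frames(left, right, tol_us=500):
    """Pair (timestamp_us, frame_id) entries from two sensor streams whose
    timestamps differ by at most tol_us. Both lists are assumed sorted by time."""
    pairs, j = [], 0
    for t_left, frame_left in left:
        # skip right-stream frames that are too old to match this left frame
        while j < len(right) and right[j][0] < t_left - tol_us:
            j += 1
        if j < len(right) and abs(right[j][0] - t_left) <= tol_us:
            pairs.append((frame_left, right[j][1]))
            j += 1
    return pairs
```

With hardware frame synchronization over GPIO, the residual timestamp skew is small, so a tight tolerance suffices; without it, unmatched frames would simply be dropped.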
The human-computer interaction functional domain consists of functional components such as function keys, TOF camera sensing, and binocular display information, and realizes key and gesture instruction input as well as audio and optical-waveguide display output. The core processor extends analog audio devices such as headphones and microphones through the audio input/output interface; extends the function keys through a GPIO interface; extends the TOF camera sensing module through a MIPI_CSI interface and an SPI depth-sensing information interface; and connects to the binocular display driving assembly through the MIPI_DSI display interface, which distributes the MIPI signals for binocular display and drives binocular imaging in the two optical waveguide modules.
Second, the system driver layer.
The system driver layer consists of the mobile-terminal operating system kernel and device driver software. It provides device driver services for each functional device, communication interface, and sensor assembly; provides whole-machine power management and basic file system services; and provides the system running environment and software/hardware resource management support for the upper database, AI framework, service component, and application layers.
Third, the data storage layer.
The data storage layer consists of a database system, management tools, and various forms and records. It supports basic operations such as database creation, query, deletion, and modification for the upper AI framework, service components, and applications, and, according to data type and professional scenario requirements, can be divided into seven logical databases: a geographic information resource database, a primitive information database, a data dictionary, an AI algorithm model base, an environment/target element model base, a business application database, and a process information database.
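As an illustrative sketch (the application names only the logical partitions, not their schemas), the seven logical databases could be realized as tables in a single SQLite store; the key/value schema here is a hypothetical placeholder.

```python
import sqlite3

# The seven logical databases named in the text; identifiers are paraphrased.
LOGICAL_DBS = [
    "geographic_information_resource", "primitive_information", "data_dictionary",
    "ai_algorithm_model", "environment_target_element_model",
    "business_application", "process_information",
]

def create_stores(conn):
    """Create one table per logical database and return the sorted table names."""
    for name in LOGICAL_DBS:
        conn.execute(
            f"CREATE TABLE IF NOT EXISTS {name} (key TEXT PRIMARY KEY, value BLOB)"
        )
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"
    ).fetchall()
    return sorted(r[0] for r in rows)
```

On the embedded platform such stores would live in flash-backed files rather than memory, but the create/query/delete/modify surface exposed to the upper layers is the same.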
Fourth, the AI framework layer.
The AI framework layer consists of an algorithm framework, network models, and image, voice, text, and situation algorithm components, and provides core algorithm support for the intelligent service components and applications of the mixed-reality-based enhanced display and interaction system. It supports an edge-device computing architecture and external computing-power expansion, realizing collaborative computing according to the computing task.
Fifth, the service component layer.
The service component layer is the most important layer in the mixed-reality-based enhanced display and interaction system. It is the tie that connects resources such as the AI core algorithms, data storage, network switching, and sensing devices; it provides a generalized, flexible service platform for the system and realizes interoperability, reusability, and composability among the various resources. The service component layer comprises three major groups of components (system services, device services, and application services), as well as a cross-level AI computing task migration scheduling service that supports the edge-device computing architecture and external computing-power expansion to realize collaborative computing.
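In a minimal hypothetical form, the cross-level task migration scheduling service decides between on-device execution and edge offload. The cost model, field names, and thresholds below are illustrative assumptions; the application does not disclose a scheduling policy.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    gflops: float            # estimated compute demand (assumed metric)
    latency_budget_ms: float # deadline the result must meet

def schedule(task, device_free_gflops, edge_rtt_ms):
    """Return 'device' or 'edge' for a single AI computing task."""
    if task.gflops <= device_free_gflops:
        return "device"                    # fits locally: avoid network cost
    if edge_rtt_ms < task.latency_budget_ms:
        return "edge"                      # offload when the round trip fits the budget
    return "device"                        # degrade locally rather than miss the deadline
```

A production scheduler would also weigh energy, link quality, and queueing at the edge node, but the device-first, deadline-aware shape is typical of edge-collaborative designs.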
Sixth, the application layer.
The application layer consists of the system interface and various terminal application software. Built on the service component layer, it comprises diversified application software for mixed-reality-based enhanced display and interaction, mainly including system management, media, perception, situation-map, and information applications.
The information control functional domain is connected with the perception functional domain through a scout image interface and a frame synchronization control interface, and with the human-computer interaction functional domain through a display interface, an audio input/output interface, a gesture image interface, and a function key interface.
Throughout the overall design process, emphasis is placed on autonomous and controllable design work; autonomous, controllable designs of the optical waveguide display module, the embedded processing platform, and the audio/video sensors can be provided, ensuring that the design meets the autonomy and controllability requirements.
Table 1: Autonomous and controllable design of core devices and components
In summary, the mixed reality enhanced display and interaction system provided by the application combines mixed reality, three-dimensional reconstruction, gesture recognition, binocular imaging, human-computer interaction, and optical waveguide display technologies to form a system that integrates informatized functions such as information processing, fused display, human-machine control, and information communication, together with an eye-protection function. It improves and extends the virtual-real interactive display capability, the see-through near-eye display capability, and the virtual-real fused augmented display capability of AR/MR devices, and substantially advances the application of augmented reality and mixed reality technologies to operation guidance and control on industrial production lines, to maintenance and support of special equipment, and to a variety of natural-environment scenarios. In addition, the system offers low cost, low power consumption, and strong real-time performance.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
From the above description of embodiments, it will be apparent to those skilled in the art that the present application may be implemented in software plus a necessary general hardware platform. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the embodiments or some parts of the embodiments of the present application.
In this specification, the embodiments are described in a progressive manner; identical or similar parts of the embodiments may be referred to across embodiments, and each embodiment focuses on its differences from the others. In particular, the system embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, refer to the description of the method embodiments. The systems and embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. Those of ordinary skill in the art can understand and implement the application without undue effort.
The foregoing description covers only preferred embodiments of the present application and is not intended to limit its scope. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present application falls within its protection scope.

Claims (7)

1. A mixed reality enhanced display and interaction system, comprising:
a physical environment layer configured to provide system hardware platform support; the physical environment layer comprises an information processing functional domain, a perception functional domain, a human-computer interaction functional domain, and an integrated matching functional domain, wherein the information processing functional domain is connected with the perception functional domain through a reconnaissance image interface and a frame synchronization control interface, and is connected with the human-computer interaction functional domain through a display interface, an audio input/output interface, a gesture image interface, and a function key interface;
a system driving layer configured to provide whole-machine power management and basic file system services, and to provide the system running environment and software/hardware resource management support for the upper-level data storage layer, AI framework layer, service component layer, and application layer; the system driving layer comprises a mobile terminal operating system kernel and device driver software;
a data storage layer configured to support database creation, query, deletion, and modification operations for the AI framework layer, the service components, and the applications; the data storage layer comprises a database system, management tools, and various tables and records;
an AI framework layer configured to provide core algorithm support for the system's intelligent service components and applications; the AI framework layer comprises algorithm frameworks, network models, and image, speech, text, and situation-type algorithms;
a service component layer configured to connect the AI core algorithms, data storage, network switching, and sensing devices, while providing the system with various interoperable, reusable, and composable service platforms; the service component layer comprises system service components, device service components, and application service components;
an application layer configured for mixed-reality-based enhanced display and interaction; the application layer comprises the mixed-reality-based enhanced display and interaction system application software.
2. The mixed reality enhanced display and interaction system of claim 1, wherein the information processing functional domain comprises an information processing unit built around an ARM-architecture core processor together with RAM memory, ROM memory, and a power management IC.
3. The mixed reality enhanced display and interaction system of claim 2, wherein the ARM-architecture core processor extends an internal sensing component and a BeiDou positioning module through internal interfaces, configured to collect ambient light, gesture, position, and orientation information on the device itself and its environment; the internal sensing component comprises a geomagnetic sensor, a gyroscope, an accelerometer, and an ambient light sensor; the ARM-architecture core processor further extends a WiFi/BT wireless communication module, externally connected to a WiFi/BT antenna through a radio frequency interface, to implement wireless networking communication with other informatized modules.
4. The mixed reality enhanced display and interaction system of claim 2, wherein the perception functional domain comprises two sensor modules connected to the ARM-architecture core processor through MIPI_CSI interfaces, and the ARM-architecture core processor extends a frame synchronization control interface through a GPIO port to achieve time synchronization of the signals acquired by the two sensor modules.
5. The mixed reality enhanced display and interaction system of claim 2, wherein the human-computer interaction functional domain comprises function keys, a TOF camera sensing module, and a binocular display information functional component comprising an earphone, a microphone, and two optical waveguide modules.
6. The mixed reality enhanced display and interaction system of claim 5, wherein the ARM-architecture core processor implements analog audio expansion for the earphone and the microphone through an audio input/output interface, implements function key expansion through a GPIO interface, implements TOF camera sensing module expansion through a MIPI_CSI interface and an SPI depth-sensing information interface, and connects a binocular display driving assembly through a MIPI_DSI display interface, the binocular display driving assembly being used to perform binocular display allocation of the MIPI signals and to drive binocular imaging of the two optical waveguide modules.
7. The mixed reality enhanced display and interaction system of claim 5, wherein the optical waveguide module comprises a three-pole electro-optical volume holographic optical waveguide display module.
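Although the claims define functional domains rather than a concrete implementation, the five-layer stack recited in claim 1 can be summarized in a short illustrative sketch. All class and method names below are hypothetical and are not part of the claims; they merely mirror the layer names and dependency order described above.

```python
# Illustrative sketch of the five-layer stack of claim 1.
# Hypothetical names; the patent specifies functional domains, not an API.

class PhysicalEnvironmentLayer:
    """Hardware platform support: the four functional domains of claim 1."""
    def domains(self):
        return ["information_processing", "perception",
                "human_computer_interaction", "integrated_matching"]

class SystemDrivingLayer:
    """OS kernel and device drivers: power management and file-system services."""
    def services(self):
        return ["power_management", "file_system", "resource_management"]

class DataStorageLayer:
    """Database create/query/delete/modify support for the upper layers."""
    def __init__(self):
        self.records = {}
    def create(self, key, value):
        self.records[key] = value
    def query(self, key):
        return self.records.get(key)

class AIFrameworkLayer:
    """Core algorithm support: the four algorithm types named in the claim."""
    algorithm_types = ("image", "speech", "text", "situation")

class ServiceComponentLayer:
    """Interoperable, reusable, composable service components."""
    components = ("system", "device", "application")

class ApplicationLayer:
    """Mixed-reality enhanced display and interaction application software."""
    def __init__(self, storage, ai):
        self.storage, self.ai = storage, ai

# Bottom-up assembly mirrors the dependency order of claim 1:
# lower layers provide support that the application layer consumes.
storage = DataStorageLayer()
storage.create("session", "demo")
app = ApplicationLayer(storage, AIFrameworkLayer())
print(app.storage.query("session"))  # → demo
```

The sketch only shows the layering and dependency direction; in the claimed system each layer additionally spans concrete hardware interfaces (MIPI_CSI/MIPI_DSI, GPIO, SPI, audio I/O) described in claims 2 through 6.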
CN202310815517.5A 2023-07-05 2023-07-05 Mixed reality enhanced display and interaction system Active CN116520997B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310815517.5A CN116520997B (en) 2023-07-05 2023-07-05 Mixed reality enhanced display and interaction system

Publications (2)

Publication Number Publication Date
CN116520997A (en) 2023-08-01
CN116520997B (en) 2023-09-26

Family

ID=87392588

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310815517.5A Active CN116520997B (en) 2023-07-05 2023-07-05 Mixed reality enhanced display and interaction system

Country Status (1)

Country Link
CN (1) CN116520997B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110362209A (en) * 2019-07-23 2019-10-22 辽宁向日葵教育科技有限公司 A kind of MR mixed reality intelligent perception interactive system
CN110634188A (en) * 2018-06-22 2019-12-31 幻视互动(北京)科技有限公司 Method for realizing interaction with virtual 3D model and MR mixed reality intelligent glasses
CN111729283A (en) * 2020-06-19 2020-10-02 杭州赛鲁班网络科技有限公司 Training system and method based on mixed reality technology
CN113632030A (en) * 2018-12-27 2021-11-09 奇跃公司 System and method for virtual reality and augmented reality
CN114967926A (en) * 2018-05-30 2022-08-30 太若科技(北京)有限公司 AR head display device and terminal device combined system
CN115016642A (en) * 2016-02-16 2022-09-06 微软技术许可有限责任公司 Reality mixer for mixed reality
CN115359222A (en) * 2022-08-22 2022-11-18 深圳市邦康工业机器人科技有限公司 Unmanned interaction control method and system based on augmented reality
CN115661412A (en) * 2022-10-30 2023-01-31 西北工业大学 Aero-engine auxiliary assembly system and method based on mixed reality
CN116203733A (en) * 2023-04-28 2023-06-02 中国兵器装备集团自动化研究所有限公司 Mixed reality wearing equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10783666B2 (en) * 2017-03-20 2020-09-22 SK Commercial Construction, Inc. Color analysis and control using an electronic mobile device transparent display screen integral with the use of augmented reality glasses
KR102619607B1 (en) * 2019-08-08 2023-12-29 엘지전자 주식회사 Xr device and method for controlling the same
KR20190101323A (en) * 2019-08-12 2019-08-30 엘지전자 주식회사 Xr device for providing ar mode and vr mode and method for controlling the same
KR102637417B1 (en) * 2019-11-11 2024-02-16 엘지전자 주식회사 Xr device for providing ar mode and vr mode and method for controlling the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant