CN113409472A - AR presentation method and system of shared interactive information in space based on real object - Google Patents

AR presentation method and system of shared interactive information in space based on real object Download PDF

Info

Publication number
CN113409472A
Authority
CN
China
Prior art keywords
real object
model
interactive information
information
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110752701.0A
Other languages
Chinese (zh)
Inventor
江婷
肖筱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Youka Interconnection Technology Co ltd
Original Assignee
Wuhan Youka Interconnection Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Youka Interconnection Technology Co ltd filed Critical Wuhan Youka Interconnection Technology Co ltd
Priority to CN202110752701.0A priority Critical patent/CN113409472A/en
Publication of CN113409472A publication Critical patent/CN113409472A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75Determining position or orientation of objects or cameras using feature-based methods involving models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an AR presentation method and system for shared interactive information in space based on a real object. The real object is first scanned and photographed from multiple directions, and after feature analysis a real-object 3D model is built and stored in a platform database. After a user scans the corresponding real object through the platform, the spatial position data of the interactive information relative to the real object is determined, the spatial position data being calculated by a spatial simulation algorithm, and the interactive information is published at that spatial position. Finally, the interactive information is rendered at the corresponding spatial position of the real-object 3D model data and integrated into a second 3D model centered on the real-object 3D model, with the interactive information surrounding it; the second 3D model is superimposed onto the real scene in which the real object is located. This realizes AR (augmented reality) presentation of shared interactive information in space based on the real object, while information published by multiple users occupies mutually independent spatial positions.

Description

AR presentation method and system of shared interactive information in space based on real object
Technical Field
The invention belongs to the technical field of information, and particularly relates to an AR presentation method and system of shared interactive information in space based on a real object.
Background
Augmented reality (AR) enhances the user's perception of the real world by superimposing computer-generated auxiliary information onto the real scene. Using display technology, interaction technology, computer graphics, and other basic computing techniques, a virtual object is constructed and then superimposed, accurately and in real time, together with related auxiliary information onto the real scene, presenting a more realistic visual effect with richer scene information. This virtual-real combination is the key technology of augmented reality: it extends the real scene on a display device, so that what the user sees through the device is not a fully virtual world but a combined space in which virtual objects are seamlessly integrated into the real environment.
Interaction technology refers to real-time human-computer interaction. Interactivity is central to AR research, because AR applications emphasize user experience, in which interaction plays an essential role. As AR technology develops, display devices have become more varied, smaller, and more portable, and the rapid growth of smart mobile handheld devices has made human-computer interaction increasingly important to AR systems. Interaction in an AR system emphasizes two properties: three-dimensionality and real-time responsiveness. Three-dimensionality is reflected in both the interaction mode and the interaction environment: in augmented-reality interaction, the virtual object and the real scene are the two necessary elements, and the user perceives a three-dimensional world formed by superimposing virtual objects on the real scene, which provides a brand-new interactive experience. Interaction between the user and the device is also three-dimensional, for example gesture control and motion control. Real-time responsiveness means that, after the user interacts with the device, the device's input and output must be timely in order to preserve a realistic experience. As AR research deepens, static fusion alone is far from sufficient; dynamic complementarity must also be achieved, which is the true meaning of virtual-real fusion.
Most existing AR interaction technologies support only mask-based AR presentation; compared with stand-alone games, they offer a poor interactive experience and weak integration with offline scenes. At present, when users appreciate a three-dimensional real object, each user can only comment on the object individually, and different users cannot exchange interactive information with one another around the object.
Disclosure of Invention
In view of the above deficiencies in the prior art, the present invention aims to provide an AR presentation method and system for shared interactive information in space based on a real object. The method solves the problem that, at present, a user appreciating a three-dimensional real object can only comment on it individually, and different users cannot exchange interactive information with one another around the object.
In order to solve the problems in the prior art, the invention is realized by the following technical scheme:
an AR presentation method of shared interactive information in space based on real objects comprises the following steps:
s1, carrying out multi-azimuth scanning shooting on the real object, establishing a real object 3D model after characteristic analysis, and storing the model into a platform database;
s2, after scanning the corresponding real object through the platform, the first terminal user determines the spatial position data of the interactive information relative to the real object, the spatial position data is calculated based on a spatial simulation algorithm, then the interactive information is issued at the spatial position, and finally the spatial position data and the interactive information are stored in a platform database;
s3, after scanning the corresponding real object through a platform, automatically matching and inquiring the real object 3D model data, the corresponding spatial position data and the interaction information stored in the platform database, then rendering the interaction information to the spatial position corresponding to the real object 3D model data, integrating and generating a second 3D model taking the real object 3D model as the center, wherein the interaction information surrounds the periphery of the real object 3D model, and therefore the function of AR presentation of the shared interaction information in the space based on the real object is achieved.
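The three steps above can be sketched as a minimal platform-side data flow. This is an illustrative reconstruction, not the patent's implementation; the `PlatformDB` structure, the function names, and the dictionary-based storage are all assumptions introduced for clarity.

```python
from dataclasses import dataclass, field

@dataclass
class PlatformDB:
    """Hypothetical stand-in for the platform database."""
    models: dict = field(default_factory=dict)       # object_id -> real-object 3D model data (S1)
    annotations: dict = field(default_factory=dict)  # object_id -> list of (position, info) (S2)

def register_object(db: PlatformDB, object_id: str, model) -> None:
    """S1: store the scanned real-object 3D model in the platform database."""
    db.models[object_id] = model

def publish_info(db: PlatformDB, object_id: str, position: tuple, info: str) -> None:
    """S2: attach interactive information at a spatial position relative to the object."""
    db.annotations.setdefault(object_id, []).append((position, info))

def build_second_model(db: PlatformDB, object_id: str) -> dict:
    """S3: integrate the 3D model with its surrounding interactive information."""
    return {
        "center": db.models[object_id],
        "surrounding": list(db.annotations.get(object_id, [])),
    }
```

Note that interactive information published by different end users simply accumulates in the annotation list, which matches the requirement that entries remain mutually independent at their own spatial positions.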
Further, in step S1, the multi-directional scanning and shooting is implemented with a 3D depth camera.
Further, in step S2, the spatial simulation algorithm specifically includes the following steps:
S21. After end user A scans the corresponding real object through the platform, the pre-stored image feature information of the real object is decomposed into N feature-node pictures and recorded in hash-table form [k, value]: each feature-node picture is recorded as the key k, and its corresponding plane coordinate position (l1, l2) is recorded as the value. The matrix data set of the current real-object picture is:
[matrix formula, shown only as an image in the original document]
The hash value corresponding to the current real-object feature picture is the plane coordinate position (l1, l2) at which the information data is presented;
S22. After the corresponding real object is scanned, its size is calculated and recorded as R, while the actual size of the real-object 3D model is r. The value d is obtained from a formula [shown only as an image in the original document], where D is a constant parameter used for proportional scaling of the object model;
S23. The position coordinate (l1, l2, d) at which the interactive information is presented in space is finally obtained.
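Steps S21-S23 can be sketched in code. The exact depth formula appears only as an unrendered image in the source, so the proportional form d = D * r / R below is an assumption, as are the constant's value, the use of SHA-256 as the picture hash, and all function names.

```python
import hashlib

D = 0.5  # constant parameter for proportional scaling; the actual value is not given in the source

def build_feature_table(node_pictures, plane_coords):
    """S21: record each feature-node picture k against its plane coordinate (l1, l2)."""
    table = {}
    for pic_bytes, (l1, l2) in zip(node_pictures, plane_coords):
        k = hashlib.sha256(pic_bytes).hexdigest()  # hash key derived from the picture bytes
        table[k] = (l1, l2)
    return table

def presentation_position(table, pic_bytes, R, r):
    """S22-S23: look up (l1, l2) for a scanned feature picture and derive the depth d.

    d = D * r / R is an assumed proportional relation between the model size r
    and the measured real-object size R; the source shows the formula only as an image.
    """
    l1, l2 = table[hashlib.sha256(pic_bytes).hexdigest()]
    d = D * r / R
    return (l1, l2, d)
```

A lookup then yields the full spatial coordinate (l1, l2, d) at which the interactive information is anchored relative to the real object.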
Further, in step S2, the first end user may be the same end user or any one of several different end users; the interactive information published by the same or by different end users is mutually independent, and each item of interactive information remains fixed at its corresponding spatial position on the real-object 3D model.
Further, in step S3, the second end user may likewise be the same end user or any one of several different end users. After the second 3D model is formed, the second end user can continue to publish interactive information based on the second 3D model, thereby realizing the function of sharing interactive information among multiple users.
Another object of the present invention is to provide an AR presentation system for shared interactive information in space based on a real object.
The AR presentation system comprises:
a real-object image acquisition module, which scans and photographs the real object from multiple directions, builds a real-object 3D model after feature analysis, and stores the model in a platform database;
an interactive information uploading module, which fixes the interactive information published by a first end user at the corresponding spatial position of the real-object 3D model;
and an interactive information presentation module, which integrates the real-object 3D model data, the corresponding spatial position data, and the interactive information into a second 3D model centered on the real-object 3D model, with the interactive information surrounding it, thereby realizing AR presentation of the shared interactive information in space based on the real object.
Further, the real-object image acquisition module performs multi-directional scanning and shooting with a 3D depth camera.
Further, the interactive information in the interactive information uploading module is positioned by a spatial simulation algorithm.
Further, the spatial simulation algorithm specifically includes the following steps:
S21. After end user A scans the corresponding real object through the platform, the pre-stored image feature information of the real object is decomposed into N feature-node pictures and recorded in hash-table form [k, value]: each feature-node picture is recorded as the key k, and its corresponding plane coordinate position (l1, l2) is recorded as the value. The matrix data set of the current real-object picture is:
[matrix formula, shown only as an image in the original document]
The hash value corresponding to the current real-object feature picture is the plane coordinate position (l1, l2) at which the information data is presented;
S22. After the corresponding real object is scanned, its size is calculated and recorded as R, while the actual size of the real-object 3D model is r. The value d is obtained from a formula [shown only as an image in the original document], where D is a constant parameter used for proportional scaling of the object model;
S23. The position coordinate (l1, l2, d) at which the interactive information is presented in space is finally obtained.
Further, the first end user may be the same end user or any one of several different end users; the interactive information published by the same or by different end users is mutually independent, and each item of interactive information remains fixed at its corresponding spatial position on the real-object 3D model.
Compared with the prior art, the invention has the following advantages:
the invention provides an AR presentation method and system in space based on shared interactive information of a real object, which comprises the steps of firstly carrying out multi-directional scanning shooting on the real object, establishing a 3D model of the real object after characteristic analysis and storing the model in a platform database; then, after scanning the corresponding real object through a platform, determining spatial position data of the interactive information relative to the real object, wherein the spatial position data is obtained by calculation based on a spatial simulation algorithm, then issuing the interactive information at the spatial position, and then storing the spatial position data and the interactive information into a platform database; and finally, rendering the interactive information to a corresponding spatial position of the real object 3D model data, integrating and generating a second 3D model which takes the real object 3D model as a center, wherein the interactive information surrounds the second 3D model around the real object 3D model, and displaying the second 3D model in a superposition manner to a real scene where the real object is located, so that the function of AR (augmented reality) display of the shared interactive information based on the real object in the space is realized, and meanwhile, independent spatial positions of information transmitted among a plurality of users can be obtained, and multi-user information interaction is realized.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a flowchart of an AR presentation method in space based on shared interactive information of a real object according to the present invention;
FIG. 2 is a schematic diagram of an AR presentation system in space based on shared interactive information of a real object according to the present invention.
Detailed Description
The technical solution of the present invention will be clearly and completely described below with reference to specific embodiments. It is to be understood that the described embodiments are merely a few embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without any inventive step, are within the scope of the present invention.
Example 1
Referring to fig. 1, an AR presentation method for shared interactive information in space based on a real object comprises the following steps:
S1. Scan and photograph the real object from multiple directions, build a real-object 3D model after feature analysis, and store the model in a platform database;
S2. After a first end user scans the corresponding real object through the platform, determine the spatial position data of the interactive information relative to the real object, the spatial position data being calculated by a spatial simulation algorithm; then publish the interactive information at that spatial position; and finally store the spatial position data and the interactive information in the platform database;
S3. After the corresponding real object is scanned through the platform, automatically match and query the real-object 3D model data, the corresponding spatial position data, and the interactive information stored in the platform database; then render the interactive information at the spatial position corresponding to the real-object 3D model data and integrate them into a second 3D model centered on the real-object 3D model, with the interactive information surrounding it, thereby realizing AR presentation of the shared interactive information in space based on the real object.
Further, in step S1, the multi-directional scanning and shooting is implemented with a 3D depth camera; other scanning and shooting methods known in the prior art may also be used.
Further, in step S2, the spatial simulation algorithm specifically includes the following steps:
S21. After end user A scans the corresponding real object through the platform, the pre-stored image feature information of the real object is decomposed into N feature-node pictures and recorded in hash-table form [k, value]: each feature-node picture is recorded as the key k, and its corresponding plane coordinate position (l1, l2) is recorded as the value. The matrix data set of the current real-object picture is:
[matrix formula, shown only as an image in the original document]
The hash value corresponding to the current real-object feature picture is the plane coordinate position (l1, l2) at which the information data is presented;
S22. After the corresponding real object is scanned, its size is calculated and recorded as R, while the actual size of the real-object 3D model is r. The value d is obtained from a formula [shown only as an image in the original document], where D is a constant parameter used for proportional scaling of the object model;
S23. The position coordinate (l1, l2, d) at which the interactive information is presented in space is finally obtained.
It should be noted that after the same end user or different end users scan the real object multiple times, mutually independent spatial positions relative to the real object are obtained based on the spatial simulation algorithm. The spatial simulation algorithm described here is only one of the algorithms usable by the present invention; similar prior-art spatial simulation algorithms are likewise included in the protection scope of the present invention.
Further, in step S2, the first end user may be the same end user or any one of several different end users; the interactive information published by the same or by different end users is mutually independent, and each item of interactive information remains fixed at its corresponding spatial position on the real-object 3D model.
Further, in step S3, the second end user may likewise be the same end user or any one of several different end users. After the second 3D model is formed, the second end user can continue to publish interactive information based on the second 3D model, thereby realizing the function of sharing interactive information among multiple users.
Example 2
Referring to fig. 2, an AR presentation system for shared interactive information in space based on a real object is disclosed. The system comprises:
the real-object image acquisition module 101, configured to scan and photograph the real object from multiple directions, build a real-object 3D model after feature analysis, and store the model in a platform database;
the interactive information uploading module 102, configured to fix the interactive information published by the first end user at the corresponding spatial position of the real-object 3D model;
and the interactive information presentation module 103, configured to integrate the real-object 3D model data, the corresponding spatial position data, and the interactive information into a second 3D model centered on the real-object 3D model, with the interactive information surrounding it, thereby realizing AR presentation of the shared interactive information in space based on the real object.
Further, the real-object image acquisition module 101 performs multi-directional scanning and shooting with a 3D depth camera.
Further, the interactive information in the interactive information uploading module 102 is positioned by a spatial simulation algorithm.
Further, the spatial simulation algorithm specifically includes the following steps:
S21. After end user A scans the corresponding real object through the platform, the pre-stored image feature information of the real object is decomposed into N feature-node pictures and recorded in hash-table form [k, value]: each feature-node picture is recorded as the key k, and its corresponding plane coordinate position (l1, l2) is recorded as the value. The matrix data set of the current real-object picture is:
[matrix formula, shown only as an image in the original document]
The hash value corresponding to the current real-object feature picture is the plane coordinate position (l1, l2) at which the information data is presented;
S22. After the corresponding real object is scanned, its size is calculated and recorded as R, while the actual size of the real-object 3D model is r. The value d is obtained from a formula [shown only as an image in the original document], where D is a constant parameter used for proportional scaling of the object model;
S23. The position coordinate (l1, l2, d) at which the interactive information is presented in space is finally obtained.
Further, the first end user may be the same end user or any one of several different end users; the interactive information published by the same or by different end users is mutually independent, and each item of interactive information remains fixed at its corresponding spatial position on the real-object 3D model.
Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, within the technical scope of the present disclosure. Such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present invention and shall be construed as falling within its protection scope.

Claims (10)

1. An AR presentation method for shared interactive information in space based on a real object, characterized by comprising the following steps:
S1. Scanning and photographing the real object from multiple directions, building a real-object 3D model after feature analysis, and storing the model in a platform database;
S2. After a first end user scans the corresponding real object through the platform, determining the spatial position data of the interactive information relative to the real object, the spatial position data being calculated by a spatial simulation algorithm, then publishing the interactive information at that spatial position, and finally storing the spatial position data and the interactive information in the platform database;
S3. After the corresponding real object is scanned through the platform, automatically matching and querying the real-object 3D model data, the corresponding spatial position data, and the interactive information stored in the platform database, then rendering the interactive information at the spatial position corresponding to the real-object 3D model data and integrating them into a second 3D model centered on the real-object 3D model, with the interactive information surrounding it, thereby realizing AR presentation of the shared interactive information in space based on the real object.
2. The AR presentation method for shared interactive information in space based on a real object according to claim 1, characterized in that in step S1, the multi-directional scanning and shooting is implemented with a 3D depth camera.
3. The AR presentation method for shared interactive information in space based on a real object according to claim 1, characterized in that in step S2, the spatial simulation algorithm specifically comprises the following steps:
S21. After end user A scans the corresponding real object through the platform, the pre-stored image feature information of the real object is decomposed into N feature-node pictures and recorded in hash-table form [k, value]: each feature-node picture is recorded as the key k, and its corresponding plane coordinate position (l1, l2) is recorded as the value. The matrix data set of the current real-object picture is:
[matrix formula, shown only as an image in the original document]
The hash value corresponding to the current real-object feature picture is the plane coordinate position (l1, l2) at which the information data is presented;
S22. After the corresponding real object is scanned, its size is calculated and recorded as R, while the actual size of the real-object 3D model is r. The value d is obtained from a formula [shown only as an image in the original document], where D is a constant parameter used for proportional scaling of the object model;
S23. The position coordinate (l1, l2, d) at which the interactive information is presented in space is finally obtained.
4. The AR presentation method for shared interactive information in space based on a real object according to claim 1, characterized in that in step S2, the first end user may be the same end user or any one of several different end users; the interactive information published by the same or by different end users is mutually independent, and each item of interactive information remains fixed at its corresponding spatial position on the real-object 3D model.
5. The AR presentation method for shared interactive information in space based on a real object according to claim 1, characterized in that in step S3, the second end user may be the same end user or any one of several different end users, and after the second 3D model is formed, the second end user can continue to publish interactive information based on the second 3D model, thereby realizing the function of sharing interactive information among multiple users.
6. An AR presentation system for shared interactive information in space based on a real object, characterized by comprising:
the real-object image acquisition module 101, configured to scan and photograph the real object from multiple directions, build a real-object 3D model after feature analysis, and store the model in a platform database;
the interactive information uploading module 102, configured to fix the interactive information published by a first end user at the corresponding spatial position of the real-object 3D model;
and the interactive information presentation module 103, configured to integrate the real-object 3D model data, the corresponding spatial position data, and the interactive information into a second 3D model centered on the real-object 3D model, with the interactive information surrounding it, thereby realizing AR presentation of the shared interactive information in space based on the real object.
7. The AR presentation system for shared interactive information in space based on a real object according to claim 6, characterized in that the real-object image acquisition module 101 performs multi-directional scanning and shooting with a 3D depth camera.
8. The AR presentation system for shared interactive information in space based on a real object according to claim 6, characterized in that the interactive information in the interactive information uploading module 102 is positioned by a spatial simulation algorithm.
9. The AR presentation system for shared interactive information in space based on a real object according to claim 8, characterized in that the spatial simulation algorithm specifically comprises the following steps:
s21, after scanning the corresponding real object through the platform, the terminal user A prestores the image feature information of the real object, decomposes the image feature information into N feature node pictures, and takes the hash table mode [ k, value ]]Recording, namely recording the picture of each characteristic node as k and the corresponding plane coordinate position (l) of the k for each characteristic node1,l2) Marking as value; the matrix data set of the current real object picture is as follows:
Figure FDA0003145521880000031
can obtain the current object feature picture
Figure FDA0003145521880000032
The corresponding hash value is the plane coordinate position (l) of the information data presentation1,l2);
S22, after scanning the corresponding real object, the size of the real object is calculated and recorded as R, and the actual size of the real object 3D model is recorded as r; the value d is then obtained according to a formula given in the original as an image (not reproduced in this text), wherein D is a constant parameter used for proportional scaling of the real object model;
S23, finally obtaining the position coordinate (l1, l2, d) at which the interactive information is presented in space.
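The steps S21 to S23 above could be sketched as follows. Note that the patent's formula images are not reproduced in the text, so the depth computation below (d = D · r / R, i.e. depth scaled by the model-to-object size ratio) is an assumption for illustration only, as are all function names.

```python
def build_feature_table(feature_nodes):
    """S21: record each feature-node picture k with its plane position (l1, l2)
    in hash-table form [k, value]."""
    return {k: (l1, l2) for k, (l1, l2) in feature_nodes.items()}

def depth_coordinate(R, r, D=1.0):
    """S22 (assumed form): derive depth d from real-object size R, model size r,
    and constant proportional parameter D."""
    return D * r / R

def interaction_position(table, k, R, r, D=1.0):
    """S23: final spatial position (l1, l2, d) for the interactive information."""
    l1, l2 = table[k]
    return (l1, l2, depth_coordinate(R, r, D))
```

A feature picture k thus resolves to its prestored plane coordinates via the hash table, and the third coordinate d places the information at an assumed depth proportional to the model scale.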
10. The real object-based AR presentation system of shared interactive information in space according to claim 6, wherein the first terminal user is either the same terminal user or a different terminal user; interactive information published by the same or different terminal users is mutually independent, and the interactive information remains fixed at the corresponding spatial position of the real object 3D model.
CN202110752701.0A 2021-07-02 2021-07-02 AR presentation method and system of shared interactive information in space based on real object Pending CN113409472A (en)

Publications (1)

Publication Number Publication Date
CN113409472A true CN113409472A (en) 2021-09-17

Family

ID=77681137


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination