CN110531847B - Social contact method and system based on augmented reality - Google Patents

Social contact method and system based on augmented reality

Info

Publication number
CN110531847B
CN110531847B CN201910681955.0A
Authority
CN
China
Prior art keywords
target
information
module
user
augmented reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910681955.0A
Other languages
Chinese (zh)
Other versions
CN110531847A (en)
Inventor
邓宝松
印二威
李靖
鹿迎
唐荣富
闫野
桂健钧
宋立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Original Assignee
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center, National Defense Technology Innovation Institute PLA Academy of Military Science
Priority to CN201910681955.0A priority Critical patent/CN110531847B/en
Publication of CN110531847A publication Critical patent/CN110531847A/en
Application granted granted Critical
Publication of CN110531847B publication Critical patent/CN110531847B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/01 Social networking

Abstract

The invention discloses a novel augmented reality-based social method and system that genuinely enhance the user's visual perception and interactive experience through virtual-real fusion, do not interfere with the user's normal social behavior, allow different social groups and modes to be set and switched interactively, and have wide applicability. The system comprises the following modules: an environment perception module, a target identification module, a network service module, an enhanced display module and a human-computer interaction module. The beneficial effects are as follows: wearable augmented reality glasses are used to achieve natural information sharing and comment exchange in daily life, overturning the traditional online social mode; virtual space objects are organically integrated with the real physical world, and virtual-real interaction is added on top of traditional network forum interaction, greatly improving the social experience and interaction bandwidth between people and integrating online socializing into daily life. Compared with the traditional online social mode, this approach has wider applicability and better development prospects.

Description

Social contact method and system based on augmented reality
Technical Field
The invention belongs to the field of social methods and systems, and in particular relates to an augmented reality-based social method and system.
Background
Location-Based Services (LBS) are value-added services that obtain a mobile terminal user's location information through a mobile operator's radio communication network or an external positioning method (such as BeiDou or GPS) and, with the support of a geographic information platform, provide corresponding services to the user.
With the development of augmented reality technology and the gradual maturation of wearable computing devices, applications combining the two have attracted wide attention. Mobile Augmented Reality (MAR) has become a popular research direction at home and abroad in recent years. Wearable augmented reality glasses are likely to serve as an essential communication and social terminal, cooperating with or even replacing the smartphone, and to become a primary means by which people acquire external information in normal productive activities.
For querying and commenting on real-world targets, people usually share information through forums and social software, but this approach has several disadvantages. First, users can only log into the corresponding software on a terminal such as a computer or smartphone and type text or upload pictures to share; this must be done on specific occasions and requires head-down attention to input and output, so normal social or work activities have to be interrupted. Second, when querying and commenting, the user's line of sight must leave the objective target; the two cannot share the same visual field space or specific detail features, so the interaction is serial in nature and does not match how people actually perceive the physical world. Third, the queried or shared information is not bound to the target's objective physical position, so people cannot relate it to the spatial topology they perceive, and it cannot be actively pushed based on the user's current physical position.
As augmented reality technology gradually matures, augmented reality head-mounted devices will spread rapidly, perhaps becoming as common as wearing glasses or carrying a smartphone, which provides the objective conditions for applying the novel social method and system proposed in this scheme.
Disclosure of Invention
The invention aims to provide an augmented reality-based social method and system that genuinely enhance the user's visual perception and interactive experience through virtual-real fusion, do not interfere with the user's normal social behavior, allow different social groups and modes to be set and switched interactively, and have wide applicability.
The technical scheme of the invention is as follows. An augmented reality-based social system comprises the following modules: an environment perception module, a target identification module, a network service module, an enhanced display module and a human-computer interaction module.
The environment perception module acquires the geometric structure of the surrounding environment through a vision camera, a depth camera and radar, thereby achieving accurate three-dimensional reconstruction of the physical world;
the target identification module extracts the target's appearance features from the target image acquired by the perception module and achieves accurate recognition of the target by combining information such as the current position;
the network service module retrieves the target's attributes and comment information from the cloud data server over the wireless network according to the target ID;
the enhanced display module organizes the returned information into multimedia forms such as text, pictures, graphics, voice and video;
the human-computer interaction module enables the user to view and listen to the acquired multimedia information about the target, and supports the user in entering comments on the target through interaction means such as voice and gestures.
Based on the environment three-dimensional reconstruction result and signals such as those of an inertial measurement unit (IMU), the environment perception module determines the observer's absolute position through global positioning means such as BeiDou and GPS and acquires the observer's current line-of-sight direction.
The environment perception module is composed of one or more of a camera, a depth camera, a small radar and an IMU.
The target identification module determines the target's unique ID based on the recognition result, for use in network information query and retrieval.
The target identification module is composed of a computing unit with a CPU and a GPU as its core.
The network service module determines, according to the user's permissions, whether the user may access the information, and returns the corresponding information accordingly.
The network service module is composed of a mobile communication unit and a corresponding software service interface.
The enhanced display module superimposes the multimedia information onto the user's field of view as a user interface and aligns it with the real target.
The enhanced display module is composed of an optical processing unit such as a free-form surface or an optical waveguide.
The human-computer interaction module converts the input into text, voice or picture information and can store it in the cloud data server for other users to share and access.
The human-computer interaction module is composed of voice, gesture, eye-movement, electromyography or other wearable input units; the modules above can be connected in a wired or wireless manner to form a complete wearable hardware and software system.
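To make the division of labor among the five modules concrete, the following minimal Python sketch expresses them as narrow interfaces; all class, method and field names are illustrative assumptions rather than definitions from the patent.

```python
# Minimal interface sketch of the five modules (illustrative names only).
from dataclasses import dataclass, field
from typing import List, Protocol


@dataclass
class Pose:                      # observer's absolute position and gaze
    lat: float
    lon: float
    alt: float
    gaze_direction: tuple        # unit vector in world coordinates


@dataclass
class TargetInfo:                # attributes + comments returned for one target ID
    target_id: str
    attributes: dict
    comments: List[dict] = field(default_factory=list)


class EnvironmentPerception(Protocol):
    def reconstruct_scene(self) -> object: ...                         # 3-D reconstruction
    def estimate_pose(self) -> Pose: ...                               # BeiDou/GPS + IMU


class TargetIdentification(Protocol):
    def identify(self, image, pose: Pose) -> str: ...                  # returns unique target ID


class NetworkService(Protocol):
    def fetch(self, target_id: str, user_id: str) -> TargetInfo: ...   # permission-checked lookup


class EnhancedDisplay(Protocol):
    def render(self, info: TargetInfo, pose: Pose) -> None: ...        # perspective-aligned overlay


class HumanComputerInteraction(Protocol):
    def capture_comment(self, user_id: str, target_id: str) -> dict: ...  # voice/gesture/text input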
An augmented reality-based social method comprises the following steps:
(1) in an outdoor environment the user sees, through the augmented reality glasses, the attributes of a target and various comments about it, and this information is superimposed on the target while keeping the correct perspective relation;
(2) the user adds comment information to the target by means such as text and voice, and the information is stored in a network database in text or multimedia form;
(3) other users can share the data stored in the network database according to their permissions, and the shared information is presented on the target in augmented reality.
The beneficial effects of the invention are as follows. Wearable augmented reality glasses are used to achieve natural information sharing and comment exchange in daily life, overturning the traditional online social mode; virtual space objects are organically integrated with the real physical world, and virtual-real interaction is added on top of traditional network forum interaction, greatly improving the social experience and interaction bandwidth between people and integrating online socializing into daily life. Compared with the traditional online social mode, the approach has wider applicability and better development prospects. In addition, different social groups can be given different information-access permissions, ensuring privacy and security; and different social groups and different virtual objects can share a common physical space at the same time, which allows the display content of a target to be extended almost without limit.
Drawings
FIG. 1 is a schematic diagram of a social networking system network deployment based on augmented reality according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an actual application scenario;
FIG. 3 is an information processing flowchart.
Detailed Description
The invention is described in further detail below with reference to the figures and the embodiments.
Relying on a wireless mobile communication network, the augmented reality-based social system and method visually augment virtual objects into a specific physical space (they can be fused with a fixed marker at a certain geographic position in the real environment, such as a scenic-spot statue, a building, a facility or another landmark). A user with the appropriate permissions, using augmented reality equipment, sees a virtual-real fused scene in the real physical space and can comment on and discuss the virtual objects through voice, gestures, text, pictures, video and other means. The exchanged content also exists as virtual objects and can be augmented into other users' visual field spaces over the network, so that it is shared within different social groups. Although highly integrated with the real environment, the virtual objects occupy no real physical space, and virtual objects provided to different social groups may share the same physical space. The virtual objects and their comment information are stored in a networked cloud database; when the user's viewpoint changes, the information is dynamically loaded over the network in real time according to the change of the user's visual field space, ensuring seamless fusion of the real and the virtual.
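The paragraph above implies a simple data model: each virtual object is anchored to a geographic position, carries the set of social groups allowed to see it, and is loaded dynamically around the user's viewpoint. The sketch below illustrates one possible realization; the class names, the 100 m loading radius and the group-set representation are assumptions made for illustration only.

```python
# Sketch: virtual objects anchored to geographic positions, loaded dynamically
# around the user's viewpoint and filtered by social-group membership.
import math
from dataclasses import dataclass
from typing import List, Set


@dataclass
class VirtualObject:
    object_id: str
    lat: float                 # anchor latitude of the real-world target
    lon: float                 # anchor longitude
    groups: Set[str]           # social groups allowed to see this object
    payload: dict              # text, picture, model reference, comments, ...


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def load_visible_objects(cloud: List[VirtualObject], user_groups: Set[str],
                         lat: float, lon: float, radius_m: float = 100.0):
    """Return only the objects near the current viewpoint that the user may see."""
    return [o for o in cloud
            if o.groups & user_groups
            and haversine_m(lat, lon, o.lat, o.lon) <= radius_m]


# Example: two anchored objects, only one visible to a nearby "heritage" group member.
cloud = [VirtualObject("statue-note", 39.0800, 117.2000, {"heritage"}, {"text": "history"}),
         VirtualObject("shop-ad", 39.2000, 117.4000, {"shopping"}, {"text": "sale"})]
print([o.object_id for o in load_visible_objects(cloud, {"heritage"}, 39.0801, 117.2001)])
```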
Several terms are involved in the present invention:
(1) Augmented reality device (e.g., augmented reality glasses): in addition to letting people see the normal physical world, it can overlay generated virtual objects on the real physical world; when the wearer moves, the virtual objects keep the correct perspective relation with the real physical world, as if they were really there. Augmented reality glasses look similar to normal glasses but are equipped with an environment sensing unit, a computation and processing unit, a data storage unit and a network communication unit, so information can be naturally superimposed onto the wearer's field of view without having to look down at a mobile phone and interrupt normal activities.
(2) Target: a typical fixed facility in the objective physical world, such as a landmark building at a scenic spot, a popular storefront or a major event site. Targets may be large or small, but each is determined by a specific spatial geographic location (which may be specified, for example, by latitude and longitude), and its comments and related attribute information can be managed, indexed and distinguished by that physical location and a unique identifier.
(3) Virtual object: virtual information augmented into the user's visual field space; it is rendered graphically, displayed in the user's field of view through the augmented reality glasses, and superimposed on the real physical world at the same time.
(4) Social group: a specific group of people defined by their interest in certain target facilities in the real world and by their functional roles, for example people interested in scenic-spot landmarks and cultural relics, or people interested in shops, restaurants and food products. A social group is similar in nature to a topic forum on a social network: comments can be made about a certain topic, target or business.
(5) User role: each user has their own permissions, which determine the forums they can participate in and the content they can see; users without the corresponding permission cannot see that content. An authorized user may also choose whether to turn the information on for viewing, or keep it off by default.
A specific group (called a social group) wearing augmented reality devices (glasses) can see superimposed virtual objects on top of the real physical space, kept in the correct perspective relation; the virtual objects may be three-dimensional virtual models, text and picture attributes, or forums, and the social group can comment on and discuss them. For users not wearing augmented reality devices, this information is invisible. Furthermore, the content displayed to each user can be decided from the user's ID and permissions, so that different types of social groups can be formed, as sketched below.
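A minimal sketch of this permission model follows; the role structure, the per-group display toggle and the example group names are assumptions made for illustration.

```python
# Sketch: deciding what a user may see, based on role permissions and an
# opt-in display switch (names are illustrative, not from the patent).
from dataclasses import dataclass, field
from typing import Dict, Set


@dataclass
class UserRole:
    user_id: str
    permitted_groups: Set[str]                                   # "forums" the user may join
    display_on: Dict[str, bool] = field(default_factory=dict)    # per-group toggle, off by default


def content_for_user(role: UserRole, group_id: str, content: dict):
    """Return the group's content only if the user is authorised and has it switched on."""
    if group_id not in role.permitted_groups:
        return None                                  # unauthorised users see nothing
    if not role.display_on.get(group_id, False):     # default: hidden until switched on
        return None
    return content


# Example: an authorised user who has enabled the "heritage" group.
alice = UserRole("alice", {"heritage", "food"}, {"heritage": True})
print(content_for_user(alice, "heritage", {"text": "statue history"}))  # shown
print(content_for_user(alice, "food", {"text": "menu reviews"}))        # hidden (toggle off)
print(content_for_user(alice, "shopping", {"text": "ads"}))             # not permitted
```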
An augmented reality-based social system comprises the following modules: an environment perception module, a target identification module, a network service module, an enhanced display module and a human-computer interaction module.
The environment perception module acquires the geometric structure of the surrounding environment through a vision camera, a depth camera and radar, thereby achieving accurate three-dimensional reconstruction of the physical world. Based on the three-dimensional reconstruction result and signals such as those of an inertial measurement unit (IMU), it determines the observer's absolute position through global positioning means such as BeiDou and GPS and acquires the observer's current line-of-sight direction. The environment perception module is composed of all or some of the following components: a camera, a depth camera, a small radar and an IMU.
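As a concrete illustration of how a global position fix and IMU orientation can be combined into the observer's position and gaze direction, consider the sketch below; the quaternion convention and the assumption that the device looks along its +x axis are illustrative choices, not specified by the patent.

```python
# Sketch: fusing a global position fix (BeiDou/GPS) with IMU orientation to get
# the observer's absolute position and line-of-sight direction.
import numpy as np


def gaze_from_imu(quaternion_wxyz):
    """Rotate the device's forward axis into world coordinates."""
    w, x, y, z = quaternion_wxyz
    # Standard quaternion -> rotation matrix conversion.
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    forward_device = np.array([1.0, 0.0, 0.0])   # assumed device forward axis
    return R @ forward_device                     # unit gaze vector in the world frame


def observer_state(gnss_fix, quaternion_wxyz):
    """Combine the GNSS position (lat, lon, alt) with the IMU-derived gaze."""
    lat, lon, alt = gnss_fix
    return {"position": (lat, lon, alt), "gaze": gaze_from_imu(quaternion_wxyz)}


print(observer_state((39.08, 117.20, 5.0), (1.0, 0.0, 0.0, 0.0)))  # identity: gaze = +x
```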
The target identification module extracts the target's appearance features from the target image acquired by the perception module and achieves accurate recognition of the target by combining information such as the current position; based on the recognition result it determines the target's unique ID, which is then used for network information query and retrieval. The target identification module is composed of a computing unit with a CPU and a GPU as its core.
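One simple way to combine position and appearance, as described above, is to shortlist catalogued targets near the current location and then pick the best appearance match. The following sketch uses a hand-rolled cosine similarity over precomputed feature vectors; the feature representation, the shortlist radius and the matching threshold are all assumptions for illustration.

```python
# Sketch: determining a target's unique ID by geographic shortlisting followed
# by appearance-feature matching.
import math
from dataclasses import dataclass
from typing import List, Optional, Sequence


@dataclass
class CandidateTarget:
    target_id: str
    lat: float
    lon: float
    appearance: Sequence[float]      # pre-computed appearance feature vector


def cosine(a: Sequence[float], b: Sequence[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def identify_target(query_features: Sequence[float], lat: float, lon: float,
                    catalogue: List[CandidateTarget],
                    max_deg: float = 0.001, threshold: float = 0.8) -> Optional[str]:
    """Geographic shortlist (roughly 100 m in degrees) followed by appearance matching."""
    nearby = [t for t in catalogue
              if abs(t.lat - lat) < max_deg and abs(t.lon - lon) < max_deg]
    best = max(nearby, key=lambda t: cosine(query_features, t.appearance), default=None)
    if best and cosine(query_features, best.appearance) >= threshold:
        return best.target_id        # unique ID used for the network query
    return None
```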
The network service module retrieves the target's attributes and comment information from the cloud data server over the wireless network according to the target ID; it determines, according to the user's permissions, whether the user may access the information and returns the corresponding information accordingly. The network service module is composed of a mobile communication unit and a corresponding software service interface.
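A minimal sketch of such a permission-checked lookup is shown below; the in-memory dictionaries stand in for the cloud data server and the user directory, and all identifiers are invented for the example.

```python
# Sketch: permission-checked retrieval of a target's attributes and comments
# from the cloud database, keyed by the unique target ID.
from typing import Optional

CLOUD_DB = {
    "statue-001": {
        "groups": {"heritage"},
        "attributes": {"name": "Scenic-spot statue", "built": "1998"},
        "comments": [{"user": "bob", "text": "Beautiful at sunset."}],
    }
}

USER_GROUPS = {"alice": {"heritage"}, "carol": {"food"}}


def fetch_target_info(target_id: str, user_id: str) -> Optional[dict]:
    """Return attributes and comments only if the user's groups allow access."""
    record = CLOUD_DB.get(target_id)
    if record is None:
        return None
    if not (record["groups"] & USER_GROUPS.get(user_id, set())):
        return None                      # no permission: nothing is returned
    return {"attributes": record["attributes"], "comments": record["comments"]}


print(fetch_target_info("statue-001", "alice"))   # full record
print(fetch_target_info("statue-001", "carol"))   # None - not in the group
```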
The enhanced display module organizes the returned information into multimedia forms such as text, pictures, graphics, voice and video; it superimposes the multimedia information onto the user's field of view as a user interface and aligns it with the real target. The enhanced display module is composed of an optical processing unit such as a free-form surface or an optical waveguide.
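Keeping an overlay aligned with its real target amounts to projecting the target's anchor point from the viewer's frame onto the display every frame. The sketch below uses a plain pinhole camera model; the focal length, principal point and axis convention are illustrative assumptions.

```python
# Sketch: aligning an annotation with its real target by projecting the target's
# 3-D anchor point (in the viewer's camera frame) onto the display.
import numpy as np


def project_to_screen(point_view, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Project a 3-D point in the viewer's camera frame to 2-D pixel coordinates.

    Convention assumed here: x right, y down, z forward (depth).
    Returns None when the point is behind the viewer.
    """
    x, y, z = point_view
    if z <= 0:
        return None
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.array([u, v])


# A label anchored 0.5 m above a target 10 m straight ahead:
anchor = np.array([0.0, -0.5, 10.0])      # y is down, so "above" is negative y
print(project_to_screen(anchor))          # pixel position where the overlay is drawn
```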
the human-computer interaction module can enable a user to see and listen to the acquired target multimedia information, and can support the user to input information through interaction means such as voice and gestures according to comments on the target through interaction; the human-computer interaction module supports the conversion of the information into character, voice or picture information, and can store the information into the cloud data server for other users to share and access. The human-computer interaction module consists of voice, gesture, eye movement, myoelectricity or other wearable input units, and the modules can be connected in a wired or wireless mode to ensure that a complete wearable software and hardware system is formed.
The environment perception module perceives the three-dimensional information of the surrounding environment and obtains the user's current position and viewpoint line-of-sight direction, from which the list of targets in the user's visual field space is derived. At the same time, the visual sensor acquires an image of the targets in the field of view, and the point-of-interest target is determined through image recognition. Using the point-of-interest target's ID, its attributes and related comment information are obtained in real time from the network database, and this information is displayed, augmented, in the user's visual field space through a graphics rendering algorithm and the optical module; like a billboard, it is placed above the target entity and keeps the correct geometric perspective relation, as if it were really mounted on the target. Through user attribute definitions and access permissions, information isolation, sharing and privacy protection can be ensured among different social groups. The user can browse the information through interaction means such as voice, gestures and eye movements and view it in multimedia form; after the real experience, the user can generate comment information such as text, pictures, voice and video through the various interaction modes, and this information is stored in the network space database, where other users can browse it according to their permissions. In this way, information is shared and exchanged across space and time, producing a novel augmented reality social mode.
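Read as a per-frame pipeline, the flow above is: perceive, identify, fetch, render, and optionally collect a comment. The stub-based sketch below only illustrates that control flow; every function body is a placeholder standing in for the corresponding module.

```python
# Sketch of the per-frame processing loop described above, tying the modules
# together: perceive -> identify -> fetch -> render -> (optionally) comment.
def perceive():
    """Environment perception: position, gaze and the camera image (stub)."""
    return {"lat": 39.08, "lon": 117.20, "gaze": (1, 0, 0)}, b"camera-frame"


def identify(image, pose):
    """Target identification: appearance + position -> unique target ID (stub)."""
    return "statue-001"


def fetch(target_id, user_id):
    """Network service: permission-checked attributes and comments (stub)."""
    return {"attributes": {"name": "Scenic-spot statue"}, "comments": []}


def render(info, pose):
    """Enhanced display: billboard-style overlay kept in correct perspective (stub)."""
    print(f"overlay at gaze {pose['gaze']}: {info['attributes']['name']}")


def maybe_collect_comment(user_id, target_id):
    """Human-computer interaction: voice/gesture comment entry (skipped here)."""
    return None


def frame(user_id="alice"):
    pose, image = perceive()
    target_id = identify(image, pose)
    info = fetch(target_id, user_id)
    if info is not None:
        render(info, pose)
    maybe_collect_comment(user_id, target_id)


frame()
```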
The following are two typical examples.
the method is an application scene aiming at cultural relics of certain scenic spots. When a user approaches and watches the cultural relic, corresponding attribute information, famous sentences written by different names of different languages for the user can be displayed on the periphery of the cultural relic through the augmented reality glasses, and even the representation of the related classical events can be presented on the periphery of the real object, and even the interaction can be generated with the real object, so that the cognitive understanding of the user on the cultural relic is greatly enhanced; the user can make comments and thoughts of the user and even leave multimedia information of the user, and the information is stored in a database associated with the cultural relic through a network.
The second is an application scenario for a shopping venue. When a user is looking for certain goods or is keen on a particular specialty food, the user can find the shop's location by following the guidance provided by the augmented reality glasses, with the directions automatically superimposed into the user's visual field space. When there are many candidate options, the user can make a sound choice based on historical comments on the venue and the goods; besides text and voice, these comments can be displayed in multimedia forms such as pictures, graphics and video, making the decision more intuitive. The user may of course also submit their own comment information, which is stored in the network database and associated with the venue and its physical location.
An augmented reality-based social method comprises the following steps:
(1) the user takes an interest in a target in the real physical world in an outdoor environment and sees, through the augmented reality glasses, the attributes of the target and various comments about it; this information is superimposed on the target and keeps the correct perspective relation, as if it existed in the real space;
(2) the user adds comment information to the target by means such as text and voice, and the information is stored in a network database in text or multimedia form;
(3) other users can share the data stored in the network database according to their permissions, and the shared information is presented on the target in augmented reality.
This social approach naturally binds person-to-person communication about a topic to a target or location in physical space, which makes the information both easy to relate to and intuitive to visualize, as if it were a large advertising screen placed on top of the target.
Virtual information can share a single physical space without being limited by the size of that space, and information belonging to different social groups can be managed according to permissions; to a certain extent this strategy allows the information (the "advertising screens") to be stored and managed without limit.

Claims (12)

1. An augmented reality based social system, comprising the following modules: an environment perception module, a target identification module, a network service module, an enhanced display module and a human-computer interaction module;
the environment perception module acquires the geometric structure of the surrounding environment through vision, depth-camera and radar means, thereby achieving accurate three-dimensional reconstruction of the physical world, while also perceiving the user's position, posture and visual field parameters in real time;
the target identification module extracts the appearance characteristics of the target based on the target image acquired by the sensing module, and realizes the accurate identification of the target by combining the current position and the network geographic information;
the network service module retrieves the target related attributes and the hot comment information thereof in the wireless network cloud data server according to the target ID;
the enhanced display module organizes the return information of the network cloud database in a multimedia mode, wherein the multimedia mode comprises characters, pictures, graphs, voice and video, is enhanced and displayed in a user visual field space and is aligned with an actual physical space position;
the human-computer interaction module is used for enabling the user to obtain multimedia information about the target through watching and listening, and for supporting the user in commenting on the target and inputting evaluation information through voice and gesture interaction means;
the environment perception module realizes perception of surrounding environment three-dimensional information, obtains the current position and the view point sight direction of a user, obtains a target list in a user view space, simultaneously obtains a target image in a view area, further determines an interest point target through an image recognition technology, obtains attributes and related comment information of the interest point target in real time from a network database through an interest point target ID, and the information is displayed in the user view space in an enhanced mode, is placed on the upper portion of a target entity and keeps a correct geometric perspective relation.
2. An augmented reality based social system as claimed in claim 1, wherein: the environment perception module is based on an environment three-dimensional reconstruction result and an Inertial Measurement Unit (IMU) signal, and realizes determination of the absolute position of an observer and acquisition of the current sight direction of the observer through a Beidou and Global Positioning System (GPS).
3. An augmented reality based social system as claimed in claim 1 or 2, wherein: the environment perception module is formed by one or more of a camera, a depth camera, a small radar and an IMU.
4. An augmented reality based social system as claimed in claim 1, wherein: the target identification module determines the unique ID identification of the target based on the target identification result so as to be used for network information query and retrieval.
5. An augmented reality based social system as claimed in claim 1 or 4, wherein: the target identification module is composed of a computing unit taking a CPU and a GPU as cores.
6. An augmented reality based social system as claimed in claim 1, wherein: the network service module determines whether the user can access the information according to the authority of the user and returns corresponding information according to the authority.
7. An augmented reality based social system as claimed in claim 1 or 6, wherein: the network service module is composed of a mobile communication unit and a corresponding software service interface.
8. An augmented reality based social system as claimed in claim 1, wherein: the enhanced display module superimposes the multimedia information to the user visual field in a user interface mode and realizes the alignment with the real target.
9. An augmented reality based social system as claimed in claim 1 or 8, wherein: the enhanced display module is composed of an optical processing unit, and the optical processing unit comprises a free-form surface or an optical waveguide.
10. An augmented reality based social system as claimed in claim 1, wherein: the human-computer interaction module converts information into character, voice or picture information, and can store the information into the cloud data server for other users to share and access.
11. An augmented reality based social system as claimed in claim 1 or 10, wherein: the human-computer interaction module is composed of wearable input units of voice, gestures, eye movements and myoelectricity, and the modules are connected in a wired or wireless mode to ensure that a complete wearable software and hardware system is formed.
12. An augmented reality-based social method, characterized by: comprises the following steps of (a) carrying out,
(1) the user sees the attributes and various comments about the target through the augmented reality glasses in the outdoor environment, and the information is superposed on the target and keeps a correct perspective relation; the environment perception module realizes perception of surrounding environment three-dimensional information, acquires the current position and the view point sight direction of a user, acquires a target list in a user view space, simultaneously acquires a target image in a view area, further determines an interest point target through an image recognition technology, acquires the attribute and related comment information of the interest point target in real time from a network database through an interest point target ID, and the information is enhanced and displayed in the user view space, is arranged on the upper part of a target entity and keeps a correct geometric perspective relation;
(2) the user adds comment information to the target by means of characters and voice, and the information is stored in a network database in a multimedia mode;
(3) and sharing the data stored in the network database according to the authority by other users, and presenting the shared information on the target in an augmented reality manner.
CN201910681955.0A 2019-07-26 2019-07-26 Social contact method and system based on augmented reality Active CN110531847B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910681955.0A CN110531847B (en) 2019-07-26 2019-07-26 Social contact method and system based on augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910681955.0A CN110531847B (en) 2019-07-26 2019-07-26 Social contact method and system based on augmented reality

Publications (2)

Publication Number Publication Date
CN110531847A CN110531847A (en) 2019-12-03
CN110531847B true CN110531847B (en) 2020-07-14

Family

ID=68661919

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910681955.0A Active CN110531847B (en) 2019-07-26 2019-07-26 Social contact method and system based on augmented reality

Country Status (1)

Country Link
CN (1) CN110531847B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111445314A (en) * 2020-02-19 2020-07-24 上海萃钛智能科技有限公司 Visual communication system and visual communication method for mixed reality shopping environment
CN111340598B (en) * 2020-03-20 2024-01-16 北京爱笔科技有限公司 Method and device for adding interactive labels
CN111562845B (en) * 2020-05-13 2022-12-27 如你所视(北京)科技有限公司 Method, device and equipment for realizing three-dimensional space scene interaction
WO2021228200A1 (en) * 2020-05-13 2021-11-18 贝壳技术有限公司 Method for realizing interaction in three-dimensional space scene, apparatus and device
CN113010009B (en) 2021-02-08 2022-07-22 北京蜂巢世纪科技有限公司 Object sharing method and device
CN112947756A (en) * 2021-03-03 2021-06-11 上海商汤智能科技有限公司 Content navigation method, device, system, computer equipment and storage medium
CN113573085B (en) * 2021-07-21 2023-12-19 广州繁星互娱信息科技有限公司 Virtual resource acquisition method and device, storage medium and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9977572B2 (en) * 2014-04-01 2018-05-22 Hallmark Cards, Incorporated Augmented reality appearance enhancement
CN109934929A (en) * 2017-12-15 2019-06-25 深圳梦境视觉智能科技有限公司 The method, apparatus of image enhancement reality, augmented reality show equipment and terminal
CN109144239B (en) * 2018-06-13 2021-12-14 华为技术有限公司 Augmented reality method, server and terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103002410A (en) * 2012-11-21 2013-03-27 北京百度网讯科技有限公司 Augmented reality method and system for mobile terminals and mobile terminals
CN104063039A (en) * 2013-03-18 2014-09-24 朱慧灵 Human-computer interaction method of wearable computer intelligent terminal
CN103942049A (en) * 2014-04-14 2014-07-23 百度在线网络技术(北京)有限公司 Augmented reality realizing method, client-side device and server
CN107247510A (en) * 2017-04-27 2017-10-13 成都理想境界科技有限公司 A kind of social contact method based on augmented reality, terminal, server and system
CN107493228A (en) * 2017-08-29 2017-12-19 北京易讯理想科技有限公司 A kind of social interaction method and system based on augmented reality
CN109976519A (en) * 2019-03-14 2019-07-05 浙江工业大学 A kind of interactive display unit and its interactive display method based on augmented reality

Also Published As

Publication number Publication date
CN110531847A (en) 2019-12-03

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant