CN107276883A - Terminal device, network-side device, and information interaction method for mixed reality - Google Patents
Terminal device, network-side device, and information interaction method for mixed reality
- Publication number
- CN107276883A CN107276883A CN201710463511.0A CN201710463511A CN107276883A CN 107276883 A CN107276883 A CN 107276883A CN 201710463511 A CN201710463511 A CN 201710463511A CN 107276883 A CN107276883 A CN 107276883A
- Authority
- CN
- China
- Prior art keywords
- information
- user
- identified
- back end
- identification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/56—Unified messaging, e.g. interactions between e-mail, instant messaging or converged IP messaging [CPM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Security & Cryptography (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention provides a terminal device for mixed reality, a network-side device, and an information interaction method based on object recognition. The terminal device includes: an authentication module configured to verify user information and/or activate a user; an environment sensing module configured to collect environmental information around the terminal device; an object recognition module configured to perform object recognition based on the environmental information; a tool module configured to generate, according to a user's input instruction, identification information and interaction information associated with the recognized object; a display module configured to display the interaction information three-dimensionally; and a communication module configured to send the environmental information, the identification information associated with the recognized object, the interaction information, and the user information. On this basis, the presentation and transfer of interaction information between users changes from the traditional two-dimensional form to a three-dimensional form, the user has a stronger sense of immersion when receiving information, and the user experience is markedly improved.
Description
Technical field
The present invention relates to the field of communication technology and, more specifically, to a terminal device for mixed reality, a network-side device, and an information interaction method based on object recognition.
Background art
Information interaction between users is currently carried out mostly through various instant messaging (Instant Messenger, IM) systems/software/tools, such as QQ and WeChat. Using these instant messaging tools, users can exchange text, pictures, voice, and video with one another.
Non-instant communication systems such as server-based e-mail and blogs can also realize information interaction between users. For example, users can send e-mail to each other or leave comments under other people's blogs, thereby realizing non-instant information interaction between users.
In the prior art, whether users communicate through instant or non-instant messaging, the interaction gives the user a weak sense of immersion; in other words, the user's experience is not profound. For example, user A sends user B a message whose content is A's impressions of a certain place. User B is not personally at the place described by user A, and can only experience what A felt through A's description and B's own imagination. The experience such information interaction brings the user is therefore not profound, or rather has essentially no immersion, so that the user's sense of immersion is weak.
In addition, the means of information interaction between existing users is limited to two-dimensional forms of information organization such as text, pictures, voice, and video. Because of their inherent two-dimensional nature, such information is often disconnected from the surrounding environment at the moment the user receives it; in other words, the experience given to the user is not three-dimensional, which likewise weakens the sense of immersion the user perceives.
Summary of the invention
An object of the present invention is to provide the user with a new mode of information interaction through the combination of object recognition and network-side services. When the user receives and sends information, the interacted information is merged with the environment in which the user is located and is presented to the user by means of mixed reality, so that the user receives information in a three-dimensional way and the user's experience gains a stronger sense of immersion.
According to an aspect of the present invention, a terminal device for mixed reality includes:
an authentication module configured to verify user information and/or activate a user;
an environment sensing module configured to collect environmental information around the terminal device;
an object recognition module configured to perform object recognition based on the environmental information;
a tool module configured to generate, according to a user's input instruction and the recognition result, identification information and interaction information associated with the recognized object;
a display module configured to display the interaction information three-dimensionally; and
a communication module configured to send the environmental information, the identification information associated with the recognized object, the interaction information, and the user information.
According to another aspect of the present invention, a network-side device for mixed reality includes:
an authentication module configured to register a user and/or verify user information;
a node generation module configured to generate a data node based on the environmental information, the identification information associated with the recognized object, and the interaction information, the data node being associated with the user information;
a node management module configured to classify and/or retrieve the generated data nodes based on environmental information; and
a communication module configured to receive the environmental information, the identification information associated with the recognized object, the interaction information, and the user information, and to send the data node associated with the user information or the environmental information.
According to yet another aspect of the invention, an information interaction method based on object recognition includes:
activating a user and/or verifying user information;
collecting surrounding environmental information;
performing object recognition based on the environmental information;
generating and/or editing, based on a user's input instruction, identification information and interaction information associated with the recognized object;
displaying the interaction information three-dimensionally; and
sending the environmental information, the identification information associated with the recognized object, the interaction information, and the user information.
According to still other aspects of the invention, an information interaction method based on object recognition includes:
receiving environmental information, identification information and interaction information associated with a recognized object, and user information;
registering a user and/or verifying user information;
generating a data node based on the environmental information and the identification information and interaction information associated with the recognized object, the data node being associated with the user information;
classifying and/or retrieving the generated data nodes based on environmental information; and
sending the data node associated with the user information or the environmental information.
With the terminal device for mixed reality, the network-side device, and the information interaction method based on object recognition according to embodiments of the present invention, the presentation and transfer of interaction information between users changes from the traditional two-dimensional form to a three-dimensional form, so that information can be presented in diversified forms. In addition, the interaction information, the environment in which the user is located, and the objects in that environment complement and reinforce one another, so that the user has a stronger sense of immersion when receiving information and the user experience is markedly improved.
Brief description of the drawings
Fig. 1 is a structural block diagram of a mixed reality scenario according to an embodiment of the present invention;
Fig. 2 is a structural block diagram of a terminal device for mixed reality according to an embodiment of the present invention;
Fig. 3 is a structural block diagram of a network-side device for mixed reality according to one embodiment of the invention;
Fig. 4 is a structural block diagram of a network-side device for mixed reality according to another embodiment of the present invention;
Fig. 5 is a flowchart of an information interaction method based on object recognition according to one embodiment of the invention; and
Fig. 6 is a flowchart of an information interaction method based on object recognition according to another embodiment of the present invention.
Detailed description
Embodiments of the invention are described in detail below with reference to the accompanying drawings and specific examples.
The information interaction method and devices based on object recognition according to embodiments of the present invention are generally applicable to mixed reality (Mixed Reality, MR) scenarios, i.e., scenarios that introduce real-scene information into a virtual environment and establish an interactive feedback information loop among the virtual world, the real world, and the user, so as to enhance the realism of the user experience.
In addition, the information interaction method and devices based on object recognition according to embodiments of the present invention are also applicable to scenarios similar to mixed reality, for example scenarios in which virtual three-dimensional data can be displayed together with real objects.
A structural block diagram of a mixed reality (Mixed Reality, MR) scenario according to an embodiment of the present invention is shown in Fig. 1. Virtual three-dimensional data is exchanged between the user and the mixed reality device. The mixed reality device can sense the environment and recognize objects. The cloud management system can generate and manage data nodes based on the data sent by the mixed reality device, and send the data nodes to the mixed reality device, finally realizing the interaction of virtual three-dimensional data between users.
It should be noted that the mixed reality device described here generally refers to a smart device with one or more environment sensing modules and an intelligent recognition module, including but not limited to a smartphone, smart glasses, a smart helmet, and the like.
The environment sensing module described here includes but is not limited to a camera, an inertial measurement unit (Inertial Measurement Unit, IMU), a lidar, a global positioning system (Global Positioning System, GPS), and various sensors (for example, a temperature sensor, a pressure sensor, a voltage sensor, a current sensor), etc.
The intelligent recognition module described here may be any of various modules with functions such as image recognition and object recognition.
In the present invention, the mixed reality device is also referred to as a terminal device for mixed reality, terminal equipment for mixed reality, a terminal device, or terminal equipment.
It is worth noting that the cloud management system described herein can generate and manage data nodes based on the data sent by the mixed reality device, and send the data nodes to the mixed reality device.
The cloud management system may be a cloud server, a cloud management system, a cloud management server, a server, or another management system. In the present invention, the cloud management system is also referred to as a network-side device for mixed reality, network-side equipment for mixed reality, a network-side device, or network-side equipment.
Fig. 2 shows a structural block diagram of a mixed reality device 10 according to an embodiment of the present invention.
As shown in Fig. 2, a mixed reality device 10 includes, but is not limited to, the following functional modules:
A user login module 11 (also referred to as an "authentication module"), configured to verify user information (also referred to as "account information") and/or activate a user, and to send the user information to the cloud management system through the communication module 16. For example, an unregistered user is prompted to register and then activated; for a registered user, the user information is verified.
An environment sensing module 12, configured to collect environmental information around the terminal device, for example geographical position information, image information, motion information, orientation information, temperature information, pressure information, current information, voltage information, and the like.
An intelligent recognition module 13 (also referred to as an "object recognition module"), configured to perform object recognition based on the environmental information.
It should be noted that in another embodiment of the invention, the object recognition module 13 is further configured to recognize objects in the environment around the terminal device jointly with the cloud management system 20 (i.e., the network side).
For example, when the performance of the terminal device is poor, so that recognizing an object would take a long time, or when the environment of the terminal device is so complex that the object to be recognized is difficult to identify, the terminal device can send the data of the object to be recognized to the cloud management system 20 through the communication module 16. The cloud management system 20 recognizes and classifies the object through a deep learning network deployed in the cloud and returns the recognition result to the object recognition module 13 of the mixed reality device 10.
A tool module 14, configured to generate, according to the user's input instruction and the recognition result, identification information (also referred to as "Marker information" or "Marker data") and interaction information associated with the recognized object. In other words, after the user has registered or been verified, the tool module 14 is responsible for providing the user with a series of specific tools, interfaces, or ports with which the user is free to create Marker data and interaction information data. The specific tools, interfaces, or ports here include but are not limited to CAD, CG software, and the like.
It should be noted that in another embodiment of the invention, the tool module 14 may be further configured to extract, based on the environmental information, Marker data and/or interaction information data from the data nodes sent by the cloud management system 20, and to edit the extracted Marker data and/or interaction information data.
Here, the specific meaning of Marker data includes characterizing two-dimensional image data, characterizing three-dimensional graphics data, a characterizing trademark, characterizing packaging, or any other data set used to characterize a particular object.
Here, the specific meaning of interaction information data includes text, pictures, voice, video, and virtual three-dimensional data produced with specific software (for example, data exported from software such as CAD or CG tools).
A display module 15, configured to display the interaction information three-dimensionally by means of mixed reality. The interaction information is displayed in combination with the recognized object; for example, the interaction information is anchored at a specific position on the virtual image of the recognized object and presented to the user together with that virtual image, so that the stereoscopic image the user sees differs depending on the viewing angle.
It should be noted that in another embodiment of the invention, the display module 15 can also three-dimensionally display, by means of mixed reality, interaction information associated with the recognized object, for example interaction information associated with that object made by other users.
It should be noted that in yet another embodiment of the present invention, the display module 15 can also three-dimensionally display, by means of mixed reality, the content in the content store of the cloud management system 20.
A communication module 16, configured to send the environmental information, the identification information associated with the recognized object, the interaction information, and the user information to the cloud management system 20.
It should be noted that the communication between the communication module 16 and the cloud management system 20 may be a wired connection, or a wireless connection via WiFi, Bluetooth, near-field communication (NFC), or any other suitable type of wireless communication protocol. The communication module 16 may also connect to the cloud management system 20 via a network. The network may be a local area network, a wide area network, a wired network, a wireless network, a personal area network, or a combination thereof, and may also include the Internet.
Fig. 3 shows a structural block diagram of a cloud management system 20 according to an embodiment of the invention.
As shown in Fig. 3, the cloud management system 20 according to an embodiment of the invention includes, but is not limited to, the following functional modules:
A login module 21 (also referred to as an "authentication module"), configured to register a user and/or verify user information. When the user is a new user, the user is registered and a user profile is generated. When the user is a registered user, the user information is verified.
A node generation module 22, configured to generate a data node based on the environmental information received by the communication module 24 and the identification information and interaction information associated with the recognized object, to assign a unique identifier to the generated data node, and to associate the data node with the user information. For example, the data node is stored under the account corresponding to the user.
It should be noted that a data node specifically refers to a data set existing in the cloud management system 20. The data set here contains the above-mentioned Marker data and interaction information data, together with the user information data, environmental information data, and so on sent to the cloud by the mixed reality device, all necessarily associated with one another.
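As a hedged illustration (the patent does not prescribe a storage format), a data node of the kind module 22 generates might be modeled as a single record bundling Marker data, interaction information, environmental information, and the owning user under one unique identifier. The field names and the use of a UUID are assumptions made for the sketch.

```python
# Illustrative model of a data node (node generation module 22): one record
# holding the Marker data, interaction information, environment, and owning
# user, with a server-assigned unique identifier. All names are assumptions.
import uuid
from dataclasses import dataclass, field

@dataclass
class DataNode:
    marker_data: bytes        # identification info for the recognized object
    interaction_info: str     # text/picture/voice/video/3-D data reference
    environment: dict         # e.g. {"gps": (lat, lon), "image": ...}
    user_id: str              # account the node is stored under
    node_id: str = field(default_factory=lambda: uuid.uuid4().hex)

def generate_node(marker: bytes, info: str, env: dict, user_id: str) -> DataNode:
    """Assign a unique identifier and associate the node with the user."""
    return DataNode(marker_data=marker, interaction_info=info,
                    environment=env, user_id=user_id)
```

Each call yields a distinct `node_id`, mirroring the "assign one unique identifier per generated data node" behavior described above.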
A node management module 23, configured to classify, query, and manage the generated data nodes based on environmental information. The node management module 23 may be further configured to store the data received by the communication module 24, for example the Marker data and interaction information data therein.
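A minimal sketch of how module 23 might retrieve data nodes by environmental information, here by GPS proximity, so that a device at a place receives the nodes left at that place. The planar distance metric and the threshold are simplifying assumptions, not the patent's method.

```python
# Illustrative retrieval by environmental information (node management
# module 23): return the data nodes whose recorded GPS position lies near
# the querying device's position. Planar distance is a simplification.
import math

def nearby_nodes(nodes, query_gps, max_deg=0.001):
    """nodes: iterable of dicts whose 'environment' contains a 'gps' pair."""
    qlat, qlon = query_gps
    hits = []
    for node in nodes:
        lat, lon = node["environment"]["gps"]
        if math.hypot(lat - qlat, lon - qlon) <= max_deg:
            hits.append(node)
    return hits
```

A production system would more plausibly use a geospatial index and great-circle distance; the point here is only that environmental information is the retrieval key.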
A communication module 24, configured to receive from the mixed reality device 10 the environmental information, the identification information associated with the recognized object, the interaction information, and the user information, and to send the data node associated with the user information or the environmental information. The communication between the mixed reality device 10 and the cloud management system 20 has been described above and will not be repeated here.
Fig. 4 shows a structural block diagram of a cloud management system 20 according to another embodiment of the present invention.
The difference from Fig. 3 is that the cloud management system 20 in Fig. 4 further includes the following functional modules:
An auxiliary identification module 25, configured to generate, based on the received identification information associated with the recognized object, network-side identification information associated with the recognized object.
In this embodiment, the node generation module 22 is further configured to generate a data node jointly from the environmental information, the identification information associated with the recognized object, the network-side identification information, and the interaction information.
An assisted recognition module 26, configured to recognize and classify an object, based on the received data of the object to be recognized (for example, the identification information and interaction information associated with the object to be recognized), through a deep learning network deployed in the cloud, and to return the recognition result to the mixed reality device 10.
In other embodiments of the invention, the cloud management system 20 stores environmental data (also referred to as "environmental information") and the Marker data of objects, and the mixed reality device 10 performs joint recognition of an object together with the assisted recognition module 26 of the cloud management system 20 by combining both kinds of data.
A pricing and transaction module 27, configured to price a data node (for example, the interaction information in the data node) and to trade based on the data node (for example, the interaction information in the data node), so that users can trade interaction information data of their own making. The pricing of a data node can be done by the user, or can be completed by the cloud server according to the attributes of the data node.
The pricing and transaction module 27 can also be embodied in the form of a content store. The content store can, in combination with the mixed reality device 10, provide a browsing mode to the user, so that the user can browse the interaction information data in the store. In addition, the content store can also, in combination with the mixed reality device, provide trading functions, letting the user buy, or sell, interaction information data of the user's own making. The content store can be any kind of application store, online mall, or other platform for operations such as transaction management.
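The two pricing paths just described (a price set by the user, or a price derived by the server from the data node's attributes) can be illustrated with a toy rule. The attribute-based formula below is purely an assumption for the sketch; the patent does not specify one.

```python
# Illustrative pricing for module 27: a node's price is either set by the
# user or derived by the cloud server from node attributes. The server-side
# rule (base price plus a per-character charge) is an invented example.
from typing import Optional

def price_node(node: dict, user_price: Optional[float] = None) -> float:
    """Return a price for a data node's interaction information."""
    if user_price is not None:      # pricing completed by the user
        return user_price
    # pricing completed by the cloud server from node attributes
    base = 1.0
    return base + 0.01 * len(node.get("interaction", ""))
```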
A customized service module 28, configured to provide customized services based on the user's needs/preferences, for example customized identification information, customized interaction information, and/or customized technical support.
Fig. 5 shows a flowchart of an information interaction method based on object recognition according to an embodiment of the invention.
As shown in Fig. 5, the information interaction method based on object recognition involves information interaction among the user UE, the mixed reality device 10, and the cloud management system 20, and includes but is not limited to the following steps:
S1: UE initiates a registration or login request through the mixed reality device 10;
S2: The mixed reality device 10 requests the cloud management system 20 to activate the user or verify the user information;
S3: The cloud management system 20 returns the activation or verification result to the mixed reality device 10 according to the operation result;
S4: The mixed reality device 10 collects the surrounding environmental information;
S5: The mixed reality device 10 performs object recognition based on the environmental information;
S6: Based on the user's input instruction, the mixed reality device 10 generates Marker data and interaction information data associated with the recognized object;
S7: The mixed reality device 10 displays the interaction information three-dimensionally to UE by means of mixed reality; for example, the interaction information is anchored at a specific position on the virtual image of the recognized object and presented to UE together with that virtual image;
S8: The mixed reality device 10 sends the environmental information, the identification information associated with the recognized object, the interaction information, and the user information to the cloud management system 20;
S9: The cloud management system 20 generates a data node based on the environmental information, the identification information, and the interaction information; the data node has a unique identifier and is associated with the user information, for example by being stored in the user's profile.
In step S9, the cloud management system 20 may also classify, query, and manage the generated data nodes based on the environmental information.
In step S5, when various factors cause object recognition to take a long time, the mixed reality device 10 can send the data of the object to be recognized to the cloud management system 20; the cloud management system 20 then either recognizes the object independently from that data and returns the recognition result to the mixed reality device 10, or the mixed reality device 10 recognizes the object jointly with the cloud management system 20.
In step S6, the user can also browse the content store and choose interaction information therein, taking the chosen interaction information as the user's own, or further editing it on the basis of the chosen interaction information, thereby forming the user's own interaction information.
Before step S9, the information interaction method may further include a step of generating network-side Marker data based on the Marker data received from the mixed reality device 10. In that case, S9 is adjusted to generate the data node based on the environmental information, the Marker data, the interaction information data, and the network-side Marker data sent by the mixed reality device 10.
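The S1–S9 exchange can be walked through end-to-end with an in-memory stand-in for the cloud management system 20. The class, the dictionary payloads, and the return values are illustrative assumptions; the patent does not prescribe any wire format or API.

```python
# Illustrative end-to-end walk through steps S1-S9, with an in-memory
# stand-in for cloud management system 20. All structures are assumptions.
import uuid

class Cloud:  # stand-in for cloud management system 20
    def __init__(self):
        self.accounts = {}  # user_id -> list of data nodes

    def login(self, user_id):
        # S2/S3: activate a new user or verify a registered one.
        self.accounts.setdefault(user_id, [])
        return "ok"

    def save(self, user_id, marker, interaction, environment):
        # S8/S9: generate a data node with a unique identifier and store
        # it under the user's account.
        node = {"id": uuid.uuid4().hex, "marker": marker,
                "interaction": interaction, "environment": environment}
        self.accounts[user_id].append(node)
        return node["id"]

cloud = Cloud()
assert cloud.login("userA") == "ok"           # S1-S3: login/activation
env = {"gps": (31.2, 121.5)}                  # S4: sensed environment
marker = "object-O"                           # S5/S6: recognized object + Marker
node_id = cloud.save("userA", marker, "my thoughts here", env)  # S8/S9
```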
Applications of the information interaction method in concrete scenarios are illustrated by the following examples, so that those skilled in the art can readily understand the method; the application scenarios of the information interaction method according to embodiments of the present invention are not limited to these.
Example 1:
In real life, user A comes to a certain place L (such as a shop, a restaurant, a bar, etc.). User A finds place L very beautiful and is especially interested in an object O at place L (such as a furnishing at the shop entrance, or some furnishing inside the restaurant or bar), so A wants to leave A's present thoughts about object O at place L, so that other people who come to place L can perceive user A's thoughts about place L and object O. User A logs in to the cloud management system 20 using A's own mixed reality device 10 (such as a smartphone, smart glasses, a smart helmet, etc.). Through the tool module on the mixed reality device 10 combined with the auxiliary identification module of the cloud management system 20, user A makes Marker data for object O.
At the same time, user A records A's present thoughts as text (interaction information data) with the tool provided by the mixed reality device 10, after which user A chooses to save. The mixed reality device 10 then obtains the environmental information data of the current place L through the environment sensing module, for example the position information of place L and object O through GPS, and image information of place L and object O through the camera. The mixed reality device 10 sends the Marker data of object O, the interaction information data made by the user (text, in this example), and the environmental information of place L (GPS position information, image information, etc.), together with the account information of the current user A, to the cloud management system 20.
The cloud management system 20 receives the data set sent by the mixed reality device 10 (containing the account information of the current user A, the Marker information of object O, the interaction information, and the environmental information of place L) and processes it: it first generates a data node (containing the Marker information, the interaction information, and the environmental information), assigns a unique identifier to this newly generated data node, and then stores this data node under the account corresponding to user A. At the same time, the cloud management system 20 can place the interaction information data in the data node into the content store of the cloud management system 20.
After these operations are finished, the cloud management system 20 sends a message to the mixed reality device 10 to notify that saving succeeded. On receiving the message that saving succeeded, the mixed reality device 10 informs user A in some way (for example, sound, vibration, etc.) through the display module that the content A has just made has been saved successfully. At this point, user A can preview, through the display module of the mixed reality device 10, the interaction information A has just saved, displayed bound together with object O by means of mixed reality.
Example 2:
On the wall that many scenic spots and historical sites tourist attractions are often can see in actual life, or with other objects, always
There is visitor to write in various manners above, such as xxx has visited this place, I Love You by xxx.This behavior was both uncivil, also to name
Shenggu mark is a kind of infringement.
By using the method of the embodiment of the present invention, visitor B can instead make Marker data for a specific place or a specific object at the famous site, and at the same time use the tools provided by the mixed reality device to turn his or her own thoughts, such as "xxx's trip here", into interactive information data. The mixed reality device 10 sends the Marker data made by visitor B, the interactive information data, and the current environmental information data together to the cloud management system 20.
The cloud management system 20 receives the data set sent over by the mixed reality device 10 (containing the current visitor B's account information, the Marker information made by visitor B, the interactive information, and the environmental information) and processes it: it first generates a data node containing the Marker information, interactive information, and environmental information, assigns a unique identifier to this newly generated data node, and then stores the data node under visitor B's account. At the same time, the cloud management system 20 also places the interactive information data in the data node into the content store of the cloud management system.
After these operations are completed, the cloud management system 20 sends a message to the mixed reality device to report that saving succeeded. On receiving this message, the mixed reality device informs visitor B through the display module, in some manner, that the content just made has been saved successfully. Visitor B can then preview, through the display module of the mixed reality device, the interactive information just saved, displayed in the manner of mixed reality. In this way the scenic spot or historic site is protected, and visitor B's impressions are still recorded.
Fig. 6 shows a flow chart of an information interaction method based on object identification according to another embodiment of the present invention.
The difference from the information interaction method in Fig. 5 is that, after step S4 or step S5, the method shown in Fig. 6 further includes step S10: the cloud management system 20 queries data nodes based on the environmental information and sends the queried data nodes to the mixed reality device 10.
After step S5 and step S10, the method shown in Fig. 6 further includes step S11: the mixed reality device 10 extracts, based on the environmental information, the Marker data and interactive information data from the received data nodes.
After step S11, the method further includes step S12: the mixed reality device 10 three-dimensionally displays the interactive information to the user in the manner of mixed reality. The interactive information is displayed in combination with the identified object; for example, the interactive information is marked at a specific position on the virtual image of the identified object, so that the interactive information is presented to the user together with the virtual image of the identified object.
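The environment-based query of step S10 could, assuming the environmental information carries a GPS position, be sketched as a simple proximity search. The node layout and the 50 m radius are illustrative choices, not specified by the embodiment:

```python
import math

def distance_m(p, q):
    """Rough great-circle distance in metres between two (lat, lon)
    points; adequate for a proximity query at city scale."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    cos_angle = (math.sin(lat1) * math.sin(lat2) +
                 math.cos(lat1) * math.cos(lat2) * math.cos(lon2 - lon1))
    return 6371000.0 * math.acos(min(1.0, cos_angle))

def query_nodes(nodes, position, radius_m=50.0):
    """Step S10 sketch: return every data node whose stored GPS
    position lies within radius_m of the device's current position."""
    return [n for n in nodes
            if distance_m(n["environment"]["gps"], position) <= radius_m]
```

A richer implementation would also weigh the other environmental signals the patent mentions (image information, orientation), but position alone already narrows the candidate set.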
At this point, step S6, which follows step S12, is adjusted to: the mixed reality device 10 edits the extracted Marker data and interactive information data so as to form new Marker data and interactive information data. Editing here may mean adding new data on top of the extracted Marker data and interactive information data, or replacing part of the extracted Marker data and interactive information data. If the user is not interested in the extracted Marker data and interactive information data, the user can also generate new Marker data and interactive information data with the tools provided by the mixed reality device 10.
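The two editing modes described above (adding on top of the extracted data, or replacing part of it) might look like the following, under the assumption that interactive information is stored as a list of items; `edit_interaction` is a hypothetical helper name:

```python
def edit_interaction(node, new_items=None, replace=None):
    """Adjusted step S6 sketch: form new interactive information either
    by appending items to the extracted data or by replacing part of
    it. `replace` maps an old item to its substitute."""
    data = list(node["interaction"])          # work on a copy
    if replace:
        data = [replace.get(item, item) for item in data]
    if new_items:
        data.extend(new_items)
    return {**node, "interaction": data}      # original node left untouched
```

Returning a new node rather than mutating the old one mirrors the flow in Fig. 6, where the edited data is uploaded as fresh content in the following steps.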
For brevity, the steps S1-S5 and S7-S9 of the method shown in Fig. 6 that are the same as or similar to those in Fig. 5 are not repeated here.
The application of the information interaction method in a concrete scenario is illustrated by the following example. Those skilled in the art will understand that the application scenarios of the information interaction method according to embodiments of the present invention are not limited to it.
Example 3:
In real life, user B comes to a certain place L and opens the display module of the mixed reality device 10. The mixed reality device then sends the current environmental information obtained by the environment perception module (GPS position information and the image information provided by the camera) to the cloud management system 20. The cloud management system 20 queries all data nodes corresponding to the environmental information sent over by the mixed reality device 10 and sends the data nodes it finds to the mixed reality device 10.
The mixed reality device 10 first presents the received data nodes to user B through the display module, informing user B which data nodes exist at the current location L. At this point it does not yet show user B the interactive information of any specific object, but only roughly indicates how many data nodes there are.
At the same time, the mixed reality device 10 combines, through its object identification module, the data nodes returned from the cloud management system 20 with the environmental information to intelligently identify the objects in the current environment, and also performs joint object identification together with the auxiliary identification module of the cloud management system 20. When the mixed reality device 10, working together with the auxiliary identification module of the cloud management system 20, has identified a specific object O in the current environment, the mixed reality device extracts all the saved interactive information data from the data node corresponding to this object O and presents it to user B through the display module, bound to object O.
If user B is a registered user, user B can use the tools provided by the mixed reality device to add new interactive information (such as text, pictures, voice, or video) to the existing data node of object O. The mixed reality device 10 uploads the interactive information newly added by user B to the cloud management system 20, and the cloud management system 20 adds the newly added interactive information into the data node corresponding to object O. The next time a mixed reality device fetches this data node, it obtains the data node containing the newly added interactive information data.
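The registered-user append path just described could be sketched as follows; the registry check and the node layout are illustrative assumptions, not details fixed by the patent:

```python
def append_interaction(cloud_nodes, node_id, user, new_info, registered_users):
    """Only a registered user may add interactive information (text,
    picture, voice, video, ...) to the data node for an object; the
    next device that fetches the node then sees the new content."""
    if user not in registered_users:
        raise PermissionError("user must register before adding content")
    node = next(n for n in cloud_nodes if n["id"] == node_id)
    node["interaction"].append(new_info)  # mutate the stored node in place
    return node
```

Here the node is mutated in place on the cloud side, which is what makes the updated content visible to the next device that queries it.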
According to one embodiment of the invention, the user establishes secure communication with the cloud management system 20 through the mixed reality device 10 and registers and/or is verified. The mixed reality device 10 intelligently perceives the surrounding environment through the environment perception module. The user makes corresponding Marker data for a specific object in the current environment through the mixed reality device 10, and at the same time generates interactive information data through the mixed reality device 10, or selects interactive information data from the content store by connecting to the content store in the cloud management system 20. The mixed reality device 10 packs the Marker data made by the user, the interactive information data, and the current environmental information obtained by the environment perception module and passes them together to the cloud management system 20. The cloud management system 20 intelligently processes these information data, including: generating a data node for the data uploaded by each user, establishing a unique identifier for this data node, adding the interactive information data in this data node to the content store, and then classifying and arranging each data node.
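The final classifying-and-arranging step could, for example, index the data nodes by a coarse geographic cell so that later environment-based queries only scan one cell. This is one possible arrangement, assumed here for illustration, not the scheme the patent mandates:

```python
from collections import defaultdict

def classify_nodes(nodes, cell=0.01):
    """Group data nodes into coarse geographic cells (roughly 1 km at
    this cell size) keyed by a rounded (lat, lon) pair."""
    index = defaultdict(list)
    for node in nodes:
        lat, lon = node["environment"]["gps"]
        key = (round(lat / cell), round(lon / cell))
        index[key].append(node)
    return index
```

A production system would likely also classify by object category or user, but a location index is the natural fit for the environment-driven retrieval of step S10.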
According to another embodiment of the present invention, the user establishes secure communication with the cloud management system 20 through the login module of the mixed reality device 10 and registers and/or is verified. The cloud management system 20 verifies, through its login module, the user information sent over by the mixed reality device 10 and sends the verification result to the mixed reality device 10. If the user is a registered user, the cloud management system can, according to the user information and through the search function provided by the cloud management system 20, also send the data nodes corresponding to the current user to the mixed reality device 10. In addition, the cloud management system 20 can also send along the interactive information data that the current user has chosen in the content store to the mixed reality device 10. After receiving the verification result sent by the cloud management system 20, the mixed reality device 10 handles the different cases separately. If the user is not registered, the current user is prompted to register. If the user is registered, the device starts to receive the data in the current user's data nodes sent over by the cloud management system 20, as well as the interactive information data from the content store; at the same time, the mixed reality device 10 makes the tools in the tool module available to the current user and presents the received information data to the current user with the display module.
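The verification branch above (unregistered users are prompted to register; registered users receive their data nodes plus any content-store selections) might be sketched like this, with the cloud state modelled as a plain dictionary purely for illustration:

```python
def handle_login(cloud, user_info):
    """Classify the login outcome: prompt unregistered users to
    register; hand registered users their data nodes and any
    interactive data chosen from the content store."""
    if user_info not in cloud["registered"]:
        return {"status": "please_register"}
    return {
        "status": "ok",
        "nodes": cloud["accounts"].get(user_info, []),
        "content": cloud["selected_content"].get(user_info, []),
    }
```

On the device side, a `"please_register"` status would trigger the prompt, while `"ok"` unlocks the tool module and hands the payload to the display module.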
If the data nodes corresponding to the current user include interactive information data from the content store, the current user can edit these data with the tools provided by the mixed reality device 10 and then save them to the cloud management system. If the current user has not yet made a corresponding data node and interactive information data, the current user can perceive the surrounding environment with the environment perception module of the mixed reality device 10, and then make the corresponding Marker data for a specific object in the current environment with the tools provided by the tool module 14 of the mixed reality device 10 together with the auxiliary module 25 of the cloud management system 20. At the same time, the current user can generate interactive information data with the tools, or obtain interactive information data from the content store by connecting to the content store in the cloud management system 20. The mixed reality device 10 packs the Marker data, the interactive information data, and the current environmental information obtained by the environment perception module and passes them together to the cloud management system 20.
In summary, with the terminal device for mixed reality, the network side device, and the information interaction method based on object identification according to embodiments of the present invention, the presentation and transfer of interactive information between users change from the traditional two-dimensional form to three-dimensional, and information is presented in diversified forms. In addition, the interactive information between users, the environment in which the users are located, and the objects in that environment assist and complement one another, so that the user has a stronger sense of immersion when receiving information, and the user experience is markedly improved.
It should be appreciated that the functional modules mentioned above can be merged or further divided as needed. The devices and/or methods described herein are exemplary in nature, and these specific embodiments or examples are not to be considered limiting; many variations of them exist. Each step in the methods described herein may be performed in the order shown, performed in another order, performed in parallel, or omitted in some cases. Similarly, the order of the above processes may be changed.
The above embodiments further describe the purpose, technical solutions, and beneficial effects of the present invention in detail. It should be understood that the foregoing is only embodiments of the present invention and is not intended to limit the scope of protection of the present invention; any modification, equivalent substitution, improvement, etc. made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.
Claims (24)
1. A terminal installation for mixed reality, comprising:
an authentication module configured to verify user information and/or activate a user;
an environment perception module configured to collect environmental information around the terminal installation;
an object identification module configured to perform object identification based on the environmental information;
a tool module configured to generate and/or edit, according to the user's input instructions and the identification result, identification information and interactive information associated with the identified object;
a display module configured to three-dimensionally display the interactive information; and
a communication module configured to send the environmental information, the identification information associated with the identified object, the interactive information, and the user information.
2. The terminal installation according to claim 1, wherein the environmental information includes: geographical location information, image information, action information, and/or azimuth information.
3. The terminal installation according to claim 1, wherein the object identification module is further configured to identify, jointly with the network side, objects in the environment around the terminal installation.
4. The terminal installation according to claim 1, wherein the identification information is two-dimensional image data, three-dimensional graphic data, a trademark, and/or packaging that characterizes the identified object.
5. The terminal installation according to claim 1, wherein the interactive information includes: text, pictures, voice, video, and/or virtual three-dimensional data.
6. The terminal installation according to claim 1, wherein the tool module is further configured to extract, based on the environmental information, interactive information from a received data node.
7. The terminal installation according to claim 1, wherein the display module is further configured to three-dimensionally display the interactive information associated with the identified object that the communication module receives.
8. A network side apparatus for mixed reality, comprising:
an authentication module configured to register a user and/or verify user information;
a node generation module configured to generate a data node based on the environmental information and on the identification information and interactive information associated with the identified object, the data node being associated with the user information;
a node management module configured to classify and/or retrieve the generated data nodes based on the environmental information; and
a communication module configured to receive the environmental information, the identification information associated with the identified object, the interactive information, and the user information, and to send the data node associated with the user information or the environmental information.
9. The network side apparatus according to claim 8, further comprising an auxiliary module configured to generate, based on the received identification information associated with the identified object, network side identification information associated with the identified object, wherein the node generation module is further configured to generate the data node based on the environmental information, the identification information associated with the identified object, the network side identification information, and the interactive information.
10. The network side apparatus according to claim 8, further comprising an auxiliary identification module configured to identify the object based on the received identification information and interactive information associated with the object to be identified.
11. The network side apparatus according to claim 8, further comprising a pricing and transaction module configured to price the data node and to conduct transactions based on the data node.
12. The network side apparatus according to claim 8, further comprising a customized service module configured to provide, based on user requirements, customized identification information, customized interactive information, and/or customized technical support.
13. An information interaction method based on object identification, comprising:
activating a user and/or verifying user information;
collecting surrounding environmental information;
performing object identification based on the environmental information;
generating and/or editing, based on the user's input instructions, identification information and interactive information associated with the identified object;
three-dimensionally displaying the interactive information; and
sending the environmental information, the identification information associated with the identified object, the interactive information, and the user information.
14. The information interaction method according to claim 13, wherein the environmental information includes: geographical location information, image information, action information, and/or azimuth information.
15. The information interaction method according to claim 13, wherein identifying the objects in the surrounding environment includes identifying, jointly with the network side, the objects in the environment around the terminal installation.
16. The information interaction method according to claim 13, wherein the identification information associated with the identified object is two-dimensional image data, three-dimensional graphic data, a trademark, and/or packaging that characterizes the identified object.
17. The information interaction method according to claim 13, wherein the interactive information includes: text, pictures, voice, video, and/or virtual three-dimensional data.
18. The information interaction method according to claim 15, wherein identifying, jointly with the network side, the objects in the environment around the terminal installation includes:
the network side identifying the object based on the identification information and interactive information associated with the object to be identified; and
receiving the identification result of the network side.
19. The information interaction method according to claim 13, wherein after performing object identification based on the environmental information, the information interaction method further includes:
the network side retrieving and sending data nodes based on the environmental information;
extracting, based on the identified object, the interactive information in the received data nodes; and
three-dimensionally displaying the extracted interactive information.
20. An information interaction method based on object identification, comprising:
receiving environmental information, identification information and interactive information associated with an identified object, and user information;
registering a user and/or verifying the user information;
generating a data node based on the environmental information and on the identification information and interactive information associated with the identified object, the data node being associated with the user information;
classifying and/or retrieving the generated data nodes based on the environmental information; and
sending the data node associated with the user information or the environmental information.
21. The information interaction method according to claim 20, wherein generating the data node based on the environmental information and on the identification information and interactive information associated with the identified object further comprises:
generating, based on the received identification information associated with the identified object, network side identification information associated with the identified object; and
generating the data node based on the environmental information, the identification information associated with the identified object, the network side identification information, and the interactive information.
22. The information interaction method according to claim 20, further including:
identifying the object based on the received identification information and interactive information associated with the object to be identified.
23. The information interaction method according to claim 20, further including:
pricing the data node and conducting transactions based on the data node.
24. The information interaction method according to claim 20, further including:
providing, based on user requirements, customized identification information, customized interactive information, and/or customized technical support.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710463511.0A CN107276883B (en) | 2017-06-19 | 2017-06-19 | Terminal device, network side device and information interaction method for mixed reality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107276883A true CN107276883A (en) | 2017-10-20 |
CN107276883B CN107276883B (en) | 2020-09-25 |
Family
ID=60067948
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710463511.0A Active CN107276883B (en) | 2017-06-19 | 2017-06-19 | Terminal device, network side device and information interaction method for mixed reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107276883B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101031866A (en) * | 2004-05-28 | 2007-09-05 | 新加坡国立大学 | Interactive system and method |
CN104219617A (en) * | 2014-08-22 | 2014-12-17 | 腾讯科技(深圳)有限公司 | Service acquiring method and device |
CN106775198A (en) * | 2016-11-15 | 2017-05-31 | 捷开通讯(深圳)有限公司 | A kind of method and device for realizing accompanying based on mixed reality technology |
CN106843497A (en) * | 2017-02-24 | 2017-06-13 | 北京观动科技有限公司 | A kind of mixed reality information interacting method and device |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109274755A (en) * | 2018-10-12 | 2019-01-25 | 成都信息工程大学 | A kind of cross-node two-way in length and breadth intersection cooperation interaction method based on the cross-domain network of multicore |
CN114648907A (en) * | 2020-12-18 | 2022-06-21 | 山东新松工业软件研究院股份有限公司 | Multi-functional display device of distributing type industrial control system |
CN114648907B (en) * | 2020-12-18 | 2024-05-07 | 山东新松工业软件研究院股份有限公司 | Multifunctional display device of distributed industrial control system |
Also Published As
Publication number | Publication date |
---|---|
CN107276883B (en) | 2020-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109426333A (en) | A kind of information interacting method and device based on Virtual Space Scene | |
CN104426972B (en) | Terminal location sharing method and device | |
CN104537550B (en) | A kind of autonomous advertising method in internet based on augmented reality IP maps | |
CN110908504B (en) | Augmented reality museum collaborative interaction method and system | |
CN108234276A (en) | Interactive method, terminal and system between a kind of virtual image | |
CN105810132A (en) | Intelligent tour guide system of museum | |
CN104969591A (en) | Sharing of information common to two mobile device users over a near-field communication (nfc) link | |
CN103024497B (en) | Smart television is controlled to realize method, system and the switching method made friends by mobile phone | |
KR20140090191A (en) | Interaction method, user terminal, server, system and computer storage medium | |
CN102811179A (en) | Information provision method and system for social network | |
CN106982240A (en) | The display methods and device of information | |
CN106713438B (en) | Position sharing processing method and device | |
CN106843497A (en) | A kind of mixed reality information interacting method and device | |
CN103338253A (en) | Social interaction system based on hand-held terminal | |
CN107276883A (en) | Terminal installation, network side apparatus and information interacting method for mixed reality | |
KR102043274B1 (en) | Digital signage system for providing mixed reality content comprising three-dimension object and marker and method thereof | |
KR100687413B1 (en) | System and method for providing service for searching friends | |
CN104457765B (en) | Localization method, electronic equipment and server | |
CN104501797A (en) | Navigation method based on augmented reality IP map | |
CN107608517A (en) | A kind of scene interaction dating system and method based on geographical position | |
CN206848594U (en) | Intelligent glasses for mixed reality | |
KR101659066B1 (en) | Method, system and computer-readable recording medium for creating message containing virtual space and virtual object | |
CN109359179A (en) | Message treatment method, device, terminal device and computer storage medium | |
CN103813265B (en) | A kind of information sharing method based on location-based service | |
JP2009294804A (en) | Molding manufacturing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||