CN103105993A - Method and system for realizing interaction based on augmented reality technology - Google Patents
Method and system for realizing interaction based on augmented reality technology
- Publication number
- CN103105993A CN103105993A CN2013100301095A CN201310030109A CN103105993A CN 103105993 A CN103105993 A CN 103105993A CN 2013100301095 A CN2013100301095 A CN 2013100301095A CN 201310030109 A CN201310030109 A CN 201310030109A CN 103105993 A CN103105993 A CN 103105993A
- Authority
- CN
- China
- Prior art keywords
- terminal
- information
- image
- augmented reality
- relation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Multimedia (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
A method for realizing interaction based on augmented reality technology comprises: obtaining an image of a real scene shot by a first terminal; obtaining the position information of the first terminal and the geographic position of a second terminal, the position information including the geographic coordinates and the orientation of the first terminal; matching the position information against the geographic position; displaying the information of each successfully matched second terminal on the image of the real scene shot by the first terminal; and interacting according to the displayed information of the second terminal. In this method, an image is formed by shooting the real scene, the information of the matched second terminals is displayed on that image, and interaction is carried out as needed. Because the virtual information of a second terminal is displayed on the real-scene image according to that terminal's geographic position, interaction is convenient: a user can communicate with nearby second terminals and, at the same time, obtain their concrete geographic positions and bearings. The invention further provides a system for realizing interaction based on augmented reality technology.
Description
Technical field
The present invention relates to the field of information technology, and in particular to a method and system for realizing interaction based on augmented reality.
Background technology
Augmented reality (AR) is a new technology developed on the basis of virtual reality, also referred to as mixed reality. It augments the user's perception of the real world with information supplied by a computer system: virtual objects, scenes or system prompts generated by the computer are superimposed on the real scene, thereby enhancing reality. Through such a system the user not only experiences the immersive realism of the objective physical world but can also break through the constraints of space, time and other objective limits to experience what cannot be experienced in person in the real world. Augmented reality has already been applied to navigation, information query, entertainment and so on; for example, it can present merchant information or play entertainment animations, and can even show the user the current location and nearby restaurants, tourist attractions and bus stops.
At present, however, augmented reality can only provide information one-way, which limits its range of application.
Summary of the invention
Accordingly, to address the problem that augmented reality can only provide information one-way, it is necessary to provide a method and system for realizing interaction based on augmented reality.
A method for realizing interaction based on augmented reality comprises:
obtaining an image of a real scene shot by a first terminal;
obtaining the position information of the location of the first terminal and the geographic position of a second terminal, the position information including geographic coordinates and the orientation of the first terminal;
matching the position information against the geographic position;
displaying the information of each successfully matched second terminal on the image of the real scene shot by the first terminal; and
interacting according to the displayed information of the second terminal.
A system for realizing interaction based on augmented reality comprises:
an image acquisition module, which obtains an image of a real scene shot by a first terminal;
a position acquisition module, which obtains the position information of the location of the first terminal and the geographic position of a second terminal;
a matching module, which matches the position information against the geographic position, the position information including geographic coordinates and the orientation of the first terminal;
a display module, which displays the information of each successfully matched second terminal on the image of the real scene shot by the first terminal; and
an interaction module, which interacts according to the displayed information of the second terminal.
In the above method and system for realizing interaction based on augmented reality, an image is formed by shooting a real scene, the information of the successfully matched second terminals is displayed on the image of the real scene shot by the first terminal, and interaction can then be carried out as needed. Because the virtual information of a second terminal is displayed on the real-scene image according to that terminal's actual geographic position, interaction is convenient: while communicating with the users holding nearby second terminals, the concrete geographic position and bearing of each second terminal can also be obtained.
Description of drawings
Fig. 1 is a flow chart of the method for realizing interaction based on augmented reality according to an embodiment;
Fig. 2 is a schematic diagram of the relative positions of the second terminals and the first terminal according to an embodiment;
Fig. 3 is a schematic diagram of the relative positions, on the image, of the information of the second terminals of Fig. 2 that successfully match the first terminal;
Fig. 4 is a module diagram of the system for realizing interaction based on augmented reality according to an embodiment;
Fig. 5 is a schematic diagram of the matching module of an embodiment.
Embodiment
Referring to Fig. 1, the method for realizing interaction based on augmented reality according to an embodiment comprises the following steps:
Step S110: obtain an image of the real scene shot by the first terminal.
When the method is carried out, the camera carried by the first terminal is opened automatically, the real scene faced by the camera is filmed, and the image of the real scene is presented on the display of the first terminal. The scope of the displayed real-scene image is determined by what the camera of the first terminal captures: when the user adjusts the zoom, orientation, tilt angle and so on of the camera lens, the captured image of the real scene changes accordingly.
Step S120: obtain the position information of the location of the first terminal and the geographic position of the second terminal, the position information including geographic coordinates and the orientation of the first terminal.
The first terminal may be a mobile phone, tablet computer, notebook computer, e-reader, smart watch, smart vehicle window and so on. The position information of the first terminal may comprise the geographic coordinates of its location and its orientation. The geographic coordinates can be obtained through the Global Positioning System (GPS) carried by the first terminal, or by other means such as the BeiDou satellite navigation system or base-station positioning. When the first terminal is a mobile phone, base-station positioning applied to mobile-phone users is called the cellular base-station positioning service, also known as the Location Based Service (LBS); the position information of the user (including longitude and latitude coordinates) is obtained through the network of the mobile operator (for example the GSM network). The orientation of the first terminal can be identified by its digital compass: under the geomagnetic field the compass needle stays on the tangential direction of the magnetic meridian with its north pole pointing to geographic north, and this property is used to determine the bearing. When the first terminal is, for example, smart glasses, a smart watch or smart in-vehicle equipment, the digital compass can be used to determine the orientation of the camera; and when the camera of the first terminal is separate from its display, the orientation of the camera is taken as the reference for direction.
The geographic position of the second terminal comes from a background database. When the first terminal is opened, the geographic positions of the second terminals are searched for in the background database over networks such as GPRS, 3G and WLAN. The background database contains the geographic position information of all second terminals, of frequently contacted second terminals, or of some grouping of second terminals. During the search a certain search scope can be set as needed, for example second terminals within 10 kilometers of the first terminal's location, or within a certain angular range of the first terminal's location.
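The search-scope restriction described above can be illustrated with a minimal Python sketch. The function names and the dictionary-based database are hypothetical (the patent does not specify data structures); the distance test uses the standard haversine formula on latitude/longitude coordinates.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_range(first_pos, second_positions, radius_m=10_000):
    """Restrict the background-database search to second terminals
    within radius_m of the first terminal (10 km in the patent's example)."""
    lat0, lon0 = first_pos
    return {tid: pos for tid, pos in second_positions.items()
            if haversine_m(lat0, lon0, *pos) <= radius_m}
```

Such a pre-filter keeps the later, finer matching step from iterating over every terminal in the database.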
Step S130: match the position information against the geographic position.
The distance and bearing relation between a second terminal and the first terminal is calculated from the position information of the first terminal's location and the geographic position of the second terminal, and it is then judged whether this relation satisfies a preset requirement. When it does, the second terminal is judged to match the first terminal successfully. In this embodiment, the preset requirement is that the distance satisfies a first predetermined value and that the deviation between the bearing relation and the orientation of the first terminal satisfies a second predetermined value. The first and second predetermined values may be fixed in advance, or may be adjusted automatically according to the tilt angle of the first terminal or its camera parameters (such as focal length and angle of view): for a wide-angle lens the second predetermined value can be enlarged, and if the first terminal is rotated from portrait to landscape the first predetermined value can be reduced while the second predetermined value is enlarged; the two can also be adjusted automatically from the tilt angle and the camera parameters together. This is equivalent to letting the user make custom settings as needed. In other embodiments, the preset requirement may only require that the deviation between the bearing relation and the orientation of the first terminal satisfy the second predetermined value. For example, if the first terminal is determined to face due north and the second predetermined value is 30°, then any second terminal within the range from 30° east of north to 30° west of north is taken as a successfully matched second terminal.
The following illustrates the case where the preset requirement covers both the distance (first predetermined value) and the bearing deviation (second predetermined value). As shown in Fig. 2, in reality the first terminal is located at point A, facing due north; second terminal Jia is located at B, 170 meters from A and 30° west of north; second terminal Yi is located at C, 80 meters from A and 15° east of north; second terminal Bing is located at D, 20 meters from A and 80° west of north; and second terminal Ding is located at E, 120 meters from A and 70° east of north. When the first predetermined value is 200 meters and the second predetermined value is 35°, the bearing deviations of B, C, D and E from the orientation of the first terminal are 30°, 15°, 80° and 70° respectively, so only the distances and bearing deviations of second terminals Jia and Yi satisfy the preset requirement, and Jia and Yi each match the first terminal successfully. The preset requirement thus sets a filtering rule that screens the second terminals, so that, as far as possible, only nearby second terminals that meet the requirement and fall within the scope of the real image are taken as successfully matched second terminals.
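The matching rule of this embodiment can be sketched in a few lines of Python. The terminal names and the (distance, bearing) tuple representation are illustrative only; compass bearings are in degrees clockwise from north, so 30° west of north is 330°. The scenario reproduces the Fig. 2 example (first predetermined value 200 m, second predetermined value 35°).

```python
def angular_deviation(facing_deg, bearing_deg):
    """Smallest absolute difference between two compass bearings, in degrees."""
    d = abs(facing_deg - bearing_deg) % 360
    return min(d, 360 - d)

def match_terminals(facing_deg, candidates, max_distance_m, max_deviation_deg):
    """Keep the second terminals whose distance and bearing deviation
    both satisfy the preset requirement (first and second predetermined values)."""
    return [name for name, dist_m, bearing_deg in candidates
            if dist_m <= max_distance_m
            and angular_deviation(facing_deg, bearing_deg) <= max_deviation_deg]

# Fig. 2 scenario: first terminal at A faces due north (0°).
# N30°W = 330°, N15°E = 15°, N80°W = 280°, N70°E = 70°.
candidates = [("Jia", 170, 330), ("Yi", 80, 15),
              ("Bing", 20, 280), ("Ding", 120, 70)]
print(match_terminals(0, candidates, 200, 35))  # → ['Jia', 'Yi']
```

As in the worked example, Bing fails on bearing deviation (80° > 35°) despite being nearest, and Ding fails on deviation (70° > 35°) despite being within range.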
Step S140: display the information of the successfully matched second terminals on the image of the real scene shot by the first terminal.
The information of the second terminals that passed the screening and matched the first terminal successfully is displayed on the captured real-scene image. Through the above screening and matching process, only the second-terminal information that meets the preset requirement is shown, so a single page is not flooded with a huge amount of information; this improves system speed and also saves data traffic. The information of each second terminal is laid out on the image according to the relation between its geographic position and the position information of the first terminal: the relative positions of the second-terminal information items on the image are consistent with the relative positions of the second terminals' actual geographic locations, so the user can quickly grasp from the display where each second terminal is in the real world. When the preset requirement demands both that the distance satisfy the first predetermined value and that the bearing deviation satisfy the second predetermined value, the center point of the image can be taken as the base point, and the second-terminal information distributed on the image according to each second terminal's distance from the first terminal and its angular deviation. In other embodiments, when the preset requirement only demands that the bearing deviation satisfy the second predetermined value, the distribution of the second-terminal information on the image need only reflect the angular-deviation relation.
The following illustrates the case where the preset requirement covers both distance and bearing deviation. From the real-world positions of the second terminals and the first terminal shown in Fig. 2 and the above preset requirement, second terminals Jia and Yi match the first terminal successfully, and their information is displayed in the image at the positions shown in Fig. 3: point A' is the location of the first terminal, point B' carries the information of second terminal Jia, and point C' carries the information of second terminal Yi; the displayed relative positions are consistent with the real relative positions of the second terminals and the first terminal. Further, the information of different second terminals can be superimposed on the real-scene image as layers stacked by distance: the information of the nearest second terminal is arranged on the top layer and that of more distant second terminals on lower layers, in order. This highlights the augmented-reality effect and makes subsequent operations more convenient. In other embodiments, where the image-processing capability permits, the information of the different second terminals can also be embedded directly in the image.
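One possible layout scheme is sketched below. The patent only requires that relative on-image positions mirror relative real positions and that nearer terminals stack on top; the linear mapping of bearing deviation to horizontal pixels across the camera's half field of view, and the fixed vertical position, are assumptions of this sketch.

```python
def place_on_image(matches, facing_deg, fov_deg, img_w, img_h):
    """Assign each matched second terminal an on-screen x position whose
    horizontal offset mirrors its bearing deviation from the image center,
    and a layer index so nearer terminals are drawn on top (higher layer)."""
    def signed_dev(bearing_deg):
        # Deviation in (-180, 180]: negative = left of center, positive = right.
        return (bearing_deg - facing_deg + 180) % 360 - 180

    ordered = sorted(matches, key=lambda m: m[1])  # nearest first
    placed = []
    for layer, (name, dist_m, bearing_deg) in enumerate(reversed(ordered)):
        x = img_w / 2 + signed_dev(bearing_deg) / (fov_deg / 2) * (img_w / 2)
        placed.append({"name": name, "x": round(x), "y": img_h // 2,
                       "layer": layer})  # nearest terminal ends with highest layer
    return placed
```

With the Fig. 2 matches, a 70° field of view and a 700-pixel-wide image, Jia (30° west of north) lands left of center and Yi (15° east of north) right of center, with Yi, being nearer, on the upper layer.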
Step S150: interact according to the displayed information of the second terminal.
According to the displayed information of the second terminals, which includes avatar, distance, recent activity, bearing and so on, the user can select a second terminal to interact with: for example, click the avatar of a second terminal, follow its recent activity, leave a message or make a voice call, or check nearby frequently used second terminals and set up a group for convenient group activities. On receiving the user's instruction, the first terminal sends the corresponding text, voice, video or picture information to the corresponding second terminal, thereby realizing interaction.
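The dispatch of a user-selected interaction might look like the following minimal sketch. The terminal identifiers, the `transport` callable and the message format are all hypothetical, since the patent does not specify the messaging channel; only the four content kinds come from the description.

```python
def send_interaction(first_id, second_id, kind, payload, transport):
    """Forward a user-selected interaction from the first terminal to a
    matched second terminal. `transport` is any callable that delivers
    a message dict (network layer left abstract on purpose)."""
    if kind not in {"text", "voice", "video", "picture"}:
        raise ValueError(f"unsupported interaction kind: {kind}")
    message = {"from": first_id, "to": second_id,
               "kind": kind, "payload": payload}
    transport(message)
    return message
```

In practice `transport` would wrap whatever network the terminals share (GPRS, 3G, WLAN in the patent's examples).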
In the above method for realizing interaction based on augmented reality, an image is formed by shooting a real scene, the information of the successfully matched second terminals is displayed on the image, and interaction with the second terminals can then be carried out as needed. Because the virtual information of a second terminal is displayed on the real-scene image according to its actual geographic position, interaction is convenient: while communicating with the users holding nearby second terminals, the concrete geographic position and bearing of each second terminal can also be obtained. In a team activity, for example, one can chat with the other users holding second terminals and also know their bearings in real time, which makes it easy to act in concert.
In addition, as shown in Fig. 4, a system for realizing interaction based on augmented reality is also provided, comprising an image acquisition module 210, a position acquisition module 220, a matching module 230, a display module 240 and an interaction module 250.
The first terminal carries a camera. The image acquisition module 210 opens the camera carried by the first terminal, films the real scene faced by the camera, and presents the image of the real scene on the display of the first terminal. The scope of the displayed real-scene image is determined by what the camera captures: when the user adjusts the zoom, orientation, tilt angle and so on of the camera lens, the captured image of the real scene changes accordingly.
The first terminal may be a mobile phone, tablet computer, notebook computer and so on. The position information of the first terminal's location may comprise the geographic coordinates of that location. The position acquisition module 220 obtains the geographic coordinates through the GPS carried by the first terminal, or by other means such as the BeiDou satellite navigation system or base-station positioning. When the first terminal is a mobile phone, base-station positioning applied to mobile-phone users is called the cellular base-station positioning service, also known as the Location Based Service (LBS); the position information of the user (including longitude and latitude coordinates) is obtained through the network of the mobile operator (for example the GSM network). The position information may also comprise the orientation of the first terminal, identified by its digital compass: under the geomagnetic field the compass needle stays on the tangential direction of the magnetic meridian with its north pole pointing to geographic north, and this property is used to determine the bearing.
The geographic position of the second terminal comes from a background database. When the first terminal is opened, the geographic positions of the second terminals are searched for in the background database over networks such as GPRS, 3G and WLAN. The background database contains the geographic position information of all second terminals, of frequently contacted second terminals, or of some grouping of second terminals; during the search a certain scope can be set as needed, for example second terminals within 10 kilometers of the first terminal's location.
The matching module 230 matches the position information against the geographic position.
The distance and bearing relation between a second terminal and the first terminal is calculated from the position information of the first terminal's location and the geographic position of the second terminal; it is then judged whether this relation satisfies a preset requirement, and when it does the second terminal is judged to match the first terminal successfully. Referring to Fig. 5, the matching module 230 comprises a calculating unit 232 and a matching unit 234. The calculating unit 232 calculates the distance and bearing relation between the second terminal and the first terminal from the position information and the geographic position; if this relation satisfies the preset requirement, the matching unit 234 takes the second terminal as a successfully matched second terminal. In this embodiment, the preset requirement comprises that the distance satisfies a first predetermined value, which can be determined from distance information comprising two figures, a starting point and an end point; the preset requirement further comprises that the deviation between the bearing relation and the orientation of the first terminal satisfies a second predetermined value. The first and second predetermined values may be fixed in advance, or adjusted automatically according to the tilt angle of the first terminal or its camera parameters (such as focal length and angle of view): for a wide-angle lens the second predetermined value can be enlarged, and if the first terminal is rotated from portrait to landscape the first predetermined value can be reduced while the second predetermined value is enlarged; the two can also be adjusted automatically from the tilt angle and the camera parameters together. This is equivalent to letting the user make custom settings as needed. The preset requirement may also only require that the bearing deviation satisfy the second predetermined value: for example, if the first terminal faces due north and the second predetermined value is 30°, any second terminal within the range from 30° east of north to 30° west of north is taken as a successfully matched second terminal.
The following illustrates the case where the preset requirement covers both distance and bearing deviation. As shown in Fig. 2, the first terminal is at point A facing due north; second terminal Jia is at B, 170 meters from A and 30° west of north; Yi is at C, 80 meters from A and 15° east of north; Bing is at D, 20 meters from A and 80° west of north; and Ding is at E, 120 meters from A and 70° east of north. With a first predetermined value of 200 meters and a second predetermined value of 35°, the bearing deviations of B, C, D and E from the orientation of the first terminal are 30°, 15°, 80° and 70° respectively, so only the distances and bearing deviations of Jia and Yi satisfy the preset requirement, and Jia and Yi each match the first terminal successfully. The preset requirement thus sets a filtering rule that screens the second terminals so that, as far as possible, only nearby second terminals that meet the requirement and fall within the scope of the real image are taken as successfully matched.
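The computation assigned to the calculating unit 232 can be sketched as follows: from the two terminals' latitude/longitude coordinates, derive the great-circle distance and the initial compass bearing. The formulas (haversine distance, forward-azimuth bearing) are standard choices, not specified by the patent.

```python
import math

def distance_bearing(lat1, lon1, lat2, lon2):
    """Distance in metres and initial compass bearing in degrees
    (clockwise from north) from the first terminal to a second terminal."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    # Haversine great-circle distance.
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    dist_m = 2 * 6371000.0 * math.asin(math.sqrt(a))
    # Initial bearing (forward azimuth), normalised to [0, 360).
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing_deg = math.degrees(math.atan2(y, x)) % 360
    return dist_m, bearing_deg
```

The matching unit 234 would then compare the returned pair against the first and second predetermined values.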
The information of the second terminals that passed the screening and matched the first terminal successfully is displayed on the captured real-scene image. Through the screening and matching process, only the second-terminal information that meets the preset requirement is shown, so a single page is not flooded with a huge amount of information; this improves system speed and also saves data traffic. The display module 240 comprises a relative-position corresponding unit, which displays the information of the second terminals on the image according to their distance and bearing relation to the first terminal, so that the relative positions of the information items on the image are consistent with the relative positions of the second terminals themselves. The information of each second terminal is laid out according to the relation between its geographic position and the position information of the first terminal, so the user can quickly grasp from the display where each second terminal is in the real world. When the preset requirement demands both that the distance satisfy the first predetermined value and that the bearing deviation satisfy the second predetermined value, the center point of the image can be taken as the base point and the second-terminal information distributed on the image according to each second terminal's distance from the first terminal and its angular deviation; in other embodiments, when only the bearing-deviation condition applies, the distribution need only reflect the angular-deviation relation.
In the example of Fig. 2 under the preset requirement above, second terminals Jia and Yi match the first terminal successfully and their information is displayed at the positions shown in Fig. 3: point A' is the location of the first terminal, point B' carries the information of second terminal Jia and point C' that of second terminal Yi, the displayed relative positions being consistent with the real relative positions of the second terminals and the first terminal. Further, the display module 240 also comprises a superposing unit, which superimposes the information of the second terminals on the real-scene image as layers stacked by distance: the information of the nearest second terminal is arranged on the top layer and that of more distant second terminals on lower layers, in order. This highlights the augmented-reality effect and makes subsequent operations more convenient. In other embodiments, where the image-processing capability permits, the information of the different second terminals can also be embedded directly in the image.
According to the displayed information of the second terminals, which includes avatar, distance, recent activity, bearing and so on, the user can select a second terminal to interact with: click its avatar, follow its recent activity, leave a message or make a voice call, or check nearby frequently used second terminals and set up a group for convenient group activities. On receiving the user's instruction, the interaction module 250 sends the corresponding text, voice, video or picture information to the corresponding second terminal, thereby realizing interaction.
In the above system for realizing interaction based on augmented reality, an image is formed by shooting a real scene, the information of the successfully matched second terminals is displayed on the image, and interaction with the second terminals can then be carried out as needed. Because the virtual information of a second terminal is displayed on the real-scene image according to its actual geographic position, interaction is convenient: while communicating with the users holding nearby second terminals, their concrete geographic positions and bearings can also be obtained. In a team activity, for example, one can chat with the other users holding second terminals and also know their bearings in real time, which makes it easy to act in concert.
The above embodiments express only several implementations of the present invention, and their description is comparatively specific and detailed, but they should not therefore be interpreted as limiting the scope of the claims of the present invention. It should be pointed out that a person of ordinary skill in the art can make various modifications and improvements without departing from the inventive concept, and these all belong to the protection scope of the present invention. Therefore, the protection scope of this patent shall be determined by the appended claims.
Claims (12)
1. A method for realizing interaction based on augmented reality, comprising:
obtaining an image of a real scene shot by a first terminal;
obtaining positional information of the location of the first terminal and a geographic position of a second terminal, the positional information comprising geographic coordinates and an orientation of the first terminal;
matching the positional information with the geographic position;
displaying information of the successfully matched second terminal on the image of the real scene shot by the first terminal; and
carrying out interaction according to the displayed information of the second terminal.
2. The method for realizing interaction based on augmented reality according to claim 1, wherein the step of matching the positional information with the geographic position comprises:
calculating a distance and azimuth relation between the second terminal and the first terminal according to the positional information and the geographic position; and
if the distance and azimuth relation between the second terminal and the first terminal satisfies a preset requirement, taking the second terminal as a successfully matched second terminal.
3. The method for realizing interaction based on augmented reality according to claim 2, wherein the preset requirement is that the distance satisfies a first predetermined value and the deviation between the azimuth relation and the orientation of the first terminal satisfies a second predetermined value.
4. The method for realizing interaction based on augmented reality according to claim 3, wherein the second predetermined value is automatically adjusted according to the tilt angle of the first terminal and/or automatically adjusted according to camera parameters of the first terminal.
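Claim 4 leaves the adjustment rule unspecified. One plausible reading, shown here purely as an illustration with an invented function name, derives the second predetermined value (the allowed angular deviation) from half the camera's horizontal field of view and narrows it as the device tilts away from the horizontal plane:

```python
import math

def second_predetermined_value_deg(horizontal_fov_deg, tilt_deg):
    """Hypothetical adjustment rule: start from half the horizontal
    field of view (a terminal outside the FOV cannot appear in the
    image) and scale down by the cosine of the tilt angle."""
    return (horizontal_fov_deg / 2.0) * math.cos(math.radians(tilt_deg))
```

For a 60-degree field of view this gives a 30-degree threshold when the device is held level, shrinking as it is tilted up or down.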
5. The method for realizing interaction based on augmented reality according to any one of claims 2 to 4, wherein the step of displaying the information of the successfully matched second terminal on the image of the real scene shot by the first terminal comprises:
displaying the information of the second terminal on the image according to the distance and azimuth relation between the second terminal and the first terminal, so that the relative position relation of the information of the second terminals on the image is consistent with the actual relative position relation of the second terminals.
6. The method for realizing interaction based on augmented reality according to claim 5, wherein the step of displaying the information of the successfully matched second terminal on the image further comprises:
superimposing the information of the second terminal on the image of the real scene in the form of a layer.
7. A system for realizing interaction based on augmented reality, comprising:
an image acquisition module, which obtains an image of a real scene shot by a first terminal;
a position acquisition module, which obtains positional information of the location of the first terminal and a geographic position of a second terminal;
a matching module, which matches the positional information with the geographic position, the positional information comprising geographic coordinates and an orientation of the first terminal;
a display module, which displays information of the successfully matched second terminal on the image of the real scene shot by the first terminal; and
an interaction module, which carries out interaction according to the displayed information of the second terminal.
8. The system for realizing interaction based on augmented reality according to claim 7, wherein the matching module comprises:
a calculation unit, which calculates a distance and azimuth relation between the second terminal and the first terminal according to the positional information and the geographic position; and
a matching unit, which, if the distance and azimuth relation between the second terminal and the first terminal satisfies a preset requirement, takes the second terminal as a successfully matched second terminal.
9. The system for realizing interaction based on augmented reality according to claim 8, wherein the preset requirement is that the distance satisfies a first predetermined value and the deviation between the azimuth relation and the orientation of the first terminal satisfies a second predetermined value.
10. The system for realizing interaction based on augmented reality according to claim 9, wherein the second predetermined value is automatically adjusted according to the tilt angle of the first terminal and/or automatically adjusted according to camera parameters of the first terminal.
11. The system for realizing interaction based on augmented reality according to any one of claims 8 to 10, wherein the display module comprises:
a relative-position corresponding unit, which displays the information of the second terminal on the image according to the distance and azimuth relation between the second terminal and the first terminal, so that the relative position relation of the information of the second terminals on the image is consistent with the actual relative position relation of the second terminals.
12. The system for realizing interaction based on augmented reality according to claim 11, wherein the display module further comprises:
a superimposing unit, which superimposes the information of the second terminal on the image of the real scene in the form of a layer.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310030109.5A CN103105993B (en) | 2013-01-25 | 2013-01-25 | Method and system for realizing interaction based on augmented reality technology |
PCT/CN2013/089651 WO2014114151A1 (en) | 2013-01-25 | 2013-12-17 | Method and system for performing interaction based on augmented reality |
US14/371,996 US10049494B2 (en) | 2013-01-25 | 2014-07-11 | Method and system for performing interaction based on augmented reality |
US16/034,130 US10127736B1 (en) | 2013-01-25 | 2018-07-12 | Method and system for performing interaction based on augmented reality |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310030109.5A CN103105993B (en) | 2013-01-25 | 2013-01-25 | Method and system for realizing interaction based on augmented reality technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103105993A true CN103105993A (en) | 2013-05-15 |
CN103105993B CN103105993B (en) | 2015-05-20 |
Family
ID=48313902
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310030109.5A Active CN103105993B (en) | 2013-01-25 | 2013-01-25 | Method and system for realizing interaction based on augmented reality technology |
Country Status (3)
Country | Link |
---|---|
US (2) | US10049494B2 (en) |
CN (1) | CN103105993B (en) |
WO (1) | WO2014114151A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016076651A1 (en) * | 2014-11-12 | 2016-05-19 | 주식회사 퓨처플레이 | Method, system, and computer-readable recording medium for providing content by at least one of plurality of devices on basis of angular relationship between plurality of devices |
US10607420B2 (en) | 2017-08-30 | 2020-03-31 | Dermagenesis, Llc | Methods of using an imaging apparatus in augmented reality, in medical imaging and nonmedical imaging |
CN108471533B (en) * | 2018-03-21 | 2020-11-27 | 万物共算(成都)科技有限责任公司 | High-precision positioning method suitable for AR |
US10782651B2 (en) * | 2018-06-03 | 2020-09-22 | Apple Inc. | Image capture to provide advanced features for configuration of a wearable device |
WO2020211077A1 (en) * | 2019-04-19 | 2020-10-22 | Orange | Method for assisting the acquisition of media content at a scene |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102123194A (en) * | 2010-10-15 | 2011-07-13 | 张哲颖 | Method for optimizing mobile navigation and man-machine interaction functions by using augmented reality technology |
CN102495959A (en) * | 2011-12-05 | 2012-06-13 | 无锡智感星际科技有限公司 | Augmented reality (AR) platform system based on position mapping and application method |
CN102884490A (en) * | 2010-03-05 | 2013-01-16 | 索尼电脑娱乐美国公司 | Maintaining multiple views on a shared stable virtual space |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FI115943B (en) * | 2003-12-12 | 2005-08-15 | Nokia Corp | Arrangement for presenting information on a monitor |
US8730156B2 (en) | 2010-03-05 | 2014-05-20 | Sony Computer Entertainment America Llc | Maintaining multiple views on a shared stable virtual space |
US20100214111A1 (en) * | 2007-12-21 | 2010-08-26 | Motorola, Inc. | Mobile virtual and augmented reality system |
US8711176B2 (en) * | 2008-05-22 | 2014-04-29 | Yahoo! Inc. | Virtual billboards |
US7966024B2 (en) * | 2008-09-30 | 2011-06-21 | Microsoft Corporation | Virtual skywriting |
US9571625B2 (en) * | 2009-08-11 | 2017-02-14 | Lg Electronics Inc. | Electronic device and control method thereof |
KR101648339B1 (en) * | 2009-09-24 | 2016-08-17 | 삼성전자주식회사 | Apparatus and method for providing service using a sensor and image recognition in portable terminal |
WO2011063034A1 (en) * | 2009-11-17 | 2011-05-26 | Rtp, Llc | Systems and methods for augmented reality |
WO2011114330A1 (en) * | 2010-03-17 | 2011-09-22 | Hisep Technology Ltd. | Direction finding system device and method |
CN101833896B (en) * | 2010-04-23 | 2011-10-19 | 西安电子科技大学 | Geographic information guide method and system based on augment reality |
KR101606727B1 (en) * | 2010-06-25 | 2016-03-28 | 엘지전자 주식회사 | Mobile terminal and operation method thereof |
KR101372722B1 (en) * | 2010-06-30 | 2014-03-10 | 주식회사 팬택 | Mobile terminal and information display method using the same |
US8251819B2 (en) * | 2010-07-19 | 2012-08-28 | XMG Studio | Sensor error reduction in mobile device based interactive multiplayer augmented reality gaming through use of one or more game conventions |
JP5571498B2 (en) * | 2010-08-09 | 2014-08-13 | Necマグナスコミュニケーションズ株式会社 | Ambient information search device and ambient information search method |
KR101350033B1 (en) * | 2010-12-13 | 2014-01-14 | 주식회사 팬택 | Terminal and method for providing augmented reality |
KR20120075624A (en) * | 2010-12-20 | 2012-07-09 | 한국전자통신연구원 | Terminal system, shopping system and method for shopping using the same |
KR20120080774A (en) * | 2011-01-10 | 2012-07-18 | 삼성전자주식회사 | Displaying method for displaying information of electro-field intensity and system thereof, and portable device supporting the same |
JP5170278B2 (en) * | 2011-04-07 | 2013-03-27 | ソニー株式会社 | Display control device, display control method, program, and display control system |
WO2012144389A1 (en) * | 2011-04-20 | 2012-10-26 | Necカシオモバイルコミュニケーションズ株式会社 | Individual identification character display system, terminal device, individual identification character display method, and computer program |
KR101611964B1 (en) * | 2011-04-28 | 2016-04-12 | 엘지전자 주식회사 | Mobile terminal and method for controlling same |
WO2013067513A1 (en) * | 2011-11-04 | 2013-05-10 | Massachusetts Eye & Ear Infirmary | Contextual image stabilization |
US20130142384A1 (en) * | 2011-12-06 | 2013-06-06 | Microsoft Corporation | Enhanced navigation through multi-sensor positioning |
US9052802B2 (en) * | 2012-03-02 | 2015-06-09 | Realtek Semiconductor Corp. | Multimedia interaction system and related computer program product capable of filtering multimedia interaction commands |
US9510292B2 (en) * | 2012-03-13 | 2016-11-29 | Qualcomm Incorporated | Limiting wireless discovery range |
KR101465974B1 (en) * | 2012-09-05 | 2014-12-10 | 주식회사 팬택 | Method and apparatus for position detecting and communication of device |
KR102019124B1 (en) * | 2013-01-04 | 2019-09-06 | 엘지전자 주식회사 | Head mounted display and method for controlling the same |
CN103105993B (en) * | 2013-01-25 | 2015-05-20 | 腾讯科技(深圳)有限公司 | Method and system for realizing interaction based on augmented reality technology |
- 2013-01-25 CN CN201310030109.5A patent/CN103105993B/en active Active
- 2013-12-17 WO PCT/CN2013/089651 patent/WO2014114151A1/en active Application Filing
- 2014-07-11 US US14/371,996 patent/US10049494B2/en active Active
- 2018-07-12 US US16/034,130 patent/US10127736B1/en active Active
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014114151A1 (en) * | 2013-01-25 | 2014-07-31 | Tencent Technology (Shenzhen) Company Limited | Method and system for performing interaction based on augmented reality |
US10049494B2 (en) | 2013-01-25 | 2018-08-14 | Tencent Technology (Shenzhen) Company Limited | Method and system for performing interaction based on augmented reality |
CN104333564A (en) * | 2013-07-22 | 2015-02-04 | 腾讯科技(深圳)有限公司 | Target operation method, system and device |
CN104333845A (en) * | 2013-07-22 | 2015-02-04 | 腾讯科技(深圳)有限公司 | Method, device, equipment and system of finding target |
CN104333845B (en) * | 2013-07-22 | 2019-11-22 | 腾讯科技(深圳)有限公司 | Target lookup method, device, equipment and system |
CN104183015B (en) * | 2014-09-02 | 2017-04-05 | 中国商用飞机有限责任公司北京民用飞机技术研究中心 | For the method and apparatus for replacing part nacelle physics window in aircraft |
CN105468249B (en) * | 2014-09-09 | 2019-01-08 | 联胜(中国)科技有限公司 | Intelligent interaction system and its control method |
CN104284155A (en) * | 2014-10-16 | 2015-01-14 | 浙江宇视科技有限公司 | Video image information labeling method and device |
CN104284155B (en) * | 2014-10-16 | 2017-08-15 | 浙江宇视科技有限公司 | Video image information mask method and device |
CN105871952A (en) * | 2015-01-20 | 2016-08-17 | 阿里巴巴集团控股有限公司 | Method and device for information processing |
CN105204347A (en) * | 2015-06-18 | 2015-12-30 | 丰唐物联技术(深圳)有限公司 | Method, device and system for smart home interaction based on augmented reality technologies |
CN105045084A (en) * | 2015-07-10 | 2015-11-11 | 常熟恒基科技有限公司 | Beidou positioning technology-based smart watch |
CN105323252A (en) * | 2015-11-16 | 2016-02-10 | 上海璟世数字科技有限公司 | Method and system for realizing interaction based on augmented reality technology and terminal |
CN105651300A (en) * | 2015-11-30 | 2016-06-08 | 东莞酷派软件技术有限公司 | Information obtaining method and electronic facility |
CN106982240A (en) * | 2016-01-18 | 2017-07-25 | 腾讯科技(北京)有限公司 | The display methods and device of information |
CN106354248A (en) * | 2016-05-16 | 2017-01-25 | 刘瑞雪 | Augmented reality system and interactive system for obtaining user information and the method thereof |
CN105991658A (en) * | 2016-06-24 | 2016-10-05 | 湖南汇博电子技术有限公司 | Terminal interaction method and server |
CN109997094A (en) * | 2016-10-04 | 2019-07-09 | 乐威指南公司 | System and method for rebuilding the reference picture from media asset |
CN109997094B (en) * | 2016-10-04 | 2023-08-04 | 乐威指南公司 | System and method for reconstructing a reference image from a media asset |
CN106780753A (en) * | 2016-11-22 | 2017-05-31 | 宇龙计算机通信科技(深圳)有限公司 | A kind of augmented reality register device and its method |
CN106846311A (en) * | 2017-01-21 | 2017-06-13 | 吴东辉 | Positioning and AR method and system and application based on image recognition |
CN106846311B (en) * | 2017-01-21 | 2023-10-13 | 吴东辉 | Positioning and AR method and system based on image recognition and application |
WO2018153158A1 (en) * | 2017-02-22 | 2018-08-30 | 中兴通讯股份有限公司 | Communication method, mobile terminal and server |
CN108712360A (en) * | 2017-04-12 | 2018-10-26 | 朱恩辛 | A kind of transboundary interactive friend-making system and its method |
CN107450088A (en) * | 2017-06-08 | 2017-12-08 | 百度在线网络技术(北京)有限公司 | A kind of location Based service LBS augmented reality localization method and device |
CN107450088B (en) * | 2017-06-08 | 2021-05-14 | 百度在线网络技术(北京)有限公司 | Location-based service LBS augmented reality positioning method and device |
US11164379B2 (en) | 2017-06-08 | 2021-11-02 | Baidu Online Network Technology (Beijing) Co., Ltd. | Augmented reality positioning method and apparatus for location-based service LBS |
CN107276892A (en) * | 2017-08-02 | 2017-10-20 | 北京人亩田网络科技有限公司 | Location-based information sharing method and system |
CN107895330A (en) * | 2017-11-28 | 2018-04-10 | 特斯联(北京)科技有限公司 | A kind of visitor's service platform that scenario building is realized towards smart travel |
CN107895330B (en) * | 2017-11-28 | 2018-10-26 | 特斯联(北京)科技有限公司 | A kind of tourist's service platform for realizing scenario building towards smart travel |
CN108182228A (en) * | 2017-12-27 | 2018-06-19 | 北京奇虎科技有限公司 | User social contact method, device and the computing device realized using augmented reality |
CN108596971B (en) * | 2018-04-27 | 2024-03-19 | 北京小米移动软件有限公司 | Image display method and device |
CN108596971A (en) * | 2018-04-27 | 2018-09-28 | 北京小米移动软件有限公司 | Image display method and apparatus |
CN109522503B (en) * | 2018-09-29 | 2021-04-27 | 东南大学 | Tourist attraction virtual message board system based on AR and LBS technology |
CN109522503A (en) * | 2018-09-29 | 2019-03-26 | 东南大学 | The virtual message board system in tourist attractions based on AR Yu LBS technology |
CN111044061A (en) * | 2018-10-12 | 2020-04-21 | 腾讯大地通途(北京)科技有限公司 | Navigation method, device, equipment and computer readable storage medium |
US11798234B2 (en) | 2019-07-19 | 2023-10-24 | Huawei Technologies Co., Ltd. | Interaction method in virtual reality scenario and apparatus |
CN113973235A (en) * | 2020-07-22 | 2022-01-25 | 上海哔哩哔哩科技有限公司 | Interactive information display method and device and computer equipment |
WO2022068364A1 (en) * | 2020-09-29 | 2022-04-07 | 北京字跳网络技术有限公司 | Information exchange method, first terminal device, server and second terminal device |
Also Published As
Publication number | Publication date |
---|---|
US20150187142A1 (en) | 2015-07-02 |
US10049494B2 (en) | 2018-08-14 |
CN103105993B (en) | 2015-05-20 |
WO2014114151A1 (en) | 2014-07-31 |
US20180322707A1 (en) | 2018-11-08 |
US10127736B1 (en) | 2018-11-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103105993B (en) | Method and system for realizing interaction based on augmented reality technology | |
US11709067B2 (en) | User controlled directional interface processing | |
US9013505B1 (en) | Mobile system representing virtual objects on live camera image | |
US9043318B2 (en) | Mobile terminal and photo searching method thereof | |
US9582937B2 (en) | Method, apparatus and computer program product for displaying an indication of an object within a current field of view | |
CN102339579B (en) | Guide system | |
CN106878949B (en) | Positioning terminal, system and method based on double cameras | |
CN101924992A (en) | Method, system and equipment for acquiring scene information through mobile terminal | |
CN103826201A (en) | Geographical position-based virtual interaction method and system thereof | |
US20080171558A1 (en) | Location-Based Service Method and System Using Location Data Included in Image Data | |
US20240161370A1 (en) | Methods, apparatus, systems, devices, and computer program products for augmenting reality in connection with real world places | |
WO2017071476A1 (en) | Image synthesis method and device, and storage medium | |
CN107483830B (en) | Photo shooting control method and system based on mobile terminal and storage medium | |
KR102120786B1 (en) | Method of providing location based video content and apparatus therefor | |
CN105871826A (en) | Method and device for sharing geographic position between terminals | |
CN105682031A (en) | Method and device for automatically switching network positioning services | |
US20150371449A1 (en) | Method for the representation of geographically located virtual environments and mobile device | |
CN113532442A (en) | Indoor AR pedestrian navigation method | |
CA2573319C (en) | Directional location system for a portable electronic device | |
CN102946476A (en) | Rapid positioning method and rapid positioning device | |
CN107071278A (en) | Terminal and star orbital shooting method | |
US9366545B2 (en) | Directional location system for a portable electronic device | |
CN104469674B (en) | The method and apparatus of recording geographical position | |
CN105704662A (en) | Adaptive network positioning method and apparatus thereof | |
KR20110136529A (en) | System and method for providing augumented reality information using video transmission |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |