CN113470186A - AR interaction method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN113470186A (application number CN202110736104.9A)
- Authority
- CN
- China
- Prior art keywords
- special effect
- entity object
- entity
- displaying
- identification code
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present disclosure provides an AR interaction method, an AR interaction device, an electronic device, and a storage medium, where the AR interaction method includes: identifying graphic identification code information of the target entity object; the target entity object is any entity object in an entity object group, the entity object group comprises a plurality of entity objects, and the entity objects have the same first attribute characteristics; acquiring and displaying an AR special effect fragment matched with the target entity object based on the recognized graphic identification code information; the AR special effect fragment is one special effect fragment in the complete AR special effect corresponding to the entity object group; and after determining that the AR equipment identifies the graphic identification code information of all the entity objects in the entity object group, displaying the complete AR special effect corresponding to the entity object group.
Description
Technical Field
The present disclosure relates to the field of Augmented Reality (AR) technologies, and in particular, to an AR interaction method, an AR interaction apparatus, an electronic device, and a storage medium.
Background
AR technology fuses virtual information with the real world, superimposing virtual information and the real environment onto the same picture in real time. At present, AR technology is widely applied in various fields; for example, it can be applied to game interaction, industrial applications, event watching, and other scenes to enrich the user's experience in different scenarios.
However, although current AR technology is applied in many scenes, a corresponding AR experience is still lacking for certain scenarios with continuous storylines, such as philatelic (stamp-collecting) scenarios.
Disclosure of Invention
The embodiment of the disclosure at least provides an AR interaction method, an AR interaction device, electronic equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides an AR interaction method, including:
identifying graphic identification code information of the target entity object; the target entity object is any entity object in an entity object group, the entity object group comprises a plurality of entity objects, and the entity objects have the same first attribute characteristics;
acquiring and displaying an AR special effect fragment matched with the target entity object based on the recognized graphic identification code information; the AR special effect fragment is one special effect fragment in the complete AR special effect corresponding to the entity object group;
and after determining that the AR equipment identifies the graphic identification code information of all the entity objects in the entity object group, displaying the complete AR special effect corresponding to the entity object group.
In the embodiment of the disclosure, a target entity object in an entity object group may be identified, and the AR special effect fragment matched with the identified target entity object may be displayed. In addition, after it is determined that the AR device has identified all the entity objects in the entity object group, the complete AR special effect corresponding to the entity object group may be displayed. In this way, AR technology is applied to a specific scene with a continuous storyline, bringing a different experience to the user.
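By way of a non-limiting illustration, the three-step flow above can be sketched in a few lines; the group membership, fragment names, and function names below are hypothetical placeholders rather than part of the disclosure:

```python
# Minimal sketch of the disclosed flow. GROUP, EFFECT_FRAGMENTS, and
# COMPLETE_EFFECT are illustrative placeholders only.
GROUP = {"A", "B", "C"}  # graphic identification codes of one entity object group
EFFECT_FRAGMENTS = {"A": "fragment-M", "B": "fragment-N", "C": "fragment-P"}
COMPLETE_EFFECT = "complete-AR-effect"

def on_code_recognized(code_id: str, history: set) -> list:
    """Return the special effects to display after recognizing one graphic code."""
    shown = []
    if code_id in EFFECT_FRAGMENTS:
        history.add(code_id)
        shown.append(EFFECT_FRAGMENTS[code_id])  # display the matched AR fragment
    if history >= GROUP:  # all entity objects in the group have been recognized
        shown.append(COMPLETE_EFFECT)  # display the complete AR special effect
    return shown
```

Here the recognition history acts as the device's record of previously identified codes; once it covers the whole group, the complete effect is shown in addition to the matched fragment.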
According to the first aspect, in one possible implementation, the complete AR special effect includes a panoramic special effect map corresponding to a target area, and the AR special effect fragment includes a partial special effect map in the panoramic special effect map.
In the embodiment of the disclosure, since the complete AR special effect includes the panoramic special effect image corresponding to the target area, and the AR special effect fragment includes a part of the special effect images in the panoramic special effect image, the user can experience the AR special effects of different entity objects, and can know the target area through the AR special effects, so as to stimulate the experience interest of the user.
According to the first aspect, in a possible implementation manner, after acquiring and displaying the AR special effect fragment matched with the target entity object based on the recognized graphical identification code information, the method further includes:
determining and displaying the collection state information of the AR equipment aiming at the entity object group based on the historical identification record of the AR equipment; wherein the collection status information is used for reflecting the collection completion degree of the entity objects in the entity object group.
In the embodiment of the disclosure, after the target entity object is identified, the collection state information of the AR device for the entity object group is displayed based on the historical identification record of the AR device, so that the user can intuitively know the collection state of the entity object group, which further provides guidance information for the user's next collection.
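A minimal sketch of how collection state information could be derived from the historical identification record; the field names and the completion ratio are assumptions for illustration, not part of the disclosure:

```python
def collection_status(history: set, group: set) -> dict:
    """Derive the collection state of a group from the device's recognition history."""
    collected = history & group
    return {
        "collected": sorted(collected),
        "missing": sorted(group - collected),       # guidance for the next collection
        "completion": len(collected) / len(group),  # collection completion degree
    }
```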
According to the first aspect, in one possible implementation, the AR special effect fragment includes a virtual object and an AR special effect matching the virtual object; the acquiring and displaying of the AR special effect fragment matched with the target entity object based on the identified graphic identification code information includes:
acquiring and displaying a virtual object matched with the target entity object and an AR special effect matched with the virtual object based on the recognized graphic identification code information; different entity objects in the entity object group respectively correspond to different virtual objects, and the different virtual objects have the same second attribute characteristics.
In the embodiment of the disclosure, the AR special effect fragment includes the virtual object and the AR special effect matched with the virtual object, and different virtual objects have the same second attribute characteristics, so that an association relationship exists between different virtual objects, and therefore, a user can bring a corresponding story scene when viewing the collected virtual objects, thereby improving user experience.
According to the first aspect, in a possible implementation manner, the obtaining and displaying a virtual object matched with the target entity object and an AR special effect matched with the virtual object based on the recognized graphical identification code information includes:
acquiring and displaying the virtual object based on the identified graphic identification code information;
responding to the trigger operation aiming at the virtual object, and acquiring and displaying an AR special effect matched with the virtual object, wherein the virtual object and the AR special effect have a preset position relation.
In the embodiment of the disclosure, the virtual object is displayed first, and the AR special effect matched with the virtual object is then displayed in response to the trigger operation for the virtual object; that is, the user's participation is required in the process of displaying the AR special effect, which improves the interestingness of displaying the AR special effect fragment.
According to the first aspect, in one possible implementation, the method further comprises:
responding to the trigger operation aiming at the AR special effect fragment, and displaying the collection time corresponding to the target entity object; the collection time is a time when the AR device recognizes the graphic identification code information.
In the embodiment of the disclosure, displaying the collection time corresponding to the target entity object reminds the user of when each entity object was collected. In addition, the collection time is the time when the AR device recognized the graphic identification code information, that is, the time when the corresponding entity object was obtained, so the AR experience aligns more closely with the real collecting experience, further improving the user experience.
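The collection-time behavior described above (the time is fixed at the moment the graphic identification code is first recognized) can be sketched as follows; the record structure and function names are illustrative assumptions:

```python
from datetime import datetime, timezone

def record_collection(records: dict, code_id: str) -> None:
    """Store the recognition time the first time a graphic code is identified."""
    if code_id not in records:  # keep the original collection time on re-scans
        records[code_id] = datetime.now(timezone.utc)

def collection_time(records: dict, code_id: str):
    """Collection time shown when the user triggers the AR special effect fragment."""
    return records.get(code_id)
```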
According to the first aspect, in one possible implementation, the method further comprises:
displaying the AR virtual medal matched with the current collection state information under the condition that the collection state information meets the preset condition;
and responding to the trigger operation aiming at the AR virtual medal, and displaying the AR special effect matched with the AR virtual medal.
In the embodiment of the disclosure, by displaying the AR virtual medal matched with the current collection state information and the AR special effect matched with the AR virtual medal, the user can be encouraged to finish the collection, the honor feeling is brought to the user, and the interest in the process of collecting the entity object is increased.
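The preset condition for displaying an AR virtual medal is not specified in the disclosure; below is a sketch under the assumption that the condition is a completion-degree threshold, with hypothetical medal names:

```python
# Hypothetical thresholds; the patent only states that a medal is shown when
# the collection state information meets a preset condition.
MEDAL_THRESHOLDS = [(1.0, "gold medal"), (0.5, "silver medal")]

def medal_for(completion: float):
    """Return the AR virtual medal matching the current collection state, if any."""
    for threshold, medal in MEDAL_THRESHOLDS:
        if completion >= threshold:
            return medal
    return None
```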
According to the first aspect, in a possible implementation manner, before the identifying the graphic identification code information of the target entity object, the method further includes:
and responding to the trigger operation aiming at the AR equipment, and controlling the AR equipment to enter an AR mode.
In the embodiment of the present disclosure, since the trigger operation for the AR device needs to be responded, the AR device is controlled to enter the AR mode, which indicates that the AR device has other modes, such as a general mode, and the AR device can realize switching between the general mode and the AR mode, thereby improving the applicability of the AR device.
In one possible implementation, the controlling the AR device to enter the AR mode in response to the trigger operation for the AR device includes:
responding to the trigger operation for acquiring the graphic identification code information, and displaying a loading page on the AR equipment, wherein the loading page is displayed with an AR mode trigger;
and responding to the trigger operation aiming at the AR mode trigger mark, and controlling the AR equipment to enter the AR mode.
In the embodiment of the disclosure, the AR mode trigger is displayed on the loading page, and the AR mode can be entered only after the AR mode trigger is triggered, which can enhance interaction with the user and avoid false triggering caused by situations such as accidental touch.
In a possible implementation according to the first aspect, the AR mode is implemented by a web page or an applet.
In the embodiment of the disclosure, the AR mode is started by means of a web page or an applet, and corresponding software does not need to be additionally installed, which facilitates the user's operation.
In a second aspect, an embodiment of the present disclosure provides an AR interaction apparatus, including:
the entity object identification module is used for identifying the graphic identification code information of the target entity object; the target entity object is any entity object in an entity object group, the entity object group comprises a plurality of entity objects, and the entity objects have the same first attribute characteristics;
the first special effect display module is used for acquiring and displaying the AR special effect fragment matched with the target entity object based on the recognized graphic identification code information; the AR special effect fragment is one special effect fragment in the complete AR special effect corresponding to the entity object group;
and the second special effect display module is used for displaying the complete AR special effect corresponding to the entity object group after determining that the AR equipment identifies the graphic identification code information of all the entity objects in the entity object group.
According to the second aspect, in a possible implementation, the complete AR special effect includes a panoramic special effect map corresponding to a target area, and the AR special effect fragment includes a partial special effect map in the panoramic special effect map.
According to the second aspect, in one possible implementation, the first special effects presentation module is further configured to:
determining and displaying the collection state information of the AR equipment aiming at the entity object group based on the historical identification record of the AR equipment; wherein the collection status information is used for reflecting the collection completion degree of the entity objects in the entity object group.
According to a second aspect, in one possible implementation, the AR special effect fragment comprises a virtual object and an AR special effect matching the virtual object; the entity object identification module is specifically configured to:
acquiring and displaying a virtual object matched with the target entity object and an AR special effect matched with the virtual object based on the recognized graphic identification code information; different entity objects in the entity object group respectively correspond to different virtual objects, and the different virtual objects have the same second attribute characteristics.
According to a second aspect, in a possible implementation, the entity object identification module is specifically configured to:
acquiring and displaying the virtual object based on the identified graphic identification code information;
responding to the trigger operation aiming at the virtual object, and acquiring and displaying an AR special effect matched with the virtual object, wherein the virtual object and the AR special effect have a preset position relation.
According to the second aspect, in one possible implementation, the first special effects presentation module is further configured to:
responding to the trigger operation aiming at the AR special effect fragment, and displaying the collection time corresponding to the target entity object; the collection time is a time when the AR device recognizes the graphic identification code information.
According to the second aspect, in one possible implementation, the first special effects presentation module is further configured to:
displaying the AR virtual medal matched with the current collection state information under the condition that the collection state information meets the preset condition;
and responding to the trigger operation aiming at the AR virtual medal, and displaying the AR special effect matched with the AR virtual medal.
According to a second aspect, in a possible implementation, the apparatus further comprises:
and the AR mode triggering module is used for responding to the triggering operation aiming at the AR equipment and controlling the AR equipment to enter an AR mode.
According to the second aspect, in a possible implementation manner, the AR mode triggering module is specifically configured to:
responding to the trigger operation for acquiring the graphic identification code information, and displaying a loading page on the AR equipment, wherein the loading page is displayed with an AR mode trigger;
and responding to the trigger operation aiming at the AR mode trigger mark, and controlling the AR equipment to enter the AR mode.
According to a second aspect, in one possible implementation, the AR mode is implemented by a web page or applet.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the AR interaction method according to the first aspect.
In a fourth aspect, the disclosed embodiments provide a computer-readable storage medium having a computer program stored thereon, where the computer program is executed by a processor to perform the steps of the AR interaction method according to the first aspect.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required in the embodiments are briefly described below; the drawings are incorporated into and constitute a part of this specification, show embodiments consistent with the present disclosure, and together with the description serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive additional related drawings from them without inventive effort.
Fig. 1 shows a flowchart of a first AR interaction method provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating a set of entity objects provided by an embodiment of the present disclosure;
fig. 3 shows a flowchart of a second AR interaction method provided by an embodiment of the present disclosure;
fig. 4 shows a flowchart of a third AR interaction method provided by the embodiment of the present disclosure;
fig. 5 shows a flowchart of a method for controlling an AR device to enter an AR mode according to an embodiment of the present disclosure;
fig. 6 shows a schematic diagram of a loading page provided by an embodiment of the present disclosure;
Fig. 7 shows a flowchart of a fourth AR interaction method provided by the embodiments of the present disclosure;
fig. 8 is a schematic structural diagram of an AR interaction apparatus provided in an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of another AR interaction apparatus provided in the embodiments of the present disclosure;
fig. 10 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making any creative effort, shall fall within the protection scope of the disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The term "and/or" herein merely describes an associative relationship, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
AR technology, also called augmented reality, integrates real-world information with virtual-world content. Based on computers and other technologies, it simulates entity information that would otherwise be difficult to experience within the spatial range of the real world, and superimposes the virtual information content onto the real world, where it can be effectively applied and perceived by the human senses, realizing a sensory experience beyond reality. After the real environment and the virtual object are superimposed, they can exist simultaneously in the same picture and space.
Research shows that the AR technology has been widely applied to various fields, for example, the AR technology can be applied to game interaction, industrial application, event watching and other scenes to enrich the experience of users in different scenes. However, although the current AR technology is applied in many scenes, the corresponding AR experience is still lacking for some specific scenes with continuous scenes, such as philatelic scenes.
Based on the above research, the present disclosure provides an AR interaction method, which may identify a target entity object in an entity object group, and display an AR special effect fragment matched with the identified target entity object, and may also display the complete AR special effect corresponding to the entity object group after it is determined that an AR device identifies all entity objects in the entity object group, so that an AR technology is applied to a specific scene with continuous plots, and different experiences are brought to a user.
To facilitate understanding of the present embodiment, first, an AR interaction method disclosed in the embodiments of the present disclosure is described in detail, where an execution subject of the AR interaction method provided in the embodiments of the present disclosure is generally an electronic device with certain computing capability, and the electronic device includes, for example: terminal devices or servers or other processing devices, and the terminal devices may include mobile phones, tablet computers, wearable devices, and the like. The server can be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, and can also be a cloud server for providing basic cloud computing services such as cloud service, a cloud database, cloud computing, cloud storage, big data, an artificial intelligence platform and the like. The other processing device may be a device that includes a processor and a memory, and is not limited thereto.
Referring to fig. 1, a flowchart of a first AR interaction method provided in the embodiment of the present disclosure is shown, where the AR interaction method includes the following steps S101 to S103:
s101, identifying graphic identification code information of a target entity object; the target entity object is any entity object in an entity object group, the entity object group comprises a plurality of entity objects, and the entity objects have the same first attribute characteristics.
The first attribute feature refers to a feature that a plurality of entity objects belong to the same category and a preset association relationship exists between the entity objects.
For example, the set of physical objects may include a set of postal items. Postal items include not only stamps but also covers, cards, postmarks, and other items issued and used by postal authorities. Postal items take many forms, including first-day covers, international postage airmail envelopes, commemorative covers, maximum cards, postal cards, stamped envelopes, full sheets, booklet panes, souvenir sheets, commemorative postmarks, posters, and the like.
Also for example, the set of physical objects may include a set of postcards. A postcard is a card bearing an image and written text content that can be posted directly without an envelope. Postcards may be issued by different organizations; they are typically issued by a postal organization, but are not limited to any particular issuing organization.
Illustratively, the graphic identification code refers to an identification code for characterizing identity information of the entity object, that is, the graphic identification code may be used to distinguish between different entity objects, for example, the graphic identification code may include a two-dimensional code or a barcode, and is not limited herein.
A two-dimensional code, also called a two-dimensional bar code, is a black-and-white pattern in which specific geometric figures are distributed on a plane (in two dimensions) according to a certain rule to record data symbol information. A bar code (barcode) is a graphic identifier in which multiple black bars and spaces of different widths are arranged according to a certain coding rule to express a set of information.
Referring to fig. 2, which shows an exemplary schematic diagram of an entity object group provided in the present disclosure, the entity object group 10 includes a group of stamps with a story line (e.g., the "great moon palace"), each entity object 11 is a stamp, and each entity object 11 carries a graphic identification code 12; in this example, the graphic identification code 12 is a two-dimensional code. A camera of the electronic device can photograph the two-dimensional code to acquire its image information, and the two-dimensional code information is then identified.
S102, acquiring and displaying an AR special effect fragment matched with the target entity object based on the identified graphic identification code information; the AR special effect fragment is one of the complete AR special effects corresponding to the entity object group.
It is to be understood that the correspondence between different entity objects and different AR special effect fragments may be pre-established, for example, as shown in table 1 below, a correspondence table between entity objects and AR special effect fragments may be pre-established.
TABLE 1
| Entity object name | AR special effect fragment name |
| --- | --- |
| A entity object | M-AR special effect fragment |
| B entity object | N-AR special effect fragment |
| … | … |
| C entity object | P-AR special effect fragment |
Therefore, after the target entity object corresponding to the current graphic identification code is determined based on the identified graphic identification code information, the AR special effect fragment corresponding to the target entity object can be obtained through Table 1. For example, if the target entity object is determined to be entity object A after the graphic identification code information is identified, the M-AR special effect fragment can be obtained and displayed through Table 1.
It can be understood that the complete AR special effect is a fusion special effect of the AR special effect fragments corresponding to each entity object. For example, the M-AR special effect fragment, the N-AR special effect fragment, and the P-AR special effect fragment may constitute a complete AR special effect corresponding to the entity object group, where the entity object group includes an a entity object, a B entity object, and a C entity object.
In some embodiments, taking the entity object group 10 in fig. 2 as an example, the AR special effect segment corresponding to each entity object 11 is an AR segment of the "great moon palace" animation, and the complete AR special effect corresponding to the entity object group 10 is the complete AR special effect of that animation. Similarly, if the physical object group is a set of "gipypower" animated stamps, the AR special effect segment corresponding to each physical object is an AR segment of the "gipypower" animation, and the complete AR special effect corresponding to the physical object group is the complete AR special effect of that animation.
In some further embodiments, the complete AR special effect includes a panoramic special effect map corresponding to the target area, and the AR special effect segment includes a partial special effect map in the panoramic special effect map. Wherein the target area may be a target scenic spot, a target museum, etc.
Illustratively, the AR special effect fragment may include a virtual object and an AR special effect matched with the virtual object. In that case, step S102 may specifically include: acquiring and displaying, based on the recognized graphic identification code information, a virtual object matched with the target entity object and an AR special effect matched with that virtual object. Different entity objects in the entity object group correspond to different virtual objects, and the different virtual objects have the same second attribute features.
The virtual object refers to virtual information generated by computer simulation. It may be a virtual three-dimensional object, such as a virtual animal, a virtual plant, or another virtual object, or a virtual planar object, such as a virtual arrow, virtual characters, or a virtual picture.
In this embodiment, the virtual object is, for example, an animated virtual object, so that the display form of the virtual object corresponding to the target entity object is richer and more vivid, improving the user's visual experience.
Illustratively, the different virtual objects having the same second attribute feature means that the different virtual objects share the same subject and have a continuity relationship in a preset order. The continuity may be continuity of the display picture or continuity of the story line, and is not limited herein.
In some embodiments, presenting the virtual object matched with the target entity object and the AR special effect matched with the virtual object may include:
(1) Acquiring and displaying the virtual object based on the recognized graphic identification code information.
(2) In response to a trigger operation for the virtual object, acquiring and displaying an AR special effect matched with the virtual object, where the virtual object and the AR special effect have a preset positional relationship.
The preset positional relationship may be set according to specific requirements. For example, while the AR special effect matched with the virtual object is displayed, the virtual object may be displayed at an edge of the AR picture; that is, both the AR special effect and the virtual object are displayed on the AR picture, with the virtual object displayed to one side of the AR special effect. The AR picture refers to the picture presented on the screen of the AR device.
Illustratively, the virtual object may be displayed first, and the AR special effect matched with the virtual object is then displayed in response to a trigger operation for the virtual object. That is, the user's participation is required in the process of displaying the AR special effect, which makes displaying the AR special effect segment more engaging.
Of course, in other embodiments, the AR special effect matched with the virtual object may also be displayed automatically after the virtual object has been displayed for a preset time, without the user applying a trigger operation.
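The two display variants above (reveal on a user trigger operation, or automatically after a preset time) can be sketched with a single condition. This is an assumed simplification; the function name and parameters are illustrative, not part of the disclosed method:

```python
def display_effect(user_triggered: bool, elapsed_seconds: float,
                   preset_time: float = 3.0) -> bool:
    """Return True when the AR special effect matched with the virtual
    object should be shown: either the user triggered the virtual object,
    or the virtual object has been displayed for at least preset_time."""
    return user_triggered or elapsed_seconds >= preset_time

print(display_effect(user_triggered=True, elapsed_seconds=0.0))   # True
print(display_effect(user_triggered=False, elapsed_seconds=5.0))  # True
print(display_effect(user_triggered=False, elapsed_seconds=1.0))  # False
```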
S103, after determining that the AR device has recognized the graphic identification code information of all the entity objects in the entity object group, displaying the complete AR special effect corresponding to the entity object group.
The AR device may specifically include a smart phone, a tablet computer, AR glasses, and the like; that is, the AR device may be a terminal device among the aforementioned electronic devices with a certain computing capability. The AR device may have a built-in image acquisition component or be externally connected to one, and after the AR device enters the working state, real-scene images can be captured in real time through the image acquisition component.
In this embodiment, the AR device may be a single AR device or a plurality of different AR devices. In the case of a plurality of AR devices, the user subjects of the different AR devices are the same; for example, the same user account is logged in on the different devices.
It can be understood that once the AR device has recognized the graphic identification code information of all the entity objects in the entity object group, all the entity objects in the group have been recognized. Taking stamps as an example of the entity objects, after a group of stamps has been recognized, the whole group has been successfully collected and the stamp-collecting process is complete, so the complete AR special effect can be displayed, achieving an effect consistent with the real stamp-collecting experience.
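A hedged sketch of step S103, under the assumption that each recognized graphic identification code can be reduced to the identity of its entity object: track the scanned objects in a set, and unlock the complete AR special effect only when the set covers the whole group. The class and method names are illustrative:

```python
class StampCollection:
    """Tracks which entity objects in a group have been recognized."""

    def __init__(self, group: set[str]):
        self.group = set(group)  # all entity objects in the entity object group
        self.recognized = set()  # objects whose identification codes were scanned

    def recognize(self, entity_object: str) -> bool:
        """Record a scanned object; return True when every object in the
        group has been recognized, i.e. the complete AR effect may be shown."""
        if entity_object in self.group:
            self.recognized.add(entity_object)
        return self.recognized == self.group

collection = StampCollection({"A", "B", "C"})
collection.recognize("A")
collection.recognize("B")
print(collection.recognize("C"))  # True -> display the complete AR special effect
```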
Referring to fig. 3, a flowchart of a second AR interaction method provided in an embodiment of the present disclosure is shown. It differs from the AR interaction method shown in fig. 1 in that it further includes the following S104:
S104, in response to a trigger operation for the AR special effect fragment, displaying the collection time corresponding to the target entity object, where the collection time is the time at which the AR device recognized the graphic identification code information.
It can be understood that once the graphic identification code information of the target entity object has been recognized, the target entity object has been successfully collected, so the AR special effect fragment matched with the target entity object is stored for the user to view later. Since different entity objects are collected at different times, the collection time corresponding to the target entity object may be displayed in response to a trigger operation for the AR special effect fragment, reminding the user when each entity object was collected. Moreover, because the collection time is the time at which the AR device recognized the graphic identification code information, that is, the time at which the corresponding entity object was obtained, the AR experience is more immersive and realistic, further improving the user experience.
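The timestamp bookkeeping for S104 might look like the following sketch (an assumption about structure, not the disclosed implementation; names are hypothetical): record the wall-clock time when a code is recognized, and format it when the user triggers the stored fragment.

```python
import datetime

# Maps each collected entity object to the moment its code was recognized.
collection_times: dict[str, datetime.datetime] = {}

def on_code_recognized(entity_object: str) -> None:
    """The collection time is the time the AR device recognized the code."""
    collection_times[entity_object] = datetime.datetime.now()

def on_fragment_triggered(entity_object: str) -> str:
    """In response to a trigger operation for the AR special effect fragment,
    return the collection time to be displayed."""
    t = collection_times[entity_object]
    return t.strftime("Collected on %Y-%m-%d %H:%M")
```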
Referring to fig. 4, a flowchart of a third AR interaction method provided in an embodiment of the present disclosure is shown. It differs from the AR interaction method shown in fig. 1 in that it further includes the following S100:
S100, in response to a trigger operation for the AR device, controlling the AR device to enter an AR mode.
It is to be understood that although an AR device refers to a device having an AR function, this does not mean the AR device has only the AR function. To improve its applicability, the AR device may also generally have the same functions as a general electronic device, where a general electronic device refers to an electronic device without an AR function. It is therefore desirable for the AR device to be able to switch between a general mode and the AR mode: in the AR mode the AR device can implement the AR function, and in the general mode it cannot.
For example, referring to fig. 5, controlling the AR device to enter the AR mode in response to the trigger operation for the AR device may include the following steps S1001 to S1002:
S1001, in response to a trigger operation for acquiring the graphic identification code information, displaying a loading page on the AR device, where an AR mode trigger identifier is displayed on the loading page.
S1002, in response to a trigger operation for the AR mode trigger identifier, controlling the AR device to enter the AR mode.
Illustratively, referring to fig. 6, a schematic diagram of a loading page provided by an embodiment of the present disclosure is shown. After the user scans the two-dimensional code on the target entity object with the AR device, the loading page 20 is entered. The AR mode trigger identifier 21 is displayed on the loading page 20, and when the user applies a trigger operation (such as a long-press operation) to the AR mode trigger identifier 21, the AR device enters the AR mode. In addition, the loading page 20 may also display a next-operation prompt icon 22 prompting the user to scan the graphic identification code on the target entity object again with the AR device in the AR mode. This provides guidance for the user's actual operation and brings convenience to the user. In this embodiment, the loading page is an H5 page, and the AR mode trigger identifier 21 is a two-dimensional code.
It will be appreciated that in some embodiments, the AR mode may be implemented by a web page or applet.
The web page (web) is a network service established on the Internet, and provides a graphical and easily-accessible visual interface for a browser to search and browse information on the Internet.
A mini program (also called a web program) is a program developed in a front-end language (e.g., JavaScript) that implements services within HyperText Markup Language (HTML) pages. It is software downloaded by a client (e.g., any client with an embedded browser core) via a network (e.g., the Internet) and interpreted and executed in the browser environment of the client, saving the step of installation on the client. For example, a mini program implementing the AR mode may be downloaded and run in a social network client.
Referring to fig. 7, a fourth AR interaction method is also provided in the embodiments of the present disclosure. The AR interaction method comprises the following steps S201 to S206:
S201, recognizing the graphic identification code information of the target entity object, where the target entity object is any entity object in an entity object group, the entity object group includes a plurality of entity objects, and the entity objects have the same first attribute features.
This step is similar to the step S101, and is not described herein again.
S202, acquiring and displaying an AR special effect fragment matched with the target entity object based on the recognized graphic identification code information, where the AR special effect fragment is one special effect fragment in the complete AR special effect corresponding to the entity object group.
This step is similar to the step S102, and is not described herein again.
S203, determining and displaying the collection status information of the AR device for the entity object group based on the historical recognition record of the AR device, where the collection status information reflects the degree of completion of collecting the entity objects in the entity object group.
For example, the number of entity objects that have already been collected may be determined from the historical recognition record of the AR device, and the collection status information for the entity object group may then be determined from the total number of objects in the group. For instance, if there are 10 entity objects in the entity object group and the historical recognition record shows that 8 have currently been collected, 2 more entity objects are needed to complete the collection; at this point, corresponding collection status information may be displayed to inform the user of the current collection progress.
For example, the collection status information may include a progress indication bar, similar to the remaining-power display bar of the AR device, indicating the ratio of the entity objects currently collected by the user to the entity object group. A prompt message such as "Only one step away from a successful collection!" may be displayed alongside the progress indication bar to interact with the user and improve the collection experience.
For another example, the collection status information may also be presented in the form of identifier maps; that is, each time an entity object is collected, the identifier map corresponding to that entity object may be presented, so that the user can see at a glance which entity objects have been collected and which remain uncollected.
It can be understood that the content and presentation form of the collection status information described above are merely examples. In other embodiments, the collection status information may include more or less content, or may be presented in other forms; the embodiments of the present disclosure impose no particular limitation, as long as the current collection status of the entity object group can be represented.
S204, when the collection status information satisfies a preset condition, displaying the AR virtual medal matched with the current collection status information.
In some embodiments, to encourage the user and make collecting the entity objects more engaging, an AR virtual medal matched with the current collection status information may be displayed when the collection status information satisfies a preset condition.
The preset condition may be that the number of entity objects collected by the user reaches a preset number within the entity object group; the preset number may be half or one third of the total number of entity objects in the group, which is not limited herein. In addition, the display form of the virtual medal is not limited; for example, the virtual medal may be an image resembling a physical medal, or may be text information such as "philatelic heir".
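Steps S203 and S204 can be sketched together as follows. This is a minimal illustration under stated assumptions: the historical recognition record is modeled as a set of object names, and the half-of-the-group threshold is one of the example preset conditions above, not a fixed requirement:

```python
def collection_status(history: set[str], group: set[str]) -> tuple[int, int, float]:
    """Derive (collected, total, ratio) for the progress indication bar
    from the device's historical recognition record."""
    collected = len(history & group)
    total = len(group)
    return collected, total, collected / total

def medal_earned(history: set[str], group: set[str], threshold: float = 0.5) -> bool:
    """Preset condition (assumed here): at least half of the entity
    objects in the group have been collected."""
    _, _, ratio = collection_status(history, group)
    return ratio >= threshold

print(collection_status({"A", "B"}, {"A", "B", "C", "D"}))  # (2, 4, 0.5)
print(medal_earned({"A", "B"}, {"A", "B", "C", "D"}))       # True
```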
S205, responding to the trigger operation aiming at the AR virtual medal, and displaying the AR special effect matched with the AR virtual medal.
Illustratively, to make the collection process more engaging and interactive, the AR special effect matched with the AR virtual medal is displayed when the user triggers the AR virtual medal. The AR special effect matched with the AR virtual medal may be preset according to requirements and is not limited herein. In addition, the acquisition time of the AR virtual medal may also be displayed.
It should be noted that each trigger operation in the embodiments of the present disclosure may be a single-click operation, a double-click operation, a sliding operation, or the like on the display screen of the AR device, and is not specifically limited herein as long as the corresponding trigger function can be implemented.
S206, after determining that the AR device has recognized the graphic identification code information of all the entity objects in the entity object group, displaying the complete AR special effect corresponding to the entity object group.
This step is similar to the step S103, and is not described herein again.
In the AR interaction method of the embodiments of the present disclosure, a target entity object in an entity object group can be recognized and the AR special effect segment matched with it displayed. Moreover, after it is determined that the AR device has recognized all the entity objects in the entity object group, the complete AR special effect corresponding to the group can be displayed. The AR technology is thereby applied to a specific scenario with continuous plots, bringing the user a different experience.
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible inherent logic.
Based on the same technical concept, an augmented reality (AR) interaction apparatus corresponding to the AR interaction method is further provided in the embodiments of the present disclosure. Since the principle by which the apparatus solves the problem is similar to that of the AR interaction method described above, the implementation of the apparatus may refer to the implementation of the method, and repeated details are omitted.
Referring to fig. 8, a schematic diagram of an AR interaction apparatus 500 provided in an embodiment of the present disclosure is shown, where the AR interaction apparatus includes:
an entity object recognition module 501, configured to recognize graphic identification code information of a target entity object; the target entity object is any entity object in an entity object group, the entity object group comprises a plurality of entity objects, and the entity objects have the same first attribute characteristics;
a first special effect display module 502, configured to obtain and display an AR special effect fragment matched with the target entity object based on the recognized graphic identification code information, where the AR special effect fragment is one special effect fragment in the complete AR special effect corresponding to the entity object group;
a second special effect displaying module 503, configured to display the complete AR special effect corresponding to the entity object group after determining that the AR device identifies the graphic identification code information of all entity objects in the entity object group.
In one possible implementation, the complete AR special effect includes a panoramic special effect map corresponding to the target area, and the AR special effect segment includes a partial special effect map in the panoramic special effect map.
In a possible implementation, the first special effects presentation module 502 is further configured to:
determining and displaying the collection state information of the AR equipment aiming at the entity object group based on the historical identification record of the AR equipment; wherein the collection status information is used for reflecting the collection completion degree of the entity objects in the entity object group.
In one possible embodiment, the AR special effect fragment includes a virtual object and an AR special effect matching the virtual object; the entity object identifying module 501 is specifically configured to:
acquiring and displaying a virtual object matched with the target entity object and an AR special effect matched with the virtual object based on the recognized graphic identification code information; different entity objects in the entity object group respectively correspond to different virtual objects, and the different virtual objects have the same second attribute characteristics.
In a possible implementation manner, the entity object identification module 501 is specifically configured to:
acquiring and displaying the virtual object based on the identified graphic identification code information;
responding to the trigger operation aiming at the virtual object, and acquiring and displaying an AR special effect matched with the virtual object, wherein the virtual object and the AR special effect have a preset position relation.
In a possible implementation, the first special effects presentation module 502 is further configured to:
responding to the trigger operation aiming at the AR special effect fragment, and displaying the collection time corresponding to the target entity object; the collection time is a time when the AR device recognizes the graphic identification code information.
In a possible implementation, the first special effects presentation module 502 is further configured to:
displaying the AR virtual medal matched with the current collection state information under the condition that the collection state information meets the preset condition;
and responding to the trigger operation aiming at the AR virtual medal, and displaying the AR special effect matched with the AR virtual medal.
Referring to fig. 9, in a possible embodiment, the apparatus further comprises:
an AR mode triggering module 504, configured to respond to a triggering operation for the AR device, and control the AR device to enter an AR mode.
In a possible implementation manner, the AR mode triggering module 504 is specifically configured to:
responding to the trigger operation for acquiring the graphic identification code information, and displaying a loading page on the AR equipment, wherein the loading page is displayed with an AR mode trigger;
and responding to the trigger operation aiming at the AR mode trigger mark, and controlling the AR equipment to enter the AR mode.
In one possible implementation, the AR mode is implemented by a web page or applet.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Based on the same technical concept, an embodiment of the present disclosure further provides an electronic device. Referring to fig. 10, a schematic structural diagram of an electronic device 700 provided in an embodiment of the present disclosure is shown, including a processor 701, a memory 702, and a bus 703. The memory 702, which is used for storing execution instructions, includes an internal memory 7021 and an external memory 7022; the internal memory 7021 temporarily stores operation data of the processor 701 and data exchanged with the external memory 7022, such as a hard disk, and the processor 701 exchanges data with the external memory 7022 via the internal memory 7021.
In this embodiment, the memory 702 is specifically configured to store application program codes for executing the scheme of the present application, and the processor 701 controls the execution. That is, when the electronic device 700 is operated, the processor 701 and the memory 702 communicate with each other through the bus 703, so that the processor 701 executes the application program code stored in the memory 702, thereby executing the method described in any of the foregoing embodiments.
The memory 702 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 701 may be an integrated circuit chip having signal processing capabilities. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), or another programmable logic device, discrete gate or transistor logic device, or discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the present disclosure may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 700. In other embodiments of the present application, the electronic device 700 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The embodiments of the present disclosure also provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the AR interaction method in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The embodiments of the present disclosure also provide a computer program product, where the computer program product carries a program code, and instructions included in the program code may be used to execute steps of the AR interaction method in the foregoing method embodiments, which may be referred to specifically for the foregoing method embodiments, and are not described herein again.
The computer program product may be implemented by hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium, and in another alternative embodiment, the computer program product is embodied in a Software product, such as a Software Development Kit (SDK), or the like.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not described again here.

In the several embodiments provided in this disclosure, it should be understood that the disclosed systems, devices, and methods may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative; for example, the division of the units is only one type of division of logical functions, and there may be other divisions in actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical, or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may also be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
Finally, it should be noted that the above embodiments are merely specific implementations of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art can still modify the technical solutions described in the foregoing embodiments, or easily conceive of changes, or make equivalent substitutions of some technical features within the technical scope of the present disclosure; such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
Claims (13)
1. An Augmented Reality (AR) interaction method, comprising:
identifying graphic identification code information of the target entity object; the target entity object is any entity object in an entity object group, the entity object group comprises a plurality of entity objects, and the entity objects have the same first attribute characteristics;
acquiring and displaying an AR special effect fragment matched with the target entity object based on the recognized graphic identification code information; the AR special effect fragment is one special effect fragment in the complete AR special effect corresponding to the entity object group;
and after determining that the AR equipment identifies the graphic identification code information of all the entity objects in the entity object group, displaying the complete AR special effect corresponding to the entity object group.
2. The method of claim 1, wherein the complete AR special effect comprises a panoramic special effect map corresponding to a target area, and wherein the AR special effect segment comprises a partial special effect map in the panoramic special effect map.
3. The method according to claim 1 or 2, wherein after the obtaining and displaying the AR special effect fragment matched with the target entity object based on the recognized graphic identification code information, the method further comprises:
determining and displaying the collection state information of the AR equipment aiming at the entity object group based on the historical identification record of the AR equipment; wherein the collection status information is used for reflecting the collection completion degree of the entity objects in the entity object group.
4. The method of any of claims 1-3, wherein the AR special effect fragment comprises a virtual object and an AR special effect that matches the virtual object; the acquiring and displaying of the AR special effect fragment matched with the target entity object based on the recognized graphic identification code information comprises the following steps:
acquiring and displaying a virtual object matched with the target entity object and an AR special effect matched with the virtual object based on the recognized graphic identification code information; different entity objects in the entity object group respectively correspond to different virtual objects, and the different virtual objects have the same second attribute characteristics.
5. The method of claim 4, wherein the obtaining and displaying the virtual object matched with the target entity object and the AR special effect matched with the virtual object based on the recognized graphic identification code information comprises:
acquiring and displaying the virtual object based on the identified graphic identification code information;
responding to the trigger operation aiming at the virtual object, and acquiring and displaying an AR special effect matched with the virtual object, wherein the virtual object and the AR special effect have a preset position relation.
6. The method according to any one of claims 1-5, further comprising:
responding to the trigger operation aiming at the AR special effect fragment, and displaying the collection time corresponding to the target entity object; the collection time is a time when the AR device recognizes the graphic identification code information.
7. The method according to any one of claims 3-6, further comprising:
displaying an AR virtual medal matched with the current collection status information when the collection status information meets a preset condition; and
in response to a trigger operation on the AR virtual medal, displaying an AR special effect matched with the AR virtual medal.
8. The method according to any one of claims 1 to 7, wherein before the identifying the graphic identification code information of the target entity object, the method further comprises:
in response to a trigger operation on the AR device, controlling the AR device to enter an AR mode.
9. The method of claim 8, wherein the controlling the AR device to enter the AR mode in response to the trigger operation on the AR device comprises:
in response to a trigger operation for acquiring the graphic identification code information, displaying a loading page on the AR device, wherein an AR-mode trigger identifier is displayed on the loading page; and
in response to a trigger operation on the AR-mode trigger identifier, controlling the AR device to enter the AR mode.
10. The method according to claim 8 or 9, wherein the AR mode is implemented by a web page or an applet.
11. An AR interaction apparatus, comprising:
an entity object identification module, configured to identify graphic identification code information of a target entity object; wherein the target entity object is any entity object in an entity object group, the entity object group comprises a plurality of entity objects, and the plurality of entity objects have the same first attribute characteristic;
a first special effect display module, configured to acquire and display an AR special effect fragment matched with the target entity object based on the recognized graphic identification code information; wherein the AR special effect fragment is one special effect fragment of a complete AR special effect corresponding to the entity object group; and
a second special effect display module, configured to display the complete AR special effect corresponding to the entity object group after determining that the AR device has recognized the graphic identification code information of all the entity objects in the entity object group.
12. An electronic device, comprising: a processor, a memory and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate via the bus; and the machine-readable instructions, when executed by the processor, perform the steps of the AR interaction method of any one of claims 1-10.
13. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the AR interaction method of any one of claims 1-10.
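The collection mechanic claimed above (scan a graphic identification code on one entity object of a group, display the matching AR special-effect fragment, record the collection time, and display the complete AR special effect once every object in the group has been recognized) can be sketched roughly as follows. This is an illustrative sketch only; every identifier here is an assumption for demonstration and is not part of the patent.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class EntityGroup:
    """One group of entity objects, each carrying a graphic identification code."""
    group_id: str
    code_to_entity: dict[str, str]  # graphic code -> entity object id
    scan_times: dict[str, datetime] = field(default_factory=dict)  # historical record

    def scan(self, code: str) -> str:
        """Record a recognized code and return which display to trigger."""
        entity = self.code_to_entity.get(code)
        if entity is None:
            return "ignore"  # code does not belong to this group
        # Historical identification record: remember the first collection time
        self.scan_times.setdefault(entity, datetime.now())
        if len(self.scan_times) == len(self.code_to_entity):
            return "complete_ar_effect"  # all entity objects recognized
        return f"fragment:{entity}"  # one fragment of the complete effect

    def collection_status(self) -> float:
        """Collection completion degree for the group, in [0, 1]."""
        return len(self.scan_times) / len(self.code_to_entity)
```

For example, with a two-object group, scanning the first code yields a fragment and a 0.5 completion degree, and scanning the second yields the complete effect.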
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110736104.9A CN113470186A (en) | 2021-06-30 | 2021-06-30 | AR interaction method and device, electronic equipment and storage medium |
PCT/CN2022/085956 WO2023273501A1 (en) | 2021-06-30 | 2022-04-08 | Ar interaction method and apparatus, and electronic device, medium and program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110736104.9A CN113470186A (en) | 2021-06-30 | 2021-06-30 | AR interaction method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113470186A true CN113470186A (en) | 2021-10-01 |
Family
ID=77876454
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110736104.9A Withdrawn CN113470186A (en) | 2021-06-30 | 2021-06-30 | AR interaction method and device, electronic equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113470186A (en) |
WO (1) | WO2023273501A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023273501A1 (en) * | 2021-06-30 | 2023-01-05 | 上海商汤智能科技有限公司 | Ar interaction method and apparatus, and electronic device, medium and program |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109213728A (en) * | 2017-06-29 | 2019-01-15 | 深圳市掌网科技股份有限公司 | Cultural relic exhibition method and system based on augmented reality |
CN109376776A (en) * | 2018-10-15 | 2019-02-22 | 百度在线网络技术(北京)有限公司 | Method and apparatus for playing music |
CN110716645A (en) * | 2019-10-15 | 2020-01-21 | 北京市商汤科技开发有限公司 | Augmented reality data presentation method and device, electronic equipment and storage medium |
CN112348969A (en) * | 2020-11-06 | 2021-02-09 | 北京市商汤科技开发有限公司 | Display method and device in augmented reality scene, electronic equipment and storage medium |
US20210150449A1 (en) * | 2019-11-18 | 2021-05-20 | Monday.Com | Digital processing systems and methods for dynamic object display of tabular information in collaborative work systems |
CN112884528A (en) * | 2021-03-23 | 2021-06-01 | 腾讯科技(深圳)有限公司 | Interactive processing method based on radio frequency identification and related device |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN205185621U (en) * | 2015-11-27 | 2016-04-27 | 河南连横信息技术有限公司 | Intelligence stamped postcard and intelligent stamped postcard suit |
CN106888203B (en) * | 2016-12-13 | 2020-03-24 | 阿里巴巴集团控股有限公司 | Virtual object distribution method and device based on augmented reality |
CN109167936A (en) * | 2018-10-29 | 2019-01-08 | Oppo广东移动通信有限公司 | A kind of image processing method, terminal and storage medium |
US11094114B2 (en) * | 2019-02-08 | 2021-08-17 | Ursa Space Systems Inc. | Satellite SAR artifact suppression for enhanced three-dimensional feature extraction, change detection, and visualizations |
CN111062704A (en) * | 2019-12-10 | 2020-04-24 | 支付宝(杭州)信息技术有限公司 | Method and device for identifying graphic code |
CN113470186A (en) * | 2021-06-30 | 2021-10-01 | 北京市商汤科技开发有限公司 | AR interaction method and device, electronic equipment and storage medium |
CN113470187A (en) * | 2021-06-30 | 2021-10-01 | 北京市商汤科技开发有限公司 | AR collection method, terminal, device and storage medium |
CN113345110A (en) * | 2021-06-30 | 2021-09-03 | 北京市商汤科技开发有限公司 | Special effect display method and device, electronic equipment and storage medium |
2021
- 2021-06-30: CN application CN202110736104.9A filed (publication CN113470186A/en, not active, withdrawn)

2022
- 2022-04-08: WO application PCT/CN2022/085956 filed (publication WO2023273501A1/en, status unknown)
Non-Patent Citations (2)
Title |
---|
CHUN-HSIUNG LEE et al.: "What drives stickiness in location-based AR games? An examination of flow and satisfaction", Telematics and Informatics, vol. 35, no. 07, pages 1958 - 1970 *
ZHANG Yongsheng et al.: "Development and Implementation of AR-based Interactive 3D E-books", Journal of Qiqihar University (Natural Science Edition), vol. 32, no. 02, pages 60 - 63 *
Also Published As
Publication number | Publication date |
---|---|
WO2023273501A1 (en) | 2023-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9805511B2 (en) | Interacting with data fields on a page using augmented reality | |
US10204216B2 (en) | Verification methods and verification devices | |
US20160321814A1 (en) | Information processing method and system | |
CN111914775B (en) | Living body detection method, living body detection device, electronic equipment and storage medium | |
CN112288883B (en) | Method and device for prompting operation guide information, electronic equipment and storage medium | |
CN113282687A (en) | Data display method and device, computer equipment and storage medium | |
CN108805577B (en) | Information processing method, device, system, computer equipment and storage medium | |
CN113961794A (en) | Book recommendation method and device, computer equipment and storage medium | |
CN107220291A (en) | The method and system of the anti-crawl of web data | |
CN111666014B (en) | Message pushing method, device, equipment and computer readable storage medium | |
CN111652983A (en) | Augmented reality AR special effect generation method, device and equipment | |
CN112905014A (en) | Interaction method and device in AR scene, electronic equipment and storage medium | |
CN114153548A (en) | Display method and device, computer equipment and storage medium | |
CN113470186A (en) | AR interaction method and device, electronic equipment and storage medium | |
WO2022252518A1 (en) | Data presentation method and apparatus, and computer device, storage medium and computer program product | |
KR102234172B1 (en) | Apparatus and method for providing digital twin book shelf | |
CN113326709B (en) | Display method, device, equipment and computer readable storage medium | |
CN112991555B (en) | Data display method, device, equipment and storage medium | |
CN111744197B (en) | Data processing method, device and equipment and readable storage medium | |
Ouali et al. | Real-time application for recognition and visualization of Arabic words with vowels based on DL and AR | |
CN108170838B (en) | Topic evolution visualization display method, application server and computer readable storage medium | |
CN113345110A (en) | Special effect display method and device, electronic equipment and storage medium | |
CN116563503A (en) | Augmented reality-based display processing method, device, equipment and storage medium | |
CN114049467A (en) | Display method, display device, display apparatus, storage medium, and program product | |
CN111986332A (en) | Method and device for displaying message board, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40051300; Country of ref document: HK |
WW01 | Invention patent application withdrawn after publication | Application publication date: 20211001 |