CN113359986A - Augmented reality data display method and device, electronic equipment and storage medium - Google Patents

Augmented reality data display method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN113359986A
CN113359986A (application CN202110620483.5A; granted publication CN113359986B)
Authority
CN
China
Prior art keywords
special effect
equipment
target
data
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110620483.5A
Other languages
Chinese (zh)
Other versions
CN113359986B (en)
Inventor
田真
李斌
欧华富
王婷婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN202110620483.5A priority Critical patent/CN113359986B/en
Publication of CN113359986A publication Critical patent/CN113359986A/en
Application granted granted Critical
Publication of CN113359986B publication Critical patent/CN113359986B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/44Browsing; Visualisation therefor
    • G06F16/444Spatial browsing, e.g. 2D maps, 3D or virtual spaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/487Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location

Abstract

The present disclosure provides an augmented reality data display method and apparatus, an electronic device, and a storage medium. The method includes: in response to an augmented reality (AR) device scanning a target information code on a bottle body, determining a target exhibition area to be visited; acquiring positioning information of the AR device; and, when it is determined based on the positioning information that the AR device has reached an explanation area corresponding to any preset knowledge point in the target exhibition area, playing audio explanation data matched with that preset knowledge point through the AR device and displaying first AR special effect data matched with the audio explanation data.

Description

Augmented reality data display method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of augmented reality technologies, and in particular, to a method and an apparatus for displaying augmented reality data, an electronic device, and a storage medium.
Background
Augmented Reality (AR) technology skillfully integrates virtual information with the real world, drawing on techniques such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, and sensing. Computer-generated virtual information such as text, images, three-dimensional models, music, and video is simulated and then applied to the real world, where the two kinds of information complement each other, thereby augmenting the real world. Providing an effective AR data display method is therefore increasingly important.
Disclosure of Invention
In view of the above, the present disclosure at least provides an augmented reality data display method, an augmented reality data display apparatus, an electronic device and a storage medium.
In a first aspect, the present disclosure provides an augmented reality data display method, including:
in response to an augmented reality (AR) device scanning a target information code on a bottle body, determining a target exhibition area to be visited;
acquiring positioning information of the AR device;
and, when it is determined based on the positioning information that the AR device has reached an explanation area corresponding to any preset knowledge point in the target exhibition area, playing audio explanation data matched with that preset knowledge point through the AR device and displaying first AR special effect data matched with the audio explanation data.
In the above method, the target exhibition area to be visited is determined in response to the AR device scanning the target information code on the bottle body. Because a bottle is a common object in almost any scene, placing the target information code on a bottle combines the code with an everyday object, allowing the AR device to enter the AR navigation process conveniently and quickly and improving the efficiency of AR data display.
Meanwhile, the AR device is controlled to play the audio explanation data corresponding to the preset knowledge point and, while the audio plays, to display the first AR special effect data matched with it. The first AR special effect data thus assists the audio explanation, making the explanation of the preset knowledge point clear and intuitive and improving its efficiency.
In one possible embodiment, determining the target exhibition area to be visited in response to the augmented reality AR device scanning the target information code on the bottle body includes:
in response to the AR device scanning the target information code on the bottle body, determining the target exhibition area to be visited and a destination location of the AR device within the target exhibition area;
the method further includes:
generating navigation guidance information from the current location to the destination location based on the destination location and the current location of the AR device;
and displaying the navigation guidance information through the AR device.
Here, the navigation guidance information from the current location to the destination location may be generated based on the determined destination location within the target exhibition area and the current location of the AR device, and the navigation guidance information may be displayed by the AR device, so that the AR device may reach the destination location according to the navigation guidance information, improving the diversity and flexibility of AR data display.
In one possible embodiment, determining the target exhibition area to be visited and the destination location of the AR device within the target exhibition area in response to the AR device scanning the target information code on the bottle body includes:
in response to the AR device scanning the target information code on the bottle body, displaying resource pushing information through the AR device;
and, in response to a trigger operation for the resource pushing information, determining the destination location, corresponding to the resource pushing information, within the target exhibition area to be visited; the destination location is the location for picking up the target resource corresponding to the resource pushing information.
In this embodiment, in response to the AR device scanning the target information code on the bottle body, resource pushing information (for example, prize-winning information) can be displayed through the AR device, making the display of information relatively flexible.
In one possible embodiment, the audio interpretation data corresponds to AR special effect data having a plurality of styles, and the method further includes:
and determining first AR special effect data matched with the audio explanation data from multiple types of AR special effect data corresponding to the audio explanation data according to the historical record information of the AR equipment and/or the user information of the target user to which the AR equipment belongs.
Here, the audio explanation data may correspond to multiple styles of AR special effect data, and the first AR special effect data matched with it may be selected from among them according to the history information of the AR device and/or the user information of the target user to which the AR device belongs. As a result, different AR devices may display different first AR special effect data when playing the same audio explanation data, and the same AR device may display different first AR special effect data when playing the same audio explanation data multiple times, improving the flexibility and diversity of the display of the first AR special effect data.
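One way the selection described above could work, purely as an assumption (the disclosure does not fix a selection rule): prefer a style the device's history has not shown, and among those prefer one matching the target user's recorded interest. All names here (`pick_first_effect`, the `tags` and `interest` fields) are illustrative placeholders:

```python
def pick_first_effect(styles, history, user_info):
    """Choose one of the multiple AR special-effect styles bound to a
    piece of audio explanation data: prefer a style the device has not
    shown before, and among those prefer one tagged with the user's
    recorded interest (e.g. 'calligraphy')."""
    # Styles not yet in the device's history; fall back to all styles.
    unseen = [s for s in styles if s["id"] not in history] or styles
    for style in unseen:
        if user_info.get("interest") in style.get("tags", []):
            return style
    return unseen[0]
```

Any comparable ranking over history and user information would satisfy the embodiment equally well; the point is only that the chosen style varies across devices and across repeated plays.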
In one possible embodiment, the presenting the first AR special effects data matched with the audio explanation data includes:
generating AR special effect data matched with a target user based on user information of the target user to which the AR equipment belongs;
controlling the AR equipment to sequentially display first AR special effect data matched with the audio explanation data and AR special effect data matched with the target user; or overlapping the AR special effect matched with the target user and the first AR special effect matched with the audio explanation data to generate overlapped second AR special effect data, and displaying the second AR special effect data through the AR equipment.
In this embodiment, the AR special effect data matched with the target user may be generated based on the user information of the target user to which the AR device belongs, and second AR special effect data unique to that user may then be generated by superimposing it on the first AR special effect data matched with the audio explanation data, enriching the AR special effect data. Alternatively, the AR device may be controlled to display, in sequence, the first AR special effect data matched with the audio explanation data and the AR special effect data matched with the target user, improving the diversity of the displayed AR special effect data.
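The two display options, sequential display versus superimposition into second AR special effect data, can be sketched as follows, modeling each special effect as a dictionary of named layers (an assumed representation, not anything the disclosure specifies):

```python
def present_effects(first_effect, user_effect, superimpose):
    """Either queue the two effects for sequential display, or merge
    them into a single second AR special effect. Effects are modeled
    as dicts of named layers; merging overlays the user-matched
    layers on top of the first effect's layers."""
    if not superimpose:
        # Sequential display: first AR effect, then the user-matched effect.
        return [first_effect, user_effect]
    second = dict(first_effect)
    second.update(user_effect)  # user-matched layers drawn on top
    return [second]
```

The merged dictionary stands in for the "second AR special effect data": because it depends on per-user layers, no two users' merged effects need be identical.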
In a possible embodiment, the method further comprises:
and displaying the determined third AR special effect data through the AR equipment under the condition that the AR equipment reaches the destination position in the target exhibition area and/or the AR equipment has played the audio explanation data corresponding to each preset knowledge point in the target exhibition area.
In a possible implementation manner, the playing, by the AR device, audio explanation data matched with any preset knowledge point, and displaying first AR special effect data matched with the audio explanation data includes:
and under the condition that the AR equipment meets the triggering display condition of the first AR special effect data based on the positioning information, controlling the AR equipment to play the audio explanation data and displaying the first AR special effect data.
Here, the AR device is controlled to play the audio explanation data and display the first AR special effect data only when it satisfies the trigger display condition of that data. This avoids the poor playing effect that would result if the positioning information of the AR device were unsuitable, and improves the display effect of the AR special effect data.
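The trigger display condition is left abstract by the disclosure; one plausible concrete form, assumed here purely for illustration, is a distance-and-orientation check against the special effect's anchor point:

```python
import math

def meets_trigger_condition(position, orientation_deg, anchor,
                            max_distance=3.0, max_angle=30.0):
    """A hypothetical trigger-display check: the device must be within
    max_distance meters of the special effect's anchor point and
    facing it to within max_angle degrees (0 degrees = +y axis)."""
    dx, dy = anchor[0] - position[0], anchor[1] - position[1]
    if math.hypot(dx, dy) > max_distance:
        return False
    # Bearing from the device to the anchor, measured from +y, clockwise.
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    # Smallest angular difference between bearing and device heading.
    diff = abs((bearing - orientation_deg + 180) % 360 - 180)
    return diff <= max_angle
```

The thresholds (3 meters, 30 degrees) are arbitrary; any condition tied to the positioning information would serve the same role in the method.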
In a possible embodiment, the method further comprises:
and controlling the AR device to display guiding information for guiding adjustment of positioning information of the AR device when it is determined that the AR device does not meet a trigger display condition of the first AR special effect data based on the positioning information.
Here, when the AR device does not satisfy the trigger display condition of the first AR special effect data, it can be controlled to display guidance information for adjusting its positioning information. The positioning information can then be adjusted according to that guidance, so that the adjusted AR device displays the first AR special effect data clearly and intuitively, avoiding the omission of preset knowledge points and improving the efficiency of their explanation.
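Continuing the assumed distance-based condition, guidance information for adjusting the positioning information might be derived as follows; the message strings, threshold, and function name are all illustrative assumptions:

```python
import math

def guidance_for_adjustment(position, anchor, max_distance=3.0):
    """When the trigger display condition fails, suggest how the user
    should adjust the AR device's positioning information, based only
    on the distance to the special effect's anchor point."""
    dist = math.hypot(anchor[0] - position[0], anchor[1] - position[1])
    if dist > max_distance:
        return f"Move about {dist - max_distance:.0f} meters closer to view the AR effect"
    return "Turn toward the exhibit to view the AR effect"
```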
In one possible implementation, the method is applied to a client application platform, and the client application platform is a Web application platform or an applet application platform.
The following descriptions of the effects of the apparatus, the electronic device, and the like refer to the description of the above method, and are not repeated here.
In a second aspect, the present disclosure provides an augmented reality data presentation device, comprising:
a first determining module, configured to determine a target exhibition area to be visited in response to an augmented reality (AR) device scanning a target information code on a bottle body;
an obtaining module, configured to obtain positioning information of the AR device;
and a first display module, configured to, when it is determined based on the positioning information that the AR device has reached an explanation area corresponding to any preset knowledge point in the target exhibition area, play audio explanation data matched with that preset knowledge point through the AR device and display first AR special effect data matched with the audio explanation data.
In one possible embodiment, the first determining module, when determining the target exhibition area to be visited in response to the augmented reality AR device scanning the target information code on the bottle body, is configured to:
responding to a target information code on the bottle body scanned by the AR equipment, and determining a target exhibition area to be visited and a destination position of the AR equipment in the target exhibition area;
the device further comprises: a generation module to:
generating navigation guidance information from the current location to the destination location based on the destination location and a current location of the AR device;
and displaying the navigation guidance information through the AR equipment.
In one possible embodiment, the first determining module, when determining the target exhibition area to be visited and the destination location of the AR device within the target exhibition area in response to the AR device scanning the target information code on the bottle body, is configured to:
display resource pushing information through the AR device in response to the AR device scanning the target information code on the bottle body;
and, in response to a trigger operation for the resource pushing information, determine the destination location, corresponding to the resource pushing information, within the target exhibition area to be visited; the destination location is the location for picking up the target resource corresponding to the resource pushing information.
In one possible embodiment, the audio interpretation data corresponds to AR special effect data of a plurality of styles, and the apparatus further includes: a second determination module to:
and determining first AR special effect data matched with the audio explanation data from multiple types of AR special effect data corresponding to the audio explanation data according to the historical record information of the AR equipment and/or the user information of the target user to which the AR equipment belongs.
In one possible embodiment, the first presentation module, when presenting the first AR special effects data matching the audio explanation data, is configured to:
generating AR special effect data matched with a target user based on user information of the target user to which the AR equipment belongs;
controlling the AR equipment to sequentially display first AR special effect data matched with the audio explanation data and AR special effect data matched with the target user; or overlapping the AR special effect matched with the target user and the first AR special effect matched with the audio explanation data to generate overlapped second AR special effect data, and displaying the second AR special effect data through the AR equipment.
In a possible embodiment, the apparatus further comprises: a second display module to:
and displaying the determined third AR special effect data through the AR equipment under the condition that the AR equipment reaches the destination position in the target exhibition area and/or the AR equipment has played the audio explanation data corresponding to each preset knowledge point in the target exhibition area.
In a possible implementation manner, the first presentation module, when playing, by the AR device, audio interpretation data matched with any preset knowledge point and presenting first AR special effect data matched with the audio interpretation data, is configured to:
and under the condition that the AR equipment meets the triggering display condition of the first AR special effect data based on the positioning information, controlling the AR equipment to play the audio explanation data and displaying the first AR special effect data.
In a possible embodiment, the apparatus further comprises: a third display module to:
and controlling the AR device to display guiding information for guiding adjustment of positioning information of the AR device when it is determined that the AR device does not meet a trigger display condition of the first AR special effect data based on the positioning information.
In one possible implementation, the apparatus is applied to a client application platform, and the client application platform is a Web application platform or an applet application platform.
In a third aspect, the present disclosure provides an electronic device comprising: a processor, a memory and a bus, wherein the memory stores machine-readable instructions executable by the processor, the processor and the memory communicate via the bus when the electronic device is running, and the machine-readable instructions are executed by the processor to perform the steps of the augmented reality data presentation method according to the first aspect or any one of the embodiments.
In a fourth aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the augmented reality data presentation method according to the first aspect or any one of the embodiments.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to illustrate the technical solutions of the embodiments of the present disclosure more clearly, the drawings required in the embodiments are briefly described below. The drawings here are incorporated in and form a part of the specification; they illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its technical solutions. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art may derive additional related drawings from them without inventive effort.
Fig. 1 is a schematic flowchart illustrating an augmented reality data presentation method according to an embodiment of the present disclosure;
fig. 2 illustrates an interface schematic diagram of an AR device provided by an embodiment of the present disclosure;
fig. 3 shows an interface schematic diagram of an AR device provided by an embodiment of the present disclosure;
fig. 4 shows an interface schematic diagram of an AR device provided by an embodiment of the present disclosure;
fig. 5 is a schematic diagram illustrating a triggering display condition of first AR special effect data in an augmented reality data display method provided by an embodiment of the present disclosure;
fig. 6 shows an interface schematic diagram of an AR device provided by an embodiment of the present disclosure;
fig. 7 is a schematic flow chart illustrating another augmented reality data presentation method provided by the embodiment of the present disclosure;
fig. 8 is a schematic diagram illustrating an architecture of an augmented reality data presentation apparatus provided in an embodiment of the present disclosure;
fig. 9 shows a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
Augmented Reality (AR) technology is a technology that skillfully integrates virtual information with the real world, and the technology uses various technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, sensing and the like. The technology can be used for simulating and simulating virtual information such as characters, images, three-dimensional models, music, videos and the like generated by a computer and then applying the virtual information to the real world, and the two kinds of information are mutually supplemented, so that the real world is enhanced. Therefore, the embodiment of the disclosure provides an augmented reality data display method and device, an electronic device and a storage medium.
The drawbacks noted above were identified by the inventors through practical and careful study; the discovery of these problems and the solutions proposed by the present disclosure for them should therefore be regarded as the inventors' contribution in the course of arriving at the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the embodiments of the present disclosure, the augmented reality data display method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the method may be an AR device, that is, an intelligent device capable of supporting AR functions, including but not limited to a mobile phone, a tablet, or AR glasses.
Referring to fig. 1, a schematic flow chart of an augmented reality data display method provided in the embodiment of the present disclosure is shown, where the method includes S101-S103, where:
S101: in response to an augmented reality (AR) device scanning a target information code on a bottle body, determine a target exhibition area to be visited;
S102: acquire positioning information of the AR device;
S103: when it is determined based on the positioning information that the AR device has reached an explanation area corresponding to any preset knowledge point in the target exhibition area, play audio explanation data matched with that preset knowledge point through the AR device and display first AR special effect data matched with the audio explanation data.
In the above method, the target exhibition area to be visited is determined in response to the AR device scanning the target information code on the bottle body. Because a bottle is a common object in almost any scene, placing the target information code on a bottle combines the code with an everyday object, allowing the AR device to enter the AR navigation process conveniently and quickly and improving the efficiency of AR data display.
Meanwhile, the AR device is controlled to play the audio explanation data corresponding to the preset knowledge point and, while the audio plays, to display the first AR special effect data matched with it. The first AR special effect data thus assists the audio explanation, making the explanation of the preset knowledge point clear and intuitive and improving its efficiency.
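For orientation, the flow S101 to S103 can be sketched in Python. This is an illustrative sketch only; the disclosure prescribes no implementation, and every name here (`run_tour`, `KnowledgePoint`, the code-to-area mapping) is a hypothetical placeholder:

```python
from dataclasses import dataclass

@dataclass
class KnowledgePoint:
    name: str
    area_center: tuple  # (x, y) center of the explanation area
    radius: float       # explanation-area radius in meters
    audio: str          # audio explanation data
    ar_effect: str      # first AR special-effect data

def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def run_tour(scanned_code, code_to_area, positioning, points):
    """S101: map the scanned target information code to a target
    exhibition area.  S102: obtain the AR device's positioning
    information.  S103: when the device is inside a preset knowledge
    point's explanation area, return the matched audio explanation
    data and first AR special-effect data."""
    target_area = code_to_area[scanned_code]   # S101
    position = positioning()                   # S102
    for point in points[target_area]:          # S103
        if distance(position, point.area_center) <= point.radius:
            return point.audio, point.ar_effect
    return None
```

In practice S102 would repeat as the user walks, and S103 would fire for each explanation area the device enters; the sketch shows a single pass.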
S101 to S103 will be specifically described below.
For S101:
wherein, the bottle body can be any bottle-shaped object such as a water bottle, a beer bottle, a beverage bottle and the like.
In implementation, the AR device can be controlled to scan the target information code on the bottle body, and the target exhibition area to be visited is determined in response to the scan. The target information code may be an information code printed on the bottle, such as a two-dimensional code or a barcode.
The target exhibition area is the exhibition area corresponding to the target information code; the codes on different bottles may correspond to the same target exhibition area or to different ones. The target exhibition area can be set according to the actual situation. For example, it may be the exhibition area of any venue such as a museum, exhibition hall, or memorial hall, or an exhibition area set within such a venue, for instance the calligraphy area or the painting area of a painting-and-calligraphy exhibition hall.
In an alternative embodiment, the determining a target exhibition area to be visited in response to the augmented reality AR device scanning the target information code on the bottle body may include: and responding to the target information code on the bottle body scanned by the AR equipment, and determining a target exhibition area to be visited and a destination position of the AR equipment in the target exhibition area.
In implementation, different target information codes may correspond to different destination locations, where the destination locations corresponding to the target information codes may be configured according to actual conditions. For example, when the target exhibition area is an exhibition area corresponding to a beer exhibition hall, the destination position corresponding to the first target information code may be a hop production area in the beer exhibition hall, and the destination position corresponding to the second target information code may be a beer fermentation area in the beer exhibition hall, etc.
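The beer-hall example above amounts to a lookup from each target information code to an exhibition area and a destination location. A minimal illustrative table follows; the code values and dictionary structure are assumptions, not part of the disclosure:

```python
# Hypothetical mapping: each target information code keys to a target
# exhibition area and a destination location inside that area.
CODE_TABLE = {
    "code-001": {"area": "beer hall", "destination": "hop production area"},
    "code-002": {"area": "beer hall", "destination": "beer fermentation area"},
}

def resolve_code(scanned_code):
    """S101 variant: return both the target exhibition area to be
    visited and the destination location within it."""
    entry = CODE_TABLE[scanned_code]
    return entry["area"], entry["destination"]
```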
The augmented reality data display method further comprises the following steps: generating navigation guidance information from the current location to the destination location based on the destination location and a current location of the AR device; and displaying the navigation guidance information through the AR equipment.
After the target exhibition area and the destination location of the AR device within it have been determined in response to the AR device scanning the target information code on the bottle body, the current location of the AR device can be determined, and navigation guidance information from the current location to the destination location can be generated from the two. The current location of the AR device may be determined by a positioning sensor on the device, or by visual positioning based on the current scene image captured by the AR device and a three-dimensional scene model.
The navigation guidance information can then be displayed by the AR device, so that the device can be guided to the destination location based on it. Referring to the interface diagram of an AR device shown in fig. 2, the navigation guidance information may be, for example, "move forward 10 meters, then turn right", and/or the navigation route map at the upper-left of fig. 2.
Here, the navigation guidance information from the current location to the destination location may be generated based on the determined destination location within the target exhibition area and the current location of the AR device, and the navigation guidance information may be displayed by the AR device, so that the AR device may reach the destination location according to the navigation guidance information, improving the diversity and flexibility of AR data display.
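As a rough illustration of generating navigation guidance information of the kind shown in fig. 2 ("move forward 10 meters, then turn right"), the sketch below derives a guidance string from two planar coordinates. A real system would plan a route through the venue's three-dimensional scene model; the function name and coordinate convention are assumptions:

```python
import math

def navigation_guidance(current, destination):
    """Produce a coarse guidance string from the current location to
    the destination location, both given as (x, y) in meters with +y
    taken as the device's forward direction."""
    dx = destination[0] - current[0]
    dy = destination[1] - current[1]
    if math.hypot(dx, dy) < 1.0:
        return "You have arrived"
    turn = "turn right" if dx > 0 else ("turn left" if dx < 0 else "continue straight")
    return f"move forward {dy:.0f} meters, then {turn}"
```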
In an alternative embodiment, the determining a target exhibition area to be visited and a destination location where the AR device is located within the target exhibition area in response to the AR device scanning the target information code on the bottle body may include:
step A1, responding to the target information code on the bottle body scanned by the AR equipment, and displaying resource pushing information through the AR equipment;
step A2, in response to a trigger operation for the resource pushing information, determining the destination location corresponding to the resource pushing information and located in a target exhibition area to be visited; wherein the destination location is the location for receiving the target resource corresponding to the resource pushing information.
In implementation, the AR device may be controlled to scan the target information code on the bottle body; in response to the AR device scanning the target information code on the bottle body, resource pushing information is displayed through the AR device. For example, the resource pushing information may be winning information. Referring to the interface diagram of an AR device shown in fig. 3, the resource pushing information is shown in fig. 3; for example, the resource pushing information may be "congratulations, you have won a prize".
For example, voice data input by a user, a touch screen operation, and the like may be used as a trigger operation for the resource pushing information, for example, the voice data may be "prize winning", and the touch screen operation may be clicking a display position corresponding to the resource pushing information, and the like. And responding to the triggering operation aiming at the resource pushing information, and determining a destination position corresponding to the resource pushing information and located in a target exhibition area to be visited, wherein the destination position is a position for picking up a target resource corresponding to the resource pushing information. For example, when the resource pushing information is winning information, the destination location may be a winning area set in the target exhibition area.
In this embodiment, in response to the AR device scanning the target information code on the bottle body, resource pushing information, for example winning information, can be displayed through the AR device, so that the display of information is relatively flexible.
For S102:
according to an optional implementation, the positioning information of the AR device can be determined according to a sensor arranged on the AR device, and the determination process of the positioning information of the AR device is simple and quick. For example, the sensor may be a Global Positioning System (GPS) receiver, an Inertial Measurement Unit (IMU), or the like.
In another alternative embodiment, a real-time scene image acquired by the AR device may be acquired; and determining the positioning information of the AR equipment based on the real-time scene image and the constructed three-dimensional scene model corresponding to the target exhibition area.
In implementation, after a real-time scene image acquired by the AR device is acquired, feature points in the real-time scene image can be extracted, the feature points are matched with feature point clouds included in the three-dimensional scene model, and positioning information when the AR device acquires the real-time scene image is determined. The positioning information may include position information and/or orientation information, for example, the position information may be coordinate information of the AR device in a coordinate system corresponding to the three-dimensional scene model; the orientation information may be an euler angle corresponding to the AR device.
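The feature-matching step of the visual positioning described above can be sketched as follows. This is a deliberately simplified illustration: the descriptor format, the distance threshold, and the data-structure names are all assumptions, and a complete system would pass the resulting 2D-3D correspondences to a pose solver (e.g. PnP) to recover the position and orientation information.

```python
def match_features(image_features, model_points, max_dist=0.5):
    # image_features: {feature_id: descriptor} extracted from the real-time
    # scene image; model_points: entries of the feature point cloud of the
    # three-dimensional scene model, each {"descriptor": [...], "xyz": (x, y, z)}.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    matches = []
    for feat_id, desc in image_features.items():
        best = min(model_points, key=lambda p: dist(desc, p["descriptor"]))
        if dist(desc, best["descriptor"]) <= max_dist:
            matches.append((feat_id, best["xyz"]))
    return matches
```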
In implementation, the three-dimensional scene model of the target exhibition area can be constructed according to the following steps: collecting multiple frames of scene images at different positions, different angles and different times in a target exhibition area, and extracting feature points of each frame of scene image to obtain a point cloud set corresponding to each frame of scene image; and acquiring the characteristic point clouds corresponding to the target exhibition area by utilizing the point cloud sets respectively corresponding to the multiple frames of scene images, wherein the characteristic point clouds corresponding to the target exhibition area form a three-dimensional scene model.
Or collecting scene videos corresponding to the target exhibition area at different positions, different angles and different times, acquiring multiple frames of video frames from the collected scene videos, and extracting feature points of each frame of video frames to obtain a point cloud set corresponding to each frame of video frames; and obtaining a three-dimensional scene model corresponding to the target exhibition area by using the point cloud sets respectively corresponding to the multiple frames of video frames.
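The fusion of per-frame point cloud sets into the feature point cloud of the target exhibition area can be sketched as follows; the grid-based de-duplication and the 5 cm cell size are assumptions for illustration, not part of the disclosed construction steps.

```python
def merge_point_clouds(point_cloud_sets, grid=0.05):
    # Fuse the point cloud sets corresponding to the multiple frames of scene
    # images (or video frames) into a single feature point cloud, keeping one
    # representative point per grid cell to drop near-duplicate observations.
    merged = {}
    for cloud in point_cloud_sets:
        for x, y, z in cloud:
            cell = (round(x / grid), round(y / grid), round(z / grid))
            merged.setdefault(cell, (x, y, z))
    return list(merged.values())
```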
Here, the positioning information of the AR device can be determined more accurately by using the real-time scene image acquired by the AR device and the constructed three-dimensional scene model.
For S103:
the preset knowledge points can be any interpretable knowledge points in the target exhibition area. Illustratively, when the target exhibition area is a painting and calligraphy exhibition hall, the preset knowledge point can be the name of any painting or calligraphy work on exhibition; for example, the preset knowledge points may be: Mona Lisa, Along the River During the Qingming Festival, and the like. When the target exhibition area is a beer exhibition hall, the preset knowledge points may be titles of beer-making processes, for example: hop production, beer fermentation, and the like.
In implementation, the spatial position of each preset knowledge point and the explanation area corresponding to each preset knowledge point can be determined in the constructed three-dimensional scene model, where the spatial position and the explanation area corresponding to each preset knowledge point can be set according to the actual scene. For example, for the preset knowledge point of hop production, the exhibition area of the hop production process within the target exhibition area may be used as the explanation area of the preset knowledge point, and the central position of the explanation area may be used as the spatial position of the preset knowledge point. For another example, for the preset knowledge point of Along the River During the Qingming Festival, the exhibition area corresponding to the painting within the target exhibition area may be used as the explanation area corresponding to the preset knowledge point, and the position of the painting within the target exhibition area may be determined as the spatial position of the preset knowledge point.
During implementation, whether the AR equipment is located in the explanation area of the preset knowledge point or not can be determined based on the explanation area corresponding to the preset knowledge point and the position information indicated by the positioning information of the AR equipment, and if yes, the audio explanation data corresponding to the preset knowledge point and the AR special effect data matched with the audio explanation data are obtained. And playing the audio explanation data matched with the preset knowledge point through the AR equipment, and displaying the first AR special effect data matched with the audio explanation data.
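The check of whether the AR device has entered an explanation area can be sketched as follows. The rectangular area shape is an assumption for illustration; as stated above, the explanation area is actually set according to the scene.

```python
def find_explanation_area(position, knowledge_points):
    # knowledge_points maps a preset knowledge point name to its explanation
    # area, modelled here as an axis-aligned rectangle in the coordinate
    # system of the three-dimensional scene model.
    x, y = position
    for name, area in knowledge_points.items():
        if area["x_min"] <= x <= area["x_max"] and area["y_min"] <= y <= area["y_max"]:
            return name  # trigger this point's audio explanation data
    return None
```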
The audio explanation data can be preset audio data for explaining the preset knowledge point; the first AR special effect data may be AR data comprising one or more of: video content, image content, text content. For example, when the preset knowledge point is Mona Lisa, the audio explanation data may be audio data explaining the author information, creation background and other information of Mona Lisa; the first AR special effect data may be AR data of the female figure depicted in Mona Lisa; alternatively, the first AR special effect data may be AR data of the author of Mona Lisa, or the like. The first AR special effect data matched with the audio explanation data can be set as required; the audio explanation data can correspond to one piece of AR special effect data or to a plurality of pieces of AR special effect data.
The display pose of the first AR special effect data can be set in the three-dimensional scene model according to needs. For example, the display pose of the first AR special effect data may be located in an explanation area of the preset knowledge point.
In implementation, the corresponding first AR special effect data may be matched with each piece of audio interpretation data in advance, and after the audio interpretation data is acquired, the first AR special effect data matched with the audio interpretation data may be determined.
Or after the audio explanation data corresponding to any preset knowledge point is acquired, the first AR special effect data matched with the audio explanation data can be determined in real time. For example, after audio explanation data corresponding to any preset knowledge point is acquired, a target keyword corresponding to the audio explanation data may be identified; or, the text content of the audio explanation data can be determined, the text content corresponding to the audio explanation data is identified, and the target keywords included in the audio explanation data are determined; the target keyword can be a word related to the preset knowledge point. And then, determining first AR special effect data matched with the audio explanation data based on the target keywords corresponding to the audio explanation data and the keywords corresponding to each piece of first AR special effect data.
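The keyword-based matching just described can be sketched as follows; the scoring by keyword overlap, and all names in the snippet, are assumptions for illustration (the keyword extraction itself, via speech recognition or text lookup, is taken as already done).

```python
def match_ar_effect(target_keywords, effect_library):
    # Score each candidate piece of first AR special effect data by how many
    # of the audio explanation data's target keywords appear in its own
    # keyword list, and return the best-scoring candidate (None if nothing
    # overlaps).
    def overlap(effect):
        return len(set(target_keywords) & set(effect["keywords"]))

    best = max(effect_library, key=overlap)
    return best["name"] if overlap(best) > 0 else None
```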
During implementation, when the AR equipment is controlled to play the audio explanation data, the AR equipment can be controlled to display the first AR special effect data related to the target keyword according to the target keyword of the currently played audio explanation data, so that the content of the played first AR special effect data is consistent with that of the played audio explanation data, and the preset knowledge point can be explained and displayed clearly.
In this embodiment, the audio explanation data corresponding to each preset knowledge point may be predetermined according to the following steps: acquiring audio explanation data to be processed corresponding to the target exhibition area; determining target keywords respectively matched with a plurality of preset knowledge points from the audio explanation data to be processed according to a preset knowledge point library; and respectively determining the audio explanation data corresponding to each preset knowledge point from the audio explanation data to be processed according to the playing time position of each target keyword in the audio explanation data to be processed.
During implementation, to-be-processed audio explanation data corresponding to the target exhibition area can be generated according to a plurality of knowledge point information included in the target exhibition area, wherein the to-be-processed audio explanation data can be complete audio data explaining each knowledge point respectively. Determining target keywords respectively matched with the plurality of preset knowledge points from the audio explanation data to be processed according to a preset knowledge point library; or determining target keywords respectively matched with the plurality of preset knowledge points from text contents corresponding to the audio explanation data to be processed. And finally, segmenting the audio explanation data to be processed according to the corresponding playing time position of each target keyword in the audio explanation data to be processed, and determining the audio explanation data corresponding to each preset knowledge point from a plurality of audio explanation data obtained after segmentation.
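The segmentation by playing-time position can be sketched as follows: each target keyword's first occurrence marks the start of one knowledge point's segment, which runs until the next keyword. The data shapes and the even cut points are assumptions for illustration.

```python
def segment_audio(total_duration, keyword_times):
    # keyword_times maps each preset knowledge point to the playing-time
    # position (in seconds) at which its target keyword occurs in the
    # to-be-processed audio explanation data.
    ordered = sorted(keyword_times.items(), key=lambda kv: kv[1])
    segments = {}
    for i, (point, start) in enumerate(ordered):
        end = ordered[i + 1][1] if i + 1 < len(ordered) else total_duration
        segments[point] = (start, end)
    return segments
```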
In an alternative embodiment, the audio explanation data corresponds to AR special effect data of a plurality of styles, and the method further includes: and determining first AR special effect data matched with the audio explanation data from multiple types of AR special effect data corresponding to the audio explanation data according to the historical record information of the AR equipment and/or the user information of the target user to which the AR equipment belongs.
Here, the audio explanation data may correspond to a plurality of types of AR special effect data, and the first AR special effect data matched with the audio explanation data may be determined from a plurality of types of AR special effect data corresponding to the audio explanation data according to history information of the AR device and/or user information of a target user to which the AR device belongs, so that different AR devices may display different first AR special effect data when playing the same audio explanation data, or the same AR device may display different first AR special effect data when playing the same audio explanation data for a plurality of times, thereby improving flexibility and diversity of display of the first AR special effect data.
Wherein the history information of the AR device may include at least one of the following information: the times of scanning the information code by the AR equipment, the equipment information of the AR equipment, the historical visit record of the AR equipment and the like; the user information of the target user to which the AR device belongs comprises at least one of the following: age, gender, occupation, personal preference data of the target user, etc.
For example, the tour level of the AR device may be determined according to the number of times that the AR device scans the information code, and the greater the number of times, the higher the level is, different levels may correspond to different types of AR special effect data, for example, the AR special effect data corresponding to level one may include text content, the AR special effect data corresponding to level two may include image content, the AR special effect data corresponding to level three may include video content, and the like. And then, according to the times of scanning the information codes by the AR equipment, determining first AR special effect data matched with the audio explanation data from multiple types of AR special effect data corresponding to the audio explanation data.
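The level-based selection above can be sketched as follows. The scan-count thresholds are illustrative only: the method states merely that more scans mean a higher tour level and that different levels correspond to different styles of AR special effect data.

```python
def select_style_by_level(scan_count, styled_effects):
    # styled_effects maps a tour level to one style of AR special effect
    # data; thresholds below are assumed values for illustration.
    if scan_count >= 10:
        level = 3   # e.g. video content
    elif scan_count >= 5:
        level = 2   # e.g. image content
    else:
        level = 1   # e.g. text content
    return styled_effects[level]
```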
The first AR special effect data matched with the audio explanation data can be determined from multiple types of AR special effect data corresponding to the audio explanation data according to historical tour records of the AR device. For example, if the history browsing record of the AR device indicates that the AR device repeatedly displays AR special effect data corresponding to the first type, the AR special effect data corresponding to the first type may be determined as the first AR special effect data matched with the audio explanation data from multiple types of AR special effect data corresponding to the audio explanation data.
Alternatively, the history browsing record of the AR device may indicate that the AR device displayed the AR special effect data corresponding to type one last time; then, from among the multiple types of AR special effect data corresponding to the audio explanation data, AR special effect data of a type other than type one may be determined as the first AR special effect data matched with the audio explanation data.
For example, matching user information may be determined in advance for each AR special effect data. And then, according to the user information of the target user, determining first AR special effect data matched with the audio explanation data from multiple types of AR special effect data corresponding to the audio explanation data. For example, if the target user is a woman, the AR special effect data matched with the woman may be determined from the multiple types of AR special effect data as the first AR special effect data matched with the audio interpretation data.
In an alternative embodiment, the presenting the first AR special effects data matched with the audio explanation data may include:
step B1, based on the user information of the target user to which the AR equipment belongs, generating AR special effect data matched with the target user;
step B2, controlling the AR equipment to sequentially display first AR special effect data matched with the audio explanation data and AR special effect data matched with the target user;
and step B3, superposing the AR special effect matched with the target user and the first AR special effect matched with the audio explanation data to generate superposed second AR special effect data, and displaying the second AR special effect data through the AR equipment.
In step B1, AR special effect data matched with the target user may be generated based on the user information of the target user to which the AR device belongs, for example, if the user information of the target user is: sex: female, age: and if the user is 20 years old, AR special effect data matched with the target user can be generated, for example, the AR special effect data can comprise the AR special effect of the cartoon image.
After the step B1 is performed, a step B2 may be performed, that is, the AR device may be controlled to display the first AR special effect data matched with the audio explanation data, and then the AR device may be controlled to display the AR special effect data matched with the target user; or, the AR device may be controlled to display the AR special effect data matched with the target user first, and then the AR device may be controlled to display the first AR special effect data matched with the audio explanation data.
After the step B1 is performed, a step B3 may also be performed, that is, the AR special effect matched with the target user may be superimposed on the first AR special effect matched with the audio interpretation data, so as to generate superimposed second AR special effect data, and the second AR special effect data is displayed by the AR device. Referring to fig. 4, which is an interface diagram of an AR device, fig. 4 shows an AR special effect 41 matched with a target user and a first AR special effect 42 matched with audio interpretation data.
In this embodiment, the AR special effect data matched with the target user may be generated based on the user information of the target user to which the AR device belongs, and then the second AR special effect data matched with the target user of the AR device may be generated based on the AR special effect data matched with the target user and the first AR special effect data matched with the audio explanation data, so that the generated second AR special effect data has uniqueness, enriching the AR special effect data. Alternatively, the AR device can be controlled to sequentially display the first AR special effect data matched with the audio explanation data and the AR special effect data matched with the target user, improving the diversity of the displayed AR special effect data.
In an alternative embodiment, the method further comprises: and displaying the determined third AR special effect data through the AR equipment under the condition that the AR equipment reaches the destination position in the target exhibition area and/or the AR equipment has played the audio explanation data corresponding to each preset knowledge point in the target exhibition area.
In this embodiment, when the AR device reaches the destination position in the target exhibition area, and/or when the AR device has played the audio interpretation data corresponding to each preset knowledge point in the target exhibition area, it may be determined that this navigation of the AR device is finished, and then the determined third AR special effect data may be displayed by the AR device.
The third AR special effect data may be set according to actual conditions, for example, the third AR special effect data may be AR special effect data related to a destination location, or may also be AR special effect data summarizing and explaining a target exhibition area.
In implementation, the third AR special effect data of multiple styles may be preset, and then the third AR special effect data to be displayed may be determined from the preset third AR special effect data of multiple styles according to history information of the AR device and/or user information of a target user to which the AR device belongs, and the AR device is controlled to display the determined third AR special effect data to be displayed.
Alternatively, the generated AR special effect matched with the target user and the third AR special effect may be superimposed to generate third AR special effect data after the superimposition processing, and the AR device may be controlled to display the third AR special effect data after the superimposition processing.
In an optional implementation manner, the playing, by the AR device, audio explanation data matched with any preset knowledge point, and displaying first AR special effect data matched with the audio explanation data may include:
in a first mode, under the condition that the AR equipment meets the triggering display condition of the first AR special effect data based on the positioning information, the AR equipment is controlled to play the audio explanation data, and the first AR special effect data is displayed.
And controlling the AR equipment to display guiding information for guiding and adjusting the positioning information of the AR equipment under the condition that the AR equipment is determined to not meet the triggering display condition of the first AR special effect data based on the positioning information.
Here, when the AR device satisfies the trigger presentation condition of the first AR special effect data, the AR device is controlled to play the audio explanation data, and the first AR special effect data is presented, so that a situation that the playing effect of the AR special effect data is poor when the positioning information of the AR device is not appropriate is avoided, and the presentation effect of the AR special effect data is improved.
And when the AR equipment does not satisfy the triggering display condition of the first AR special effect data, the AR equipment can be controlled to display the guiding information used for guiding and adjusting the positioning information of the AR equipment, so that the positioning information of the AR equipment can be adjusted according to the guiding information, the adjusted AR equipment can clearly and visually display the first AR special effect data, omission of the preset knowledge points is avoided, and the explanation efficiency of the preset knowledge points is improved.
In implementation, a corresponding trigger display condition may be set for each piece of first AR special effect data, for example, the trigger display condition may include a trigger display direction and/or a trigger display distance. When the trigger display condition comprises a trigger display direction, and when the positioning information of the AR device indicates that the orientation of the AR device is located in the set trigger display direction, determining that the AR device meets the trigger display condition of the AR special effect data. When the triggering display condition comprises a triggering display distance, when the positioning information of the AR device indicates that the distance between the AR device and the display pose of the AR special effect data is smaller than a set first distance value and/or larger than a set second distance value, the AR device is determined to meet the triggering display condition of the AR special effect data.
Referring to fig. 5, for the first AR special effect data in fig. 5, a trigger display direction and a trigger display distance corresponding to the first AR special effect data may be generated based on the display pose of the first AR special effect data. For example, when the orientation of the AR device is located within the direction range corresponding to the trigger display direction, it is determined that the AR device meets the trigger display condition. Taking the first AR device 51 in fig. 5 as an example: when the orientation of the AR device is direction 511, the orientation is located within the direction range corresponding to the set trigger display direction, and the AR device meets the trigger display condition; when the orientation of the AR device is direction 512, the AR device does not satisfy the trigger display condition.
For example, if the distance between the first AR device 51 and the display position of the first AR special effect data in fig. 5 is smaller than the set first distance value and larger than the set second distance value, the first AR device 51 satisfies the set trigger display distance. The distance between the second AR device 52 in fig. 5 and the display position of the AR special effect data is greater than the set first distance value, so the second AR device 52 does not satisfy the set trigger display distance.
When the trigger display condition includes a trigger display direction and a trigger display distance, it is determined that the AR device satisfies the trigger display condition of the first AR special-effect data when the orientation indicated by the positioning information of the AR device satisfies the trigger display direction condition and the position information indicated by the positioning information of the AR device satisfies the condition of the trigger display distance. And when the orientation indicated by the positioning information of the AR device does not meet the condition of triggering the display direction and/or the position information indicated by the positioning information of the AR device does not meet the condition of triggering the display distance, determining that the AR device does not meet the triggering display condition of the first AR special effect data.
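The combined trigger display condition can be sketched as follows. The 45-degree direction range is an assumed value; the 0.5 m / 5 m defaults follow the 0.5-5 m example range given later for the trigger display distance.

```python
import math

def satisfies_trigger(device_pos, device_heading_deg, effect_pos,
                      direction_range_deg=45.0, near=0.5, far=5.0):
    # The device orientation must fall within direction_range_deg of the
    # direction toward the effect's display pose, and the distance must lie
    # between the second distance value (near) and the first (far).
    dx = effect_pos[0] - device_pos[0]
    dy = effect_pos[1] - device_pos[1]
    distance = math.hypot(dx, dy)
    target_heading = math.degrees(math.atan2(dy, dx)) % 360
    diff = abs((device_heading_deg - target_heading + 180) % 360 - 180)
    return diff <= direction_range_deg and near <= distance <= far
```

When this check fails, the guidance information described below (e.g. fig. 6) would be displayed instead of the first AR special effect data.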
And if the AR equipment meets the trigger display condition corresponding to the AR special effect data, controlling the AR equipment to play the audio explanation data and displaying the first AR special effect data. If the AR equipment does not meet the trigger display condition corresponding to the first AR special effect data, generating guide information for guiding and adjusting the position and posture information of the AR equipment, and displaying the guide information through the AR equipment so as to adjust the position and posture of the AR equipment through the guide information, so that the AR equipment can display the AR special effect data more clearly.
Referring to the interface schematic diagram of the AR device shown in fig. 6, the guidance information is shown in fig. 6, for example, the guidance information may be "please move the AR device to the right to view the first AR special effect data corresponding to the preset knowledge point a".
In an optional implementation manner, the method is applied to a client application platform, and the client application platform is a Web application platform or an applet application platform.
When the method is implemented, the method can be applied to a client application platform, and the client can be a Web application platform on an AR device or an applet application platform on the AR device. Alternatively, the client application platform may also be an application on an AR device for AR navigation.
In implementation, referring to fig. 7, the augmented reality data display method may include:
s701, responding to the target information code on the bottle body scanned by the AR equipment, and determining a target exhibition area to be visited.
S702, acquiring the positioning information of the AR equipment.
S703, judging whether the AR equipment reaches an explanation area corresponding to any preset knowledge point in the target exhibition area or not based on the positioning information of the AR equipment.
S704, if yes, audio explanation data corresponding to any preset knowledge point and first AR special effect data matched with the audio explanation data are obtained.
S705, determining whether the AR device meets a trigger display condition of the first AR special effect data based on the positioning information.
And S706, if the condition is met, playing the audio explanation data matched with any preset knowledge point through the AR device, and displaying the first AR special effect data matched with the audio explanation data. For example, when the preset knowledge point is a historical figure, the audio explanation data of the preset knowledge point can be played, and first AR special effect data including an animation effect of the historical figure can be displayed; for another example, when the target exhibition area is a beer production factory and the preset knowledge point is hop production, the audio explanation data corresponding to hop production can be played, and first AR special effect data including the production and fermentation processes of the hops can be displayed.
And S707, if not, the AR device may be controlled to display guidance information for guiding the adjustment of the pose information of the AR device, so as to guide the AR device to display the first AR special effect data. For example, it is determined that the AR device satisfies the trigger display condition of the first AR special effect data when the orientation of the AR device is located in the trigger display direction corresponding to the first AR special effect data, and/or when the distance between the AR device and the display pose of the first AR special effect data is within the range of 0.5 m to 5 m (including 0.5 m and 5 m).
And S708, when the AR device reaches the destination position in the target exhibition area and/or the AR device has played the audio explanation data corresponding to each preset knowledge point in the target exhibition area, displaying the determined third AR special effect data through the AR device.
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
Based on the same concept, an embodiment of the present disclosure further provides an augmented reality data display apparatus, as shown in fig. 8, which is an architecture schematic diagram of the augmented reality data display apparatus provided in the embodiment of the present disclosure, and includes a first determining module 801, an obtaining module 802, and a first displaying module 803, specifically:
a first determining module 801, configured to determine a target exhibition area to be visited in response to a target information code on a bottle scanned by an augmented reality AR device;
an obtaining module 802, configured to obtain positioning information of the AR device;
the first display module 803 is configured to play audio explanation data matched with any preset knowledge point through the AR device and display first AR special-effect data matched with the audio explanation data when it is determined that the AR device reaches an explanation area corresponding to any preset knowledge point in the target exhibition area based on the positioning information.
In one possible implementation, the first determining module 801, when determining the target exhibition area to be visited in response to the augmented reality AR device scanning the target information code on the bottle, is configured to:
determine, in response to the AR device scanning the target information code on the bottle body, the target exhibition area to be visited and a destination position of the AR device within the target exhibition area;
the device further comprises: a generating module 804 configured to:
generate navigation guidance information from the current position to the destination position based on the destination position and the current position of the AR device;
and display the navigation guidance information through the AR device.
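As a rough illustration of the generating module's behavior, the sketch below interpolates placeholder waypoints between the current position and the destination position. A real implementation would query an indoor map or navigation service; all names and the straight-line route are assumptions.

```python
def generate_navigation_guidance(current, destination, steps=4):
    """Return a list of (x, y) waypoints interpolated linearly from the
    AR device's current position to the destination position."""
    (x0, y0), (x1, y1) = current, destination
    return [(x0 + (x1 - x0) * i / steps, y0 + (y1 - y0) * i / steps)
            for i in range(steps + 1)]
```

The resulting waypoint list would then be rendered as on-screen guidance by the AR device.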
In one possible implementation, the first determining module 801, when determining the target exhibition area to be visited and the destination position of the AR device within the target exhibition area in response to the AR device scanning the target information code on the bottle body, is configured to:
display resource pushing information through the AR device in response to the AR device scanning the target information code on the bottle body;
determine, in response to a triggering operation on the resource pushing information, the destination position, located in the target exhibition area to be visited, that corresponds to the resource pushing information, wherein the destination position is the position for receiving the target resource corresponding to the resource pushing information.
In one possible embodiment, the audio explanation data corresponds to multiple styles of AR special effect data, and the apparatus further includes a second determining module 805 configured to:
determine, according to history information of the AR device and/or user information of the target user to which the AR device belongs, the first AR special effect data matched with the audio explanation data from the multiple styles of AR special effect data corresponding to the audio explanation data.
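One way the second determining module's selection might look in code is sketched below. The scoring rule (prefer styles tagged with the user's preferred tag, skipping styles already recorded in the device history) is an assumption, not something specified by the disclosure.

```python
def select_effect_style(styles, history=None, user_info=None):
    """Pick one AR special-effect style from `styles` (list of dicts with
    'id' and 'tags') using optional history (ids already shown) and user
    info ({'preferred_tag': ...}); fall back to the first candidate."""
    history = set(history or [])
    preferred = (user_info or {}).get("preferred_tag")
    # Prefer styles the device has not shown before, per the history record.
    fresh = [s for s in styles if s["id"] not in history]
    candidates = fresh or styles
    for s in candidates:
        if preferred and preferred in s.get("tags", ()):
            return s
    return candidates[0]
```

In practice the history information and user profile would come from the AR device's stored records rather than plain arguments.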
In one possible embodiment, the first display module 803, when displaying the first AR special effect data matched with the audio explanation data, is configured to:
generate AR special effect data matched with a target user based on user information of the target user to which the AR device belongs; and
control the AR device to sequentially display the first AR special effect data matched with the audio explanation data and the AR special effect data matched with the target user; or superimpose the AR special effect matched with the target user on the first AR special effect matched with the audio explanation data to generate superimposed second AR special effect data, and display the second AR special effect data through the AR device.
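The two presentation strategies, sequential display versus superimposition into second AR special effect data, can be sketched as follows; the dict-based effect representation and the function name are illustrative assumptions.

```python
def compose_effects(audio_effect, user_effect, mode="sequential"):
    """Combine the audio-matched effect and the user-matched effect.
    'sequential' queues them one after the other; 'superimpose' merges
    them into a single combined (second) AR special effect."""
    if mode == "sequential":
        return [audio_effect, user_effect]
    if mode == "superimpose":
        return [{"layers": [audio_effect, user_effect]}]
    raise ValueError("unknown mode: " + mode)
```

The returned list stands in for a render queue that the AR device would play back.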
In a possible embodiment, the apparatus further comprises: a second display module 806 for:
and displaying the determined third AR special effect data through the AR equipment under the condition that the AR equipment reaches the destination position in the target exhibition area and/or the AR equipment has played the audio explanation data corresponding to each preset knowledge point in the target exhibition area.
In one possible implementation, the first display module 803, when playing, through the AR device, the audio explanation data matched with any preset knowledge point and displaying the first AR special effect data matched with the audio explanation data, is configured to:
and under the condition that the AR equipment meets the triggering display condition of the first AR special effect data based on the positioning information, controlling the AR equipment to play the audio explanation data and displaying the first AR special effect data.
In a possible embodiment, the apparatus further comprises: a third presentation module 807 for:
and under the condition that the AR equipment is determined not to meet the trigger display condition of the first AR special effect data based on the positioning information, controlling the AR equipment to display guide information for guiding and adjusting the position and orientation information of the AR equipment.
In one possible implementation, the method is applied to a client application platform, and the client application platform is a Web application platform or an applet application platform.
In some embodiments, the functions of the apparatus provided in the embodiments of the present disclosure, or the modules it includes, may be used to execute the methods described in the above method embodiments; for specific implementation, reference may be made to the descriptions of the above method embodiments, which are not repeated here for brevity.
Based on the same technical concept, an embodiment of the present disclosure further provides an electronic device. Referring to fig. 9, a schematic structural diagram of the electronic device provided in the embodiment of the present disclosure includes a processor 901, a memory 902, and a bus 903. The memory 902 is configured to store execution instructions and includes a memory 9021 and an external memory 9022. The memory 9021, also referred to as an internal memory, temporarily stores operation data of the processor 901 and data exchanged with the external memory 9022, such as a hard disk; the processor 901 exchanges data with the external memory 9022 through the memory 9021. When the electronic device 900 runs, the processor 901 communicates with the memory 902 through the bus 903, causing the processor 901 to execute the following instructions:
in response to an augmented reality (AR) device scanning a target information code on a bottle body, determining a target exhibition area to be visited;
acquiring positioning information of the AR device;
and under the condition that it is determined, based on the positioning information, that the AR device has reached an explanation area corresponding to any preset knowledge point in the target exhibition area, playing, through the AR device, audio explanation data matched with the preset knowledge point, and displaying first AR special effect data matched with the audio explanation data.
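The three instructions above can be sketched end to end as follows. Every callable here is a stub standing in for platform services (code scanning, positioning, rendering) that the disclosure assumes, and all names are illustrative.

```python
def run_presentation(scan_result, locate, areas, play):
    """scan_result: parsed target information code from the bottle body;
    locate(): returns the AR device's positioning information;
    areas: {knowledge_point: in_region_test}; play(point): renders the
    matching audio explanation data and first AR special effect data."""
    target_area = scan_result["target_exhibition_area"]  # from the info code
    pos = locate()                                       # positioning info
    played = []
    for point, in_region in areas.items():
        if in_region(pos):                               # reached explanation area
            play(point)
            played.append(point)
    return target_area, played
```

A deployed client would run this loop continuously as the positioning information updates, rather than once.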
In addition, an embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, the steps of the augmented reality data display method described in the above method embodiments are executed. The storage medium may be a volatile or non-volatile computer-readable storage medium.
An embodiment of the present disclosure further provides a computer program product carrying program code; the instructions included in the program code may be used to execute the steps of the augmented reality data display method described in the foregoing method embodiments. For details, reference may be made to the foregoing method embodiments, which are not repeated here.
The computer program product may be implemented by hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium; in another alternative embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into units is only a logical division, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections between devices or units through communication interfaces, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a standalone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above are only specific embodiments of the present disclosure, but the scope of the present disclosure is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present disclosure, and shall be covered by the scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (12)

1. An augmented reality data presentation method, comprising:
in response to an augmented reality (AR) device scanning a target information code on a bottle body, determining a target exhibition area to be visited;
acquiring positioning information of the AR equipment;
and under the condition that it is determined, based on the positioning information, that the AR device has reached an explanation area corresponding to any preset knowledge point in the target exhibition area, playing, through the AR device, audio explanation data matched with the preset knowledge point, and displaying first AR special effect data matched with the audio explanation data.
2. The method of claim 1, wherein the determining a target exhibition area to be visited in response to the augmented reality (AR) device scanning the target information code on the bottle body comprises:
in response to the AR device scanning the target information code on the bottle body, determining the target exhibition area to be visited and a destination position of the AR device within the target exhibition area;
the method further comprises the following steps:
generating navigation guidance information from the current location to the destination location based on the destination location and a current location of the AR device;
and displaying the navigation guidance information through the AR equipment.
3. The method of claim 2, wherein the determining the target exhibition area to be visited and the destination position of the AR device within the target exhibition area in response to the AR device scanning the target information code on the bottle body comprises:
in response to the AR device scanning the target information code on the bottle body, displaying resource pushing information through the AR device;
in response to a triggering operation on the resource pushing information, determining the destination position, located in the target exhibition area to be visited, that corresponds to the resource pushing information, wherein the destination position is the position for receiving the target resource corresponding to the resource pushing information.
4. The method according to any one of claims 1 to 3, wherein the audio interpretation data corresponds to a plurality of styles of AR special effect data, and the method further comprises:
and determining first AR special effect data matched with the audio explanation data from multiple types of AR special effect data corresponding to the audio explanation data according to the historical record information of the AR equipment and/or the user information of the target user to which the AR equipment belongs.
5. The method according to any one of claims 1 to 4, wherein the presenting the first AR special effect data matched with the audio explanation data comprises:
generating AR special effect data matched with a target user based on user information of the target user to which the AR device belongs; and
controlling the AR device to sequentially display the first AR special effect data matched with the audio explanation data and the AR special effect data matched with the target user; or superimposing the AR special effect matched with the target user on the first AR special effect matched with the audio explanation data to generate superimposed second AR special effect data, and displaying the second AR special effect data through the AR device.
6. The method according to any one of claims 1 to 5, further comprising:
and displaying the determined third AR special effect data through the AR equipment under the condition that the AR equipment reaches the destination position in the target exhibition area and/or the AR equipment has played the audio explanation data corresponding to each preset knowledge point in the target exhibition area.
7. The method according to any one of claims 1 to 6, wherein the playing audio explanation data matched with any one of the preset knowledge points through the AR device and displaying the first AR special effect data matched with the audio explanation data comprises:
and under the condition that the AR equipment meets the triggering display condition of the first AR special effect data based on the positioning information, controlling the AR equipment to play the audio explanation data and displaying the first AR special effect data.
8. The method of any one of claims 1 to 7, further comprising:
and controlling the AR device to display guiding information for guiding adjustment of positioning information of the AR device when it is determined that the AR device does not meet a trigger display condition of the first AR special effect data based on the positioning information.
9. The method according to any one of claims 1 to 8, wherein the method is applied to a client application platform, and the client application platform is a Web application platform or an applet application platform.
10. An augmented reality data presentation device, comprising:
a first determining module, configured to determine a target exhibition area to be visited in response to an augmented reality (AR) device scanning a target information code on a bottle body;
an obtaining module, configured to obtain positioning information of the AR device;
and a first display module, configured to: under the condition that it is determined, based on the positioning information, that the AR device has reached an explanation area corresponding to any preset knowledge point in the target exhibition area, play, through the AR device, audio explanation data matched with the preset knowledge point, and display first AR special effect data matched with the audio explanation data.
11. An electronic device, comprising: processor, memory and bus, the memory storing machine readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, the machine readable instructions when executed by the processor performing the steps of the augmented reality data presentation method according to any one of claims 1 to 9.
12. A computer-readable storage medium, having stored thereon a computer program for performing, when being executed by a processor, the steps of the augmented reality data presentation method according to any one of claims 1 to 9.
CN202110620483.5A 2021-06-03 2021-06-03 Augmented reality data display method and device, electronic equipment and storage medium Active CN113359986B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110620483.5A CN113359986B (en) 2021-06-03 2021-06-03 Augmented reality data display method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113359986A true CN113359986A (en) 2021-09-07
CN113359986B CN113359986B (en) 2023-06-20

Family

ID=77531802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110620483.5A Active CN113359986B (en) 2021-06-03 2021-06-03 Augmented reality data display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113359986B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113899359A (en) * 2021-09-30 2022-01-07 北京百度网讯科技有限公司 Navigation method, device, equipment and storage medium
WO2022252690A1 (en) * 2021-06-03 2022-12-08 上海商汤智能科技有限公司 Method and apparatus for presenting special effect of bottle body, device, storage medium, computer program, and product
WO2023103961A1 (en) * 2021-12-10 2023-06-15 北京字跳网络技术有限公司 Content display method and apparatus, electronic device, storage medium, and program product

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107484120A (en) * 2017-06-21 2017-12-15 湖南简成信息技术有限公司 Intelligent guide method, tour guide device and equipment based on third party application
WO2018104834A1 (en) * 2016-12-07 2018-06-14 Yogesh Chunilal Rathod Real-time, ephemeral, single mode, group & auto taking visual media, stories, auto status, following feed types, mass actions, suggested activities, ar media & platform
US20180199110A1 (en) * 2017-01-06 2018-07-12 Google Inc. Electronic Programming Guide with Expanding Cells for Video Preview
US20180246698A1 (en) * 2017-02-28 2018-08-30 Magic Leap, Inc. Virtual and real object recording in mixed reality device
US20180349703A1 (en) * 2018-07-27 2018-12-06 Yogesh Rathod Display virtual objects in the event of receiving of augmented reality scanning or photo of real world object from particular location or within geofence and recognition of real world object
CN109040289A (en) * 2018-08-27 2018-12-18 百度在线网络技术(北京)有限公司 Interest point information method for pushing, server, terminal and storage medium
CN110703922A (en) * 2019-10-22 2020-01-17 成都中科大旗软件股份有限公司 Electronic map tour guide method special for tourist attraction
CN111640202A (en) * 2020-06-11 2020-09-08 浙江商汤科技开发有限公司 AR scene special effect generation method and device
CN111640171A (en) * 2020-06-10 2020-09-08 浙江商汤科技开发有限公司 Historical scene explaining method and device, electronic equipment and storage medium
CN111665945A (en) * 2020-06-10 2020-09-15 浙江商汤科技开发有限公司 Tour information display method and device
CN112348968A (en) * 2020-11-06 2021-02-09 北京市商汤科技开发有限公司 Display method and device in augmented reality scene, electronic equipment and storage medium
CN112348969A (en) * 2020-11-06 2021-02-09 北京市商汤科技开发有限公司 Display method and device in augmented reality scene, electronic equipment and storage medium

Non-Patent Citations (12)

* Cited by examiner, † Cited by third party
Title
ADITYA FAJAR INDRAWAN et al.: "Google Maps Adds AR Features As Directions To The Destination Location", HTTPS://VOI.ID/EN/TEKNOLOGI/15903/GOOGLE-MAPS-ADDS-AR-FEATURES-AS-DIRECTIONS-TO-THE-DESTINATION-LOCATION, 6 October 2020 (2020-10-06) *
BOGDAN BELE: "How to Use AR Navigation in Google Maps and Why You Should", HTTPS://WWW.GROOVYPOST.COM/HOWTO/USE-AR-NAVIGATION-IN-GOOGLE-MAPS-AND-WHY-YOU-SHOULD/, 8 December 2019 (2019-12-08) *
张凌睿 et al.: "Research on Building a Shared Cultural Community for APP-Based Mobile Libraries", Library, no. 09, 15 September 2018 (2018-09-15) *
张维 et al.: "New Explorations in Museum Service Modes in the Mobile Internet Era", Bulletin of Science and Technology, no. 07, 31 July 2016 (2016-07-31) *
李月琳 et al.: "Research Progress on Information Services in Mobile Technology Application Fields", Advances in Information Science, no. 00, 30 September 2016 (2016-09-30) *
温日琴: "Research on Service Innovation in Digital Libraries Based on AR Augmented Reality Technology", Journal of Library and Information Sciences in Agriculture, no. 08 *
黄金: "A New Chapter in Public Service: Practice and Reflections on the 'Handheld Museum' of the Nantong Museum", Museum Research, no. 02 *

Also Published As

Publication number Publication date
CN113359986B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
AU2022256192B2 (en) Multi-sync ensemble model for device localization
US8983184B2 (en) Vision image information storage system and method thereof, and recording medium having recorded program for implementing method
US10055894B2 (en) Markerless superimposition of content in augmented reality systems
CN113359986B (en) Augmented reality data display method and device, electronic equipment and storage medium
US10186084B2 (en) Image processing to enhance variety of displayable augmented reality objects
CN110716645A (en) Augmented reality data presentation method and device, electronic equipment and storage medium
US20170153787A1 (en) Injection of 3-d virtual objects of museum artifact in ar space and interaction with the same
Romli et al. Mobile augmented reality (AR) marker-based for indoor library navigation
CN111638796A (en) Virtual object display method and device, computer equipment and storage medium
CN107689082B (en) Data projection method and device
WO2016122973A1 (en) Real time texture mapping
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
JP2022505998A (en) Augmented reality data presentation methods, devices, electronic devices and storage media
US11842514B1 (en) Determining a pose of an object from rgb-d images
KR101867020B1 (en) Method and apparatus for implementing augmented reality for museum
CN111638797A (en) Display control method and device
JP2019192026A (en) Information processing device, information processing method, program, content display system and content distribution device
CN113867531A (en) Interaction method, device, equipment and computer readable storage medium
WO2022252688A1 (en) Augmented reality data presentation method and apparatus, electronic device, and storage medium
CN111693063A (en) Navigation interaction display method and device, electronic equipment and storage medium
CN113282687A (en) Data display method and device, computer equipment and storage medium
CN113345108A (en) Augmented reality data display method and device, electronic equipment and storage medium
Abbas et al. Augmented reality-based real-time accurate artifact management system for museums
CN114967914A (en) Virtual display method, device, equipment and storage medium
KR20140078083A (en) Method of manufacturing cartoon contents for augemented reality and apparatus performing the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant