CN111665942A - AR special effect triggering display method and device, electronic equipment and storage medium - Google Patents

AR special effect triggering display method and device, electronic equipment and storage medium

Info

Publication number
CN111665942A
Authority
CN
China
Prior art keywords
target
scene image
view
theme park
theme
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010514681.9A
Other languages
Chinese (zh)
Inventor
潘思霁
揭志伟
李炳泽
张一�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Zhejiang Sensetime Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202010514681.9A priority Critical patent/CN111665942A/en
Publication of CN111665942A publication Critical patent/CN111665942A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Abstract

The disclosure provides an AR special effect triggering display method and apparatus, an electronic device, and a storage medium. The method includes: acquiring, in real time, a target scene image of a theme park shot by an AR device, the target scene image containing a view separator; and after determining, according to a plurality of continuously acquired target scene images, that the view separator has changed from a closed state to an open state, generating AR display data of the theme park for the AR device. Because the display of the AR special effect is triggered by the view separator changing from the closed state to the open state, and the view separator belongs to the theme park, this way of triggering the AR special effect matches the real scene more closely and improves the integration of the AR special effect into the real scene.

Description

AR special effect triggering display method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of AR (augmented reality), in particular to an AR special effect triggering display method and device, electronic equipment and a storage medium.
Background
Augmented Reality (AR) technology simulates entity information (visual information, sound, touch, etc.) and superimposes it on the real world, so that the real environment and virtual objects are presented in the same picture or space in real time. In recent years, the fields in which AR devices are applied have become wider and wider, and AR devices now play an important role in life, work and entertainment.
When AR technology is combined with theme parks, how to trigger the AR special effect becomes a problem worth considering.
Disclosure of Invention
The embodiment of the disclosure at least provides an AR special effect triggering display method, an AR special effect triggering display device, electronic equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides an AR special effect triggering display method, where the method includes:
acquiring a target scene image of a theme park shot by AR equipment in real time; the target scene image comprises a view field separator;
after the view separator is determined to be changed from a closed state to an open state according to a plurality of continuously acquired target scene images, generating AR display data of the theme park for the AR equipment;
and sending the AR display data to the AR equipment so as to display the AR effect of the theme park on the AR equipment.
In the embodiment of the present disclosure, acquiring a target scene image containing the view separator indicates that the user is located at a viewing position of the theme park. After it is determined from a plurality of continuously acquired target scene images that the view separator has changed from a closed state to an open state, AR display data of the theme park is generated for the AR device. That is, the view separator of the theme park has a closed state and an open state, and the AR display data of the theme park is generated for the AR device only when the view separator changes from the closed state to the open state. After the AR display data of the theme park is generated, it is sent to the AR device so that the AR effect of the theme park is displayed on the AR device; at this point the user can see the AR effect of the theme park.
In one possible implementation, acquiring, in real time, an image of a target scene of a theme park captured by an AR device includes:
acquiring a real scene image shot by the AR equipment in real time;
detecting whether the view field separator is contained in the real scene image;
and after the view field separator is determined to be contained in the real scene image, taking the real scene image as the target scene image.
In one possible embodiment, detecting whether the view field separator is included in the real scene image includes:
performing target detection on the real scene image based on a trained target detection network, and extracting a detection target included in the real scene image;
and calculating the similarity between the extracted image feature vector of the detection target and the image feature vector of the view separator, and if the similarity is greater than a set threshold value, determining that the view separator is contained in the real scene image.
In one possible embodiment, the determining that the view separator is changed from the closed state to the open state according to a plurality of target scene images acquired in succession includes:
and inputting the latest continuous N target scene images including the target scene image into a trained neural network for state detection every time one target scene image is acquired until the view separator is determined to be changed from the closed state to the open state.
In one possible embodiment, generating AR display data of the theme park for the AR device includes:
identifying attribute features of the view separator according to the target scene image;
determining a target theme type of the theme park according to the attribute features of the view separator;
and generating AR display data of the theme park for the AR equipment according to the determined target theme type.
In the embodiment of the disclosure, the method is beneficial to quickly and accurately determining the content in the AR display data.
In one possible implementation, generating, for the AR device, AR display data of the theme park according to the determined target theme type includes:
selecting AR materials matched with the target theme type according to the determined target theme type and pre-stored AR materials corresponding to different theme types;
and generating AR display data of the theme park for the AR equipment according to the selected AR materials and the current target scene image of the AR equipment.
In the embodiment of the disclosure, the method is beneficial to improving the diversity of AR display data of the theme park, so that the display content in the theme park is richer and more diverse.
In one possible embodiment, the view separator comprises a door and/or a window.
In the embodiment of the disclosure, after a door and/or a window is set as the view separator, the display of the AR special effect can be triggered through the door and/or the window, which further matches the real scene and improves the integration of the AR special effect into the real scene.
In a second aspect, an embodiment of the present disclosure provides an AR special effect triggering display apparatus, where the apparatus includes:
the acquiring unit is used for acquiring a target scene image of the theme park shot by the AR equipment in real time; the target scene image comprises a view field separator;
the determining unit is used for generating AR display data of the theme park for the AR equipment after determining that the view separator is changed from a closed state to an open state according to a plurality of continuously acquired target scene images;
and the display unit is used for sending the AR display data to the AR equipment so as to display the AR effect of the theme park on the AR equipment.
In a possible implementation manner, the configuration of the acquiring unit, when used for acquiring an image of a target scene of a theme park captured by an AR device in real time, includes:
acquiring a real scene image shot by the AR equipment in real time;
detecting whether the view field separator is contained in the real scene image;
and after the view field separator is determined to be contained in the real scene image, taking the real scene image as the target scene image.
In a possible embodiment, the configuration of the acquiring unit when used for detecting whether the view field separator is included in the real scene image includes:
performing target detection on the real scene image based on a trained target detection network, and extracting a detection target included in the real scene image;
and calculating the similarity between the extracted image feature vector of the detection target and the image feature vector of the view separator, and if the similarity is greater than a set threshold value, determining that the view separator is contained in the real scene image.
In one possible embodiment, the determining unit, when determining that the view separator is changed from the closed state to the open state based on a plurality of target scene images acquired in succession, includes:
and inputting the latest continuous N target scene images including the target scene image into a trained neural network for state detection every time one target scene image is acquired until the view separator is determined to be changed from the closed state to the open state.
In a possible implementation manner, the configuration of the determining unit, when used for generating the AR display data of the theme park for the AR device, includes:
identifying attribute features of the view separator according to the target scene image;
determining a target theme type of the theme park according to the attribute features of the view separator;
and generating AR display data of the theme park for the AR equipment according to the determined target theme type.
In a possible implementation manner, the configuration of the display unit, when used for generating, for the AR device, AR display data of the theme park according to the determined target theme type, includes:
selecting AR materials matched with the target theme type according to the determined target theme type and pre-stored AR materials corresponding to different theme types;
and generating AR display data of the theme park for the AR equipment according to the selected AR materials and the current target scene image of the AR equipment.
In one possible embodiment, the view separator comprises a door and/or a window.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: a processor, a memory and a bus, wherein the memory stores machine-readable instructions executable by the processor, the processor and the memory communicate with each other through the bus when the electronic device is running, and the machine-readable instructions, when executed by the processor, perform the steps of the AR special effect trigger showing method according to the first aspect.
In a fourth aspect, the disclosed embodiments provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the AR special effect trigger presentation method according to the first aspect.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for use in the embodiments will be briefly described below, and the drawings herein incorporated in and forming a part of the specification illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It is appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope, for those skilled in the art will be able to derive additional related drawings therefrom without the benefit of the inventive faculty.
Fig. 1 is a schematic flowchart of an AR special effect triggering display method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of another AR special effect triggering display method according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another AR special effect triggering display method according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of another AR special effect triggering display method according to an embodiment of the present disclosure;
fig. 5 is a schematic flowchart of another AR special effect triggering display method according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an AR special effect triggering display device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The term "and/or" herein merely describes an associative relationship, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Research shows that AR technology can bring users a vivid experience, and combining AR technology with a theme park can bring users a vivid display effect. In addition, in order to avoid mutual interference among theme parks and to let users distinguish different theme parks, different view separators need to be arranged for different theme parks, so the view separator can serve as the trigger for the AR special effect. Since different theme parks are provided with different view separators, triggering the AR special effect through the view separator can only trigger the AR special effect of one theme park, which makes the triggering result more accurate; moreover, the corresponding AR display data can be determined according to the theme park whose AR special effect is triggered, so that different theme parks can have different AR display data and the triggered AR special effects are rich and diverse. At the same time, triggering the AR special effect through the view separator matches the real scene more closely, improves the integration of the AR special effect into the real scene, and reduces the construction cost of the theme park.
Based on this research, the present disclosure provides an AR special effect triggering display method and apparatus, an electronic device, and a storage medium. When a target scene image containing the view separator is acquired, it indicates that the user is located at a viewing position of the theme park. After it is determined from a plurality of continuously acquired target scene images that the view separator has changed from a closed state to an open state, AR display data of the theme park is generated for the AR device. That is, the view separator of the theme park has a closed state and an open state, and the AR display data of the theme park is generated for the AR device only when the view separator changes from the closed state to the open state. After the AR display data is generated, it is sent to the AR device so that the AR effect of the theme park is displayed on the AR device; at this point the user can see the AR effect of the theme park.
To facilitate understanding of the embodiments, an AR special effect triggering display method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the AR special effect triggering display method provided in the embodiments of the present disclosure may be a computer device with certain computing capability, specifically a terminal device, a server, or another processing device, for example a server connected to an AR device. The AR device may include devices with display functions and data processing capabilities, such as AR glasses, a tablet computer, a smart phone, and a smart wearable device.
Fig. 1 is a schematic flowchart of an AR special effect triggering display method according to an embodiment of the present disclosure, and as shown in fig. 1, the AR special effect triggering display method includes the following steps:
step 101, acquiring a target scene image of a theme park shot by AR equipment in real time; the target scene image includes a view field separator therein.
And step 102, after the view separator is determined to be changed from the closed state to the open state according to the plurality of continuously acquired target scene images, generating AR display data of the theme park for the AR equipment.
And 103, sending the AR display data to the AR equipment so as to display the AR effect of the theme park on the AR equipment.
Specifically, in order to distinguish different theme parks, different view separators are arranged for different theme parks. The view separator has a closed state and an open state: AR display data of the theme park is not generated for the AR device while the view separator is in the closed state, and is generated for the AR device only when the view separator is in the open state, so that the AR special effect of the theme park is displayed. In order to display the complete AR special effect for the user, the view separator is kept in the closed state under normal conditions. The user can shoot scene images of the theme park through the AR device and send the shot scene images to the server; when a target scene image containing the view separator is detected, indicating that the user is at the viewing position of the theme park, the view separator can be controlled to open. After it is determined from a plurality of continuously acquired target scene images that the view separator has changed from the closed state to the open state, the AR special effect of the theme park is triggered, namely: AR display data of the theme park is generated for the AR device and sent to the AR device, so that the AR effect of the theme park is displayed on the AR device and the user can view it. Because the view separator belongs to the theme park, triggering the AR special effect through the view separator matches the real scene more closely and improves the integration of the AR special effect into the real scene. In addition, because different theme parks are provided with different view separators, triggering the AR special effect through the view separator can only trigger the AR special effect of one theme park, making the triggering result more accurate; moreover, the corresponding AR display data can be determined according to the theme park whose AR special effect is triggered, so that different theme parks can have different AR display data and the triggered AR special effects are rich and diverse.
It should be noted that the style of the view separator of the theme park may be set according to actual needs, and the opening manner of the view separator in the process of changing from the closed state to the open state may also be set according to actual needs. For example, the view separator may be a fence of the theme park, the process of changing from the closed state to the open state may be lifting the fence, and the AR effect is shown to the user after the fence is lifted to a designated position. Neither the style of the view separator nor the state-changing process is specifically limited herein.
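The trigger flow of steps 101 to 103 can be summarized in the following minimal sketch. It assumes hypothetical helpers (contains_view_separator, detect_separator_state, build_ar_show_data, an ar_device object with a send() method) standing in for the detection network, the state-detection network and the AR-material pipeline described in later embodiments, and an assumed window length N_FRAMES; it illustrates the flow rather than defining the implementation of the disclosure.

# Illustrative sketch of the server-side trigger flow (steps 101-103).
# contains_view_separator, detect_separator_state and build_ar_show_data are
# hypothetical helpers; the window length N_FRAMES is an assumed value.
from collections import deque

N_FRAMES = 8
CLOSED, OPEN = "closed", "open"

def run_trigger_loop(ar_device, frame_stream, contains_view_separator,
                     detect_separator_state, build_ar_show_data):
    """Consume real-scene frames uploaded by the AR device and trigger the AR
    effect once the view separator changes from the closed to the open state."""
    window = deque(maxlen=N_FRAMES)   # latest N continuously acquired target scene images
    previous_state = None
    for frame in frame_stream:        # step 101: real-time frames from the AR device
        if not contains_view_separator(frame):
            continue                  # only frames containing the separator are target scene images
        window.append(frame)
        state = detect_separator_state(list(window))
        if previous_state == CLOSED and state == OPEN:
            show_data = build_ar_show_data(frame)   # step 102: AR display data of the park
            ar_device.send(show_data)               # step 103: device renders the AR effect
            break
        previous_state = state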
In a possible implementation, fig. 2 is a schematic flow chart of another AR special effect triggering display method provided in an embodiment of the present disclosure, and as shown in fig. 2, when step 101 is executed, the following steps may be implemented:
step 201, real scene images shot by the AR equipment are obtained in real time.
Step 202, detecting whether the real scene image contains the view field separator.
Step 203, after it is determined that the view field separator is included in the real scene image, the real scene image is taken as the target scene image.
Specifically, in order to ensure that the AR special effect can be triggered accurately, the real scene images shot by the AR device need to be acquired in real time, and each frame of the real scene image is analyzed to determine whether it contains the view separator. If it is determined that a real scene image contains the view separator, this indicates that the user is currently located at the trigger position of the AR special effect, namely the display position for watching the AR special effect. The real scene image is therefore taken as a target scene image for further detection, and after it is determined from a plurality of continuously acquired target scene images that the view separator has changed from the closed state to the open state, the AR effect of the theme park is triggered for the user to watch.
In a possible implementation, fig. 3 is a schematic flowchart of another AR special effect triggering display method provided in an embodiment of the present disclosure, and as shown in fig. 3, when step 202 is executed, the following steps may be implemented:
step 301, performing target detection on the real scene image based on a trained target detection network, and extracting a detection target included in the real scene image.
Step 302, calculating the similarity between the extracted image feature vector of the detection target and the image feature vector of the view separator, and if the similarity is greater than a set threshold, determining that the view separator is included in the image of the real scene.
Specifically, the target detection network may be trained in advance so that it can detect real objects (i.e., detection targets) contained in the real scene image, and the image feature vector of each detection target is then extracted. Since the view separator has a specific image feature vector, the image feature vector of the detection target can be compared with the image feature vector of the view separator. If the similarity is higher than the set threshold, the detection target is very likely to be the view separator, and it is determined that the real scene image contains the view separator; if the similarity is lower than the set threshold, the detection target is not the view separator, and it is determined that the real scene image does not contain the view separator. The target scene image can be determined quickly in this way, so that the AR effect can be shown to the user in time.
Of course, whether the real scene image contains the view separator may also be detected in other ways, for example by using another image detection model; the specific manner of determining whether the real scene image contains the view separator is not specifically limited herein.
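As a concrete illustration of the detection-plus-similarity step, the sketch below runs an off-the-shelf torchvision detector over the real scene image, embeds each detected region with a caller-supplied feature extractor, and compares the result against a pre-computed feature vector of the view separator. The detector choice, the embedding function and the 0.8 threshold are assumptions for illustration; the disclosure only requires a trained target detection network and a similarity comparison against a set threshold.

# Illustrative detection of the view separator in a real scene image.
# The Faster R-CNN detector, embed_fn and the 0.8 threshold are assumptions.
import torch
import torch.nn.functional as F
import torchvision
from torchvision.transforms.functional import to_tensor

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def contains_view_separator(image, separator_feature, embed_fn, threshold=0.8):
    """Return True if any detected object in the PIL image is similar enough
    to the pre-computed image feature vector of the view separator."""
    with torch.no_grad():
        detections = detector([to_tensor(image)])[0]
    for box, score in zip(detections["boxes"], detections["scores"]):
        if score < 0.5:                       # discard weak detections
            continue
        x1, y1, x2, y2 = [int(v) for v in box]
        crop = image.crop((x1, y1, x2, y2))   # image region of the detection target
        feature = embed_fn(crop)              # image feature vector of the target
        similarity = F.cosine_similarity(feature, separator_feature, dim=0)
        if similarity > threshold:
            return True
    return False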
In a possible embodiment, when determining that the view separator has changed from the closed state to the open state according to a plurality of continuously acquired target scene images, each time a target scene image is acquired, the latest N consecutive target scene images including that image may be input into the trained neural network for state detection, until it is determined that the view separator has changed from the closed state to the open state.
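A minimal sketch of this sliding-window check follows. The window length N = 8, the clip layout expected by the network, and the convention that class index 1 means the closed-to-open transition are all illustrative assumptions; the disclosure only specifies feeding the latest N consecutive target scene images into a trained neural network for state detection.

# Illustrative sliding-window state check; state_net is an assumed trained
# network that classifies a clip of N frames, with class 1 taken here to mean
# "the separator changed from closed to open".
import torch
from collections import deque

N = 8
window = deque(maxlen=N)

def separator_opened(state_net, new_frame_tensor):
    """Append the newly acquired target scene image and run state detection
    on the latest N consecutive frames; return True once the change is seen."""
    window.append(new_frame_tensor)            # (C, H, W) tensor of one frame
    if len(window) < N:
        return False                           # not enough consecutive frames yet
    clip = torch.stack(list(window))           # (N, C, H, W)
    with torch.no_grad():
        logits = state_net(clip.unsqueeze(0))  # (1, num_states)
    return logits.argmax(dim=-1).item() == 1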
In a possible implementation, fig. 4 is a schematic flow diagram of another AR special effect triggering display method provided in an embodiment of the present disclosure, and as shown in fig. 4, when the step of generating AR display data of the theme park for the AR device is executed, the following steps may be implemented:
step 401, identifying the attribute features of the view field separator according to the target scene image.
Step 402, determining the target theme type of the theme park according to the attribute characteristics of the view field separator.
Step 403, generating, for the AR device, AR presentation data of the theme park according to the determined target theme type.
Specifically, the view separators corresponding to different theme parks have different attribute features. Taking a marine life theme park as an example, the attribute features of its view separator may be marine creatures such as dolphins, sharks and/or octopuses, or decorations such as sea spray; taking a circus theme park as an example, the attribute features of its view separator may be circus-related decorations such as fire rings, unicycles and/or balloons. After the attribute features of the view separator are identified, the target theme type of the theme park can be determined, and based on the determined target theme type, AR display data of the theme park can be matched for the AR device for the user to watch. Taking the circus theme park as an example, a unicycle performance, an animal circus act or a clown performance may be used as the AR display data, or the clown, fire ring, unicycle and balloon may be made into a virtual animation welcoming the user's visit and that virtual animation used as the AR display data, with a virtual atmosphere such as virtual fireworks added to it. Taking the marine life theme park as an example, a virtual animation of various marine creatures such as dolphins, sharks and/or octopuses playing in the water, or an explanation of the living habits of dolphins, sharks and/or octopuses, may be used as the AR display data; alternatively, the dolphins, sharks and octopuses may be made into a virtual animation welcoming the user's visit and that virtual animation used as the AR display data, again with a virtual atmosphere such as virtual fireworks added.
In this way, the target theme type of the theme park can be determined according to the attribute features of the view separator of the theme park, and AR display data corresponding to the target theme type can be selected and shown to the user. This helps to make the display content of the theme park richer and more diverse, matches the real scene more closely, and improves the integration of the virtual content into the real scene.
It should be noted that the attribute features of the view separator may be set according to the corresponding theme park, and the AR display data corresponding to the attribute features of the view separator may also be set according to actual needs. Taking the marine life theme park as an example, the AR display data may include a virtual animation of a shark chasing a dolphin with its mouth open, or a virtual animation of an octopus wrapping objects with its tentacles; the specific form is not limited herein.
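For illustration only, a simple way to map the recognized attribute features to a target theme type is to score each theme by how many of its characteristic attributes appear on the view separator, as sketched below; the attribute names and the theme table merely echo the marine-life and circus examples above and are not values fixed by the disclosure.

# Illustrative mapping from recognized attribute features of the view
# separator to a target theme type; attribute and theme names are examples.
THEME_ATTRIBUTES = {
    "marine_life": {"dolphin", "shark", "octopus", "wave"},
    "circus": {"clown", "fire_ring", "unicycle", "balloon"},
}

def determine_theme_type(attribute_features):
    """Pick the theme whose attribute set overlaps most with the features
    recognized on the view separator."""
    best_theme, best_overlap = None, 0
    for theme, attrs in THEME_ATTRIBUTES.items():
        overlap = len(attrs & set(attribute_features))
        if overlap > best_overlap:
            best_theme, best_overlap = theme, overlap
    return best_theme

# e.g. determine_theme_type({"dolphin", "wave"}) -> "marine_life"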
In a possible implementation, fig. 5 is a schematic flowchart of another AR special effect triggering display method provided in an embodiment of the present disclosure, and as shown in fig. 5, when step 403 is executed, the following steps may be implemented:
step 501, selecting AR materials matched with the target theme type according to the determined target theme type and pre-stored AR materials corresponding to different theme types.
Step 502, generating AR display data of the theme park for the AR equipment according to the selected AR materials and the current target scene image of the AR equipment.
Specifically, AR materials corresponding to different theme types are established in advance, where the AR materials may be virtual three-dimensional scene animation materials, and the AR materials may also be divided into multiple categories. For example, the categories of AR materials may include category 1, category 2 and category 3, and each theme type has corresponding AR materials in each category: the AR material corresponding to theme type 1 in category 1 may be AR material 1, the AR material corresponding to theme type 1 in category 2 may be AR material 2, and the AR material corresponding to theme type 1 in category 3 may be AR material 3; AR material 1, AR material 2 and AR material 3 together constitute the AR materials owned by that theme type. After the target theme type is determined, the AR materials corresponding to the target theme type can be determined according to the pre-stored correspondence between theme types and AR materials. The obtained AR materials can then be blended into the real scene shot by the AR device using AR technology, so that the user sees an AR-processed image and can interact with the content in it, which helps to increase the fun of the user's visit. Meanwhile, the target scene image is used when generating the AR display data: the shooting angle (visual positioning) of the AR device can be determined from the target scene image, so that the three-dimensional animation corresponding to the AR material is presented in the AR device at the corresponding angle, and the AR display data presents the display picture from that angle.
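A minimal sketch of this material-selection step follows. The material table, the category names and estimate_camera_pose() are hypothetical placeholders; the disclosure only requires pre-stored AR materials per theme type and the use of the current target scene image to determine the shooting angle of the AR device.

# Illustrative selection of pre-stored AR materials by theme type and category,
# combined with the current target scene image for visual positioning.
# estimate_camera_pose is a hypothetical helper; the file names are placeholders.
AR_MATERIALS = {
    "marine_life": {"welcome": "dolphin_greeting.glb", "show": "shark_swim.glb"},
    "circus":      {"welcome": "clown_greeting.glb",   "show": "unicycle_act.glb"},
}

def build_ar_show_data(target_theme_type, current_scene_image, estimate_camera_pose,
                       category="welcome"):
    """Pick the AR material matching the target theme type and pack it with the
    camera pose estimated from the current target scene image of the AR device."""
    material = AR_MATERIALS[target_theme_type][category]
    pose = estimate_camera_pose(current_scene_image)   # shooting angle (visual positioning)
    return {"material": material, "camera_pose": pose}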
In one possible embodiment, the view separator includes a door and/or a window.
Specifically, for a physical building, doors and windows are usually indispensable, and for a theme park, the doors and windows generally carry element content of the theme type corresponding to the theme park. Taking a marine life theme park as an example, the door of the theme park is usually built in the shape of a shark's mouth, or a dolphin is arranged on the door, and element content such as bubbles or a mermaid is arranged on the window. Meanwhile, doors and windows have a closed state and an open state, so using the doors and windows of the theme park as view separators can both distinguish the theme type of the theme park and use entities that already exist in the theme park to trigger the display of the AR special effect. This matches the real scene more closely and improves the integration of the AR special effect into the real scene.
It should be noted that the view separator may also be another physical object, such as a wall; the specific type of the view separator is not specifically limited herein.
Fig. 6 is a schematic structural diagram of an AR special effect triggering display device according to an embodiment of the present disclosure, and as shown in fig. 6, the device includes:
the acquiring unit 61 is configured to acquire a target scene image of the theme park shot by the AR device in real time; the target scene image comprises a view field separator;
the determining unit 62 is configured to generate, for the AR device, AR display data of the theme park after determining that the view separator is changed from the closed state to the open state according to the plurality of continuously acquired target scene images;
and the display unit 63 is configured to send the AR display data to the AR device, so that the AR device displays the AR effect of the theme park.
In a possible implementation manner, the configuration of the acquiring unit 61, when used for acquiring, in real time, a target scene image of the theme park shot by the AR device, includes:
acquiring a real scene image shot by the AR equipment in real time;
detecting whether the view field separator is contained in the real scene image;
and after the view field separator is determined to be contained in the real scene image, taking the real scene image as the target scene image.
In a possible embodiment, the configuration of the acquiring unit 61, when used for detecting whether the view field separator is included in the real scene image, includes:
performing target detection on the real scene image based on a trained target detection network, and extracting a detection target included in the real scene image;
and calculating the similarity between the extracted image feature vector of the detection target and the image feature vector of the view separator, and if the similarity is greater than a set threshold value, determining that the view separator is contained in the real scene image.
In a possible embodiment, the determining unit 62, when determining that the view separator is changed from the closed state to the open state based on a plurality of target scene images acquired in succession, includes:
and inputting the latest continuous N target scene images including the target scene image into a trained neural network for state detection every time one target scene image is acquired until the view separator is determined to be changed from the closed state to the open state.
In a possible implementation, the configuration of the determining unit 62, when used for generating the AR display data of the theme park for the AR device, includes:
identifying attribute features of the view separator according to the target scene image;
determining a target theme type of the theme park according to the attribute features of the view separator;
and generating AR display data of the theme park for the AR equipment according to the determined target theme type.
In a possible implementation manner, the configuration of the display unit 63, when used for generating, for the AR device, AR display data of the theme park according to the determined target theme type, includes:
selecting AR materials matched with the target theme type according to the determined target theme type and pre-stored AR materials corresponding to different theme types;
and generating AR display data of the theme park for the AR equipment according to the selected AR materials and the current target scene image of the AR equipment.
In one possible embodiment, the view separator comprises a door and/or a window.
Corresponding to the AR special effect triggering display method in fig. 1, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure, and as shown in fig. 7, the electronic device includes:
a processor 71, a memory 72, and a bus 73. The memory 72 is used for storing execution instructions and includes an internal memory 721 and an external memory 722; the internal memory 721 temporarily stores operation data in the processor 71 and data exchanged with the external memory 722 such as a hard disk, and the processor 71 exchanges data with the external memory 722 through the internal memory 721. When the electronic device runs, the processor 71 communicates with the memory 72 through the bus 73, so that the processor 71 executes the following instructions: acquiring a target scene image of a theme park shot by an AR device in real time, the target scene image containing a view separator; after determining, according to a plurality of continuously acquired target scene images, that the view separator has changed from a closed state to an open state, generating AR display data of the theme park for the AR device; and sending the AR display data to the AR device so as to display the AR effect of the theme park on the AR device.
The embodiment of the present disclosure further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the AR special effect triggering display method described in the above method embodiment are executed. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the AR special effect trigger display method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the steps of the AR special effect trigger display method described in the above method embodiments, which may be referred to in the above method embodiments specifically, and are not described herein again.
The embodiments of the present disclosure also provide a computer program which, when executed by a processor, implements any one of the methods of the foregoing embodiments. The corresponding computer program product may be embodied in hardware, software, or a combination thereof. In an optional embodiment, the computer program product is embodied in a computer storage medium; in another optional embodiment, the computer program product is embodied in a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used for illustrating the technical solutions of the present disclosure and not for limiting the same, and the scope of the present disclosure is not limited thereto, and although the present disclosure is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive of the technical solutions described in the foregoing embodiments or equivalent technical features thereof within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. An AR special effect triggering display method is characterized by comprising the following steps:
acquiring a target scene image of a theme park shot by AR equipment in real time; the target scene image comprises a view field separator;
after the view separator is determined to be changed from a closed state to an open state according to a plurality of continuously acquired target scene images, generating AR display data of the theme park for the AR equipment;
and sending the AR display data to the AR equipment so as to display the AR effect of the theme park on the AR equipment.
2. The method of claim 1, wherein obtaining the target scene image of the theme park taken by the AR device in real time comprises:
acquiring a real scene image shot by the AR equipment in real time;
detecting whether the view field separator is contained in the real scene image;
and after the view field separator is determined to be contained in the real scene image, taking the real scene image as the target scene image.
3. The method of claim 2, wherein detecting whether the view separator is included in the image of the real scene comprises:
performing target detection on the real scene image based on a trained target detection network, and extracting a detection target included in the real scene image;
and calculating the similarity between the extracted image feature vector of the detection target and the image feature vector of the view separator, and if the similarity is greater than a set threshold value, determining that the view separator is contained in the real scene image.
4. The method of claim 1, wherein determining that the view separator changes from the closed state to the open state based on a plurality of images of the target scene acquired in succession comprises:
and inputting the latest continuous N target scene images including the target scene image into a trained neural network for state detection every time one target scene image is acquired until the view separator is determined to be changed from the closed state to the open state.
5. The method of claim 1, wherein generating AR display data of the theme park for the AR device comprises:
identifying attribute features of the view separator according to the target scene image;
determining a target theme type of the theme park according to the attribute features of the view separator;
and generating AR display data of the theme park for the AR equipment according to the determined target theme type.
6. The method of claim 5, wherein generating, for the AR device, AR display data of the theme park according to the determined target theme type comprises:
selecting AR materials matched with the target theme type according to the determined target theme type and pre-stored AR materials corresponding to different theme types;
and generating AR display data of the theme park for the AR equipment according to the selected AR materials and the current target scene image of the AR equipment.
7. The method according to any one of claims 1 to 6, wherein the view separator comprises a door and/or a window.
8. An AR special effect triggering display apparatus, the apparatus comprising:
the acquiring unit is used for acquiring a target scene image of the theme park shot by the AR equipment in real time; the target scene image comprises a view field separator;
the determining unit is used for generating AR display data of the theme park for the AR equipment after determining that the view separator is changed from a closed state to an open state according to a plurality of continuously acquired target scene images;
and the display unit is used for sending the AR display data to the AR equipment so as to display the AR effect of the theme park on the AR equipment.
9. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the AR special effects trigger presentation method according to any one of claims 1 to 7.
10. A computer-readable storage medium, having stored thereon a computer program which, when being executed by a processor, performs the steps of the AR special effects triggering presentation method according to any one of claims 1 to 7.
CN202010514681.9A 2020-06-08 2020-06-08 AR special effect triggering display method and device, electronic equipment and storage medium Pending CN111665942A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010514681.9A CN111665942A (en) 2020-06-08 2020-06-08 AR special effect triggering display method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010514681.9A CN111665942A (en) 2020-06-08 2020-06-08 AR special effect triggering display method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111665942A true CN111665942A (en) 2020-09-15

Family

ID=72386973

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010514681.9A Pending CN111665942A (en) 2020-06-08 2020-06-08 AR special effect triggering display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111665942A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112637665A (en) * 2020-12-23 2021-04-09 北京市商汤科技开发有限公司 Display method and device in augmented reality scene, electronic equipment and storage medium
WO2022252509A1 (en) * 2021-06-03 2022-12-08 北京市商汤科技开发有限公司 Data display method and apparatus, device, storage medium, and computer program product

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110293140A1 (en) * 2010-05-28 2011-12-01 Qualcomm Incorporated Dataset Creation For Tracking Targets With Dynamically Changing Portions
US20170221272A1 (en) * 2016-01-29 2017-08-03 Jia Li Local augmented reality persistent sticker objects
US20180165854A1 (en) * 2016-08-11 2018-06-14 Integem Inc. Intelligent interactive and augmented reality based user interface platform
US20190392218A1 (en) * 2017-10-06 2019-12-26 Steve Rad Virtual reality system and kit

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110293140A1 (en) * 2010-05-28 2011-12-01 Qualcomm Incorporated Dataset Creation For Tracking Targets With Dynamically Changing Portions
US20170221272A1 (en) * 2016-01-29 2017-08-03 Jia Li Local augmented reality persistent sticker objects
US20180165854A1 (en) * 2016-08-11 2018-06-14 Integem Inc. Intelligent interactive and augmented reality based user interface platform
US20190392218A1 (en) * 2017-10-06 2019-12-26 Steve Rad Virtual reality system and kit

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112637665A (en) * 2020-12-23 2021-04-09 北京市商汤科技开发有限公司 Display method and device in augmented reality scene, electronic equipment and storage medium
CN112637665B (en) * 2020-12-23 2022-11-04 北京市商汤科技开发有限公司 Display method and device in augmented reality scene, electronic equipment and storage medium
WO2022252509A1 (en) * 2021-06-03 2022-12-08 北京市商汤科技开发有限公司 Data display method and apparatus, device, storage medium, and computer program product

Similar Documents

Publication Publication Date Title
CN112348969B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
CN111638793B (en) Display method and device of aircraft, electronic equipment and storage medium
CN111640202B (en) AR scene special effect generation method and device
CN111694430A (en) AR scene picture presentation method and device, electronic equipment and storage medium
CN111651047B (en) Virtual object display method and device, electronic equipment and storage medium
CN111640171B (en) Historical scene explanation method and device, electronic equipment and storage medium
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
WO2016122973A1 (en) Real time texture mapping
CN111638797A (en) Display control method and device
CN109254650B (en) Man-machine interaction method and device
JP2022505998A (en) Augmented reality data presentation methods, devices, electronic devices and storage media
CN111643900A (en) Display picture control method and device, electronic equipment and storage medium
CN108037830B (en) Method for realizing augmented reality
CN111665942A (en) AR special effect triggering display method and device, electronic equipment and storage medium
CN108176049A (en) A kind of information cuing method, device, terminal and computer readable storage medium
CN111639613B (en) Augmented reality AR special effect generation method and device and electronic equipment
CN111651057A (en) Data display method and device, electronic equipment and storage medium
CN111640200A (en) AR scene special effect generation method and device
CN113949914A (en) Live broadcast interaction method and device, electronic equipment and computer readable storage medium
CN112905014A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN111638798A (en) AR group photo method, AR group photo device, computer equipment and storage medium
CN111652971A (en) Display control method and device
CN111651058A (en) Historical scene control display method and device, electronic equipment and storage medium
CN111639615B (en) Trigger control method and device for virtual building
CN109118591A (en) A kind of identification of historical relic cloud and interactive system and method based on augmented reality

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination