CN111679742A - Interaction control method and device based on AR, electronic equipment and storage medium - Google Patents

Interaction control method and device based on AR, electronic equipment and storage medium

Info

Publication number
CN111679742A
Authority
CN
China
Prior art keywords
virtual
real scene
musical instrument
instrument model
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010528088.XA
Other languages
Chinese (zh)
Inventor
潘思霁
揭志伟
张一�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Zhejiang Sensetime Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202010528088.XA priority Critical patent/CN111679742A/en
Publication of CN111679742A publication Critical patent/CN111679742A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/14 Travel agencies
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Abstract

The present disclosure provides an AR-based interaction control method, apparatus, electronic device, and storage medium, wherein the method comprises: acquiring a real scene image captured by an AR device; in the case that a physical musical instrument is detected in the real scene image, presenting a virtual instrument model corresponding to the physical instrument in the AR device; receiving a trigger operation on the virtual instrument model presented in the AR device; and controlling, based on the trigger operation, the AR device to play a sound effect corresponding to the trigger operation.

Description

Interaction control method and device based on AR, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer vision technologies, and in particular, to an interaction control method and apparatus based on AR, an electronic device, and a storage medium.
Background
In recent years, with the rapid development of the cultural tourism industry, more and more users visit exhibitions, museums, and the like. Some exhibits, especially historical relics, are usually displayed in a dedicated exhibition area that visitors may only view, for the purpose of protecting the relics. However, this viewing mode lacks interactivity, so some users pay insufficient attention to the exhibit, and the exhibit falls short of its intended display effect.
Disclosure of Invention
The embodiment of the disclosure at least provides an interaction control method and device based on AR, electronic equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides an augmented reality (AR)-based interaction control method, including:
acquiring a real scene image captured by an AR device;
in the case that a physical musical instrument is detected in the real scene image, presenting a virtual instrument model corresponding to the physical instrument in the AR device;
receiving a trigger operation on the virtual instrument model presented in the AR device;
and controlling, based on the trigger operation, the AR device to play a sound effect corresponding to the trigger operation.
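As a purely illustrative sketch (not the patent's implementation), the four claimed steps can be traced end to end in Python; every name here, the string-based "detection", the grid-cell trigger positions, and the sound identifiers are hypothetical stand-ins for the real image-recognition and rendering pipeline:

```python
from dataclasses import dataclass

@dataclass
class VirtualInstrumentModel:
    name: str
    # maps a trigger position (here a grid cell) to a sound-effect identifier
    component_sounds: dict

def detect_instrument(frame: str) -> bool:
    # Placeholder for the image-recognition step described in the patent;
    # a real system would run a detector on camera pixels, not a string.
    return "chime" in frame

def handle_frame(frame: str, trigger_pos=None):
    """Steps S101-S104: acquire frame, detect, present model, react to trigger."""
    if not detect_instrument(frame):                 # S102: nothing detected
        return None
    model = VirtualInstrumentModel(                  # S102: present the model
        name="virtual_chime",
        component_sounds={(0, 0): "tone_low", (0, 1): "tone_high"},
    )
    if trigger_pos is None:                          # S103: no trigger yet
        return (model.name, None)
    sound = model.component_sounds.get(trigger_pos)  # S104: pick the sound
    return (model.name, sound)
```

The return value pairs the presented model with the sound effect to play, mirroring how the method presents first and plays only after a trigger operation arrives.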
In the embodiment of the disclosure, after the physical musical instrument is recognized in the real scene image, the virtual instrument model corresponding to it can be presented in the AR device, and a sound effect corresponding to a trigger operation on the virtual instrument model can be played in response to that operation, thereby realizing simulated playing of the virtual instrument. When applied to the cultural tourism industry, for exhibits that are musical-instrument relics, the virtual instrument model can be displayed in the AR device and simulated playing can be realized through trigger operations. This helps users understand the playing effect of the physical instrument more intuitively and clearly while the relic itself remains protected, improves the user experience of visiting the exhibit, and makes the visit more interactive and engaging.
In some embodiments, the virtual instrument model includes multiple types of virtual components, and different types of virtual components present different sound effects when triggered.
In some embodiments, the controlling, based on the trigger operation, the AR device to play a sound effect corresponding to the trigger operation includes:
detecting a trigger position of the trigger operation on the displayed virtual instrument model;
determining a triggered virtual component on the virtual instrument model based on the trigger position;
and controlling the AR device to play the sound effect corresponding to the triggered virtual component.
In this embodiment, for a physical instrument with multiple components, such as a chime bell set, a virtual instrument model with corresponding virtual components can be reconstructed. After a trigger operation is received, the triggered virtual component is determined by identifying the trigger position, and the sound effect corresponding to that component is played. Trigger operations on different virtual components can thus be responded to individually, and the sound effects of different components can combine into music, improving the user experience of visiting the exhibit and making the visit more interactive and engaging.
In some embodiments, the presenting, in the AR device, of a virtual instrument model corresponding to the physical instrument comprises:
acquiring the virtual instrument model corresponding to the physical instrument based on feature information of the physical instrument presented in the real scene image;
and displaying, in the AR device, an AR effect combining the real scene image with the virtual instrument model.
In this embodiment, based on the feature information of the physical instrument, the virtual instrument model may be reconstructed or retrieved from a preset virtual object model library, and an AR effect combining the real scene image with the model may then be presented, improving the visual effect.
In some embodiments, the method further comprises:
acquiring shooting pose data of the AR device when it captures the real scene image;
determining presentation pose data of the virtual instrument model in a preset three-dimensional scene model based on the shooting pose data;
and determining, based on the presentation pose data, a target pose of the virtual instrument model presented in the AR device;
wherein the displaying, in the AR device, of the AR effect combining the real scene image with the virtual instrument model includes:
displaying, in the AR device, an AR effect combining the real scene image with the virtual instrument model conforming to the target pose.
In this embodiment, the presentation pose data of the virtual instrument model in the preset three-dimensional scene model can be located from the shooting pose data recorded when the physical instrument was photographed, and the presentation pose of the model in the real scene can then be determined from that data, so that the presented AR effect of the virtual instrument looks more realistic.
In a second aspect, an embodiment of the present disclosure provides an AR-based interaction control apparatus, including:
an acquisition module, configured to acquire a real scene image captured by an AR device;
a presentation module, configured to present, in the AR device, a virtual instrument model corresponding to a physical musical instrument in the case that the physical instrument is detected in the real scene image;
a receiving module, configured to receive a trigger operation on the virtual instrument model presented in the AR device;
and a control module, configured to control, based on the trigger operation, the AR device to play a sound effect corresponding to the trigger operation.
In some embodiments, the virtual instrument model includes multiple types of virtual components, and different types of virtual components present different sound effects when triggered.
In some embodiments, when controlling, based on the trigger operation, the AR device to play the sound effect corresponding to the trigger operation, the control module is specifically configured to:
detect a trigger position of the trigger operation on the displayed virtual instrument model;
determine a triggered virtual component on the virtual instrument model based on the trigger position;
and control the AR device to play the sound effect corresponding to the triggered virtual component.
In some embodiments, when presenting the virtual instrument model corresponding to the physical instrument in the AR device, the presentation module is specifically configured to:
acquire the virtual instrument model corresponding to the physical instrument based on feature information of the physical instrument presented in the real scene image;
and display, in the AR device, an AR effect combining the real scene image with the virtual instrument model.
In some embodiments, the acquisition module is further configured to:
acquire shooting pose data of the AR device when it captures the real scene image;
determine presentation pose data of the virtual instrument model in a preset three-dimensional scene model based on the shooting pose data;
and determine, based on the presentation pose data, a target pose of the virtual instrument model presented in the AR device;
wherein, when displaying the AR effect combining the real scene image with the virtual instrument model in the AR device, the presentation module is specifically configured to:
display, in the AR device, an AR effect combining the real scene image with the virtual instrument model conforming to the target pose.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including a processor and a memory, wherein the memory stores machine-readable instructions executable by the processor, and the processor is configured to execute the machine-readable instructions stored in the memory; when executed by the processor, the instructions perform the steps of the first aspect or any one of its possible implementations.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed, performs the steps of the first aspect or any one of its possible implementations.
According to the method, apparatus, electronic device, and storage medium provided by the embodiments of the disclosure, after the physical musical instrument is recognized in the real scene image, the virtual instrument model corresponding to it can be presented in the AR device, and a sound effect corresponding to a trigger operation on the virtual instrument model can be played in response to that operation, thereby realizing simulated playing of the virtual instrument. When applied to the cultural tourism industry, for exhibits that are musical-instrument relics, the virtual instrument model can be displayed in the AR device and simulated playing can be realized through trigger operations. This helps users understand the playing effect of the physical instrument more intuitively and clearly while the relic itself remains protected, improves the user experience of visiting the exhibit, and makes the visit more interactive and engaging.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required by the embodiments are briefly described below. The drawings, incorporated in and forming a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its technical solutions. The following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art may derive additional related drawings from them without inventive effort.
Fig. 1 shows a flowchart of an AR-based interaction control method provided by an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating an example of an AR-based interaction control method provided by an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating an AR-based interactive control apparatus provided in an embodiment of the present disclosure;
fig. 4 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions are described below completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present disclosure, not all of them. The components of the embodiments, as generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of configurations. The following detailed description is therefore not intended to limit the scope of the claimed disclosure, but merely represents selected embodiments. All other embodiments derived by those skilled in the art without creative effort shall fall within the protection scope of the disclosure.
Augmented reality (AR) technology superimposes simulated virtual information (visual content, sound, touch, etc.) on the real world, so that the real environment and virtual objects are presented in the same picture or space in real time.
The embodiments of the present disclosure may be applied to an AR device, which may be any electronic device capable of supporting AR functions, including but not limited to AR glasses, tablet computers, smartphones, and the like. Presenting an AR effect in the AR device may be understood as presenting a virtual object merged into the real scene. The presentation content of the virtual object may be rendered directly so that it merges with the real scene, for example a set of virtual teaware displayed as if placed on a real desktop; alternatively, a fused picture may be presented after combining the presentation content of the virtual object with an image of the real scene. Which presentation manner is used depends on the device type of the AR device and the picture presentation technology adopted. For example, since a real scene (rather than an imaged picture of it) can be seen directly through AR glasses, the glasses can directly render the picture of the virtual object; for mobile terminals such as mobile phones and tablet computers, which display an imaged picture of the real scene, the AR effect can be displayed by fusing the real scene picture with the presentation content of the virtual object.
The following describes an interaction control method based on AR according to an embodiment of the present disclosure in detail.
Referring to fig. 1, a schematic flowchart of an AR-based interaction control method provided by an embodiment of the present disclosure includes the following steps:
S101, acquiring a real scene image captured by the AR device.
In the embodiment of the present disclosure, an image acquisition component (such as a camera) in the AR device may capture real scene images: a single-frame image by taking a photograph, or continuous multi-frame images by recording video.
For example, a user may be located in an exhibition hall, where images of the hall or its exhibits are captured in real time so that the user can view the AR effects of those images after virtual objects are superimposed.
S102, in the case that a physical musical instrument is detected in the real scene image, displaying a virtual instrument model corresponding to the physical instrument in the AR device.
The real scene image refers to an image of a real scene captured by the AR device and may include at least one physical object of that scene. For example, for a real scene image taken in an exhibition hall, the physical object may be at least one exhibit in the hall, such as a physical musical instrument.
The physical musical instrument refers to a musical instrument that actually exists in the real scene presented by the image, or to an image of such an instrument that exists in the scene, such as a physical picture or an instrument shown on an electronic display screen. The present disclosure does not limit the type of physical instrument; it may be, for example, a chime bell set or another instrument displayed in an exhibition hall. Instruments such as chime bells have multiple types of components, and tapping different types of components produces different sound effects, which can combine into a piece of music.
The virtual instrument model may be a pre-constructed three-dimensional model. For example, the physical instrument may be photographed from different positions and angles, and a corresponding virtual instrument model may then be reconstructed through a three-dimensional reconstruction algorithm based on the image features of the physical instrument in the captured images.
In the embodiment of the present disclosure, whether a physical instrument appears in the real scene image is detected, and in the case that it does, the virtual instrument model corresponding to it may be acquired.
Specifically, there are various ways to detect whether a physical instrument appears in the real scene image.
In one example, whether the AR device has entered the display area of the physical instrument may be determined by detecting the shooting pose data when the AR device captures the real scene image: if the shooting pose data falls within the pose data range corresponding to the display area of the physical instrument, it may be determined that the AR device has entered that area.
In another example, the real scene image captured by the AR device may be recognized by an image recognition algorithm to determine whether a physical instrument is present. For example, the similarity between the real scene image and a preset physical instrument image may be compared; when the similarity exceeds a set threshold, it may be determined that the physical instrument is present in the image. Alternatively, the real scene image may be input into a pre-trained neural network model, which performs classification prediction based on the image features, and whether the physical instrument is present is determined from the prediction result.
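The similarity-comparison variant can be sketched with a toy image descriptor. A minimal sketch, assuming a 16-bin grayscale histogram as the "feature" and 0.8 as the similarity threshold (both are illustrative choices, not values from the patent; a real system would use a learned model or stronger features):

```python
# Histogram intersection stands in for the patent's similarity comparison.

def grayscale_histogram(pixels, bins=16):
    """Normalized histogram of grayscale values in 0..255."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels) or 1
    return [h / total for h in hist]

def histogram_similarity(h1, h2):
    # Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint.
    return sum(min(a, b) for a, b in zip(h1, h2))

def instrument_present(scene_pixels, reference_pixels, threshold=0.8):
    """True if the scene is similar enough to the preset instrument image."""
    sim = histogram_similarity(
        grayscale_histogram(scene_pixels),
        grayscale_histogram(reference_pixels),
    )
    return sim >= threshold
```

The neural-network alternative mentioned above would simply replace `instrument_present` with a classifier's prediction on the frame.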
After the physical musical instrument is detected in the real scene image, the virtual instrument model corresponding to it can further be obtained based on feature information of the physical instrument presented in the image, and an AR effect combining the real scene image with the virtual instrument model can then be displayed in the AR device. The virtual instrument model is the virtual object presented in the combined virtual-real picture.
In some embodiments, a virtual object matching the feature information of the physical instrument can be found directly in a preset virtual object model library. The feature information may be the position information at which the physical instrument was photographed, or image information of the photographed instrument. If the feature information is position information, each virtual object model in the library may correspond to a preset display position range; when the current shooting position falls within any preset display position range, the corresponding virtual object model may be used as the virtual instrument model for the physical instrument. If the feature information is image information, the image information of the photographed instrument may be matched against the image information of the virtual object models in the library, and the successfully matched model may be used as the virtual instrument model.
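The two lookup strategies just described (by shooting position or by image feature) might be sketched as follows; the model library contents, position ranges, and feature tags are hypothetical examples:

```python
# Each entry: (model name, display-position range as (x0, x1, y0, y1), feature tag).
MODEL_LIBRARY = [
    ("virtual_chime", (0.0, 5.0, 0.0, 5.0), "chime"),
    ("virtual_drum",  (5.0, 10.0, 0.0, 5.0), "drum"),
]

def model_by_position(x, y):
    """Feature info is position: find the preset range the shot position falls into."""
    for name, (x0, x1, y0, y1), _ in MODEL_LIBRARY:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def model_by_image_feature(feature_tag):
    """Feature info is image content: match against the stored model features."""
    for name, _, tag in MODEL_LIBRARY:
        if tag == feature_tag:
            return name
    return None
```

In practice the "feature tag" comparison would be a descriptor or embedding match rather than string equality, but the lookup structure is the same.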
In other embodiments, a three-dimensional reconstruction algorithm may be used to reconstruct the virtual instrument model directly from the image information of the physical instrument in the acquired real scene image. A variety of three-dimensional reconstruction algorithms may be used, and the present disclosure is not limited in this respect.
In this embodiment, based on the feature information of the physical instrument, the virtual instrument model can be reconstructed or retrieved from the preset virtual object model library, and an AR effect combining the real scene image with the model can be presented, improving the visual effect.
In some embodiments of the present disclosure, shooting pose data of the AR device when capturing the real scene image may also be acquired, and presentation pose data of the virtual instrument model in a preset three-dimensional scene model may be determined from it. A target pose of the virtual instrument model presented in the AR device may then be determined based on the presentation pose data. Thus, when displaying the AR effect combining the real scene image with the virtual instrument model, the AR device can display the model conforming to the target pose. This embodiment is described in detail in the examples below.
S103, receiving a trigger operation on the virtual instrument model presented in the AR device.
For example, when the AR device is a handheld mobile electronic device such as a mobile phone or tablet computer, the AR effect combining the real scene picture with the virtual instrument model may be displayed on the display interface of its touch screen. The trigger operation may be a preset touch operation performed by the user on the display interface with a finger or another touch medium, such as a tap, double-tap, or swipe. When a preset touch operation is detected in the display area of the virtual instrument model, a trigger operation on the model displayed by the AR device can be considered received.
As another example, when the AR device is a wearable electronic device such as AR glasses, the trigger operation may be a preset gesture made by the user in front of the AR effect picture and captured by an image acquisition component (such as a camera); the preset gesture may be, for example, an action of tapping the instrument or another interactive action. When the preset gesture is detected in front of the AR effect picture, a trigger operation on the virtual instrument model displayed by the AR device can be considered received.
S104, controlling, based on the trigger operation, the AR device to play the sound effect corresponding to the trigger operation.
Illustratively, a physical instrument may include different types of physical components, and different types of components may produce different sound effects. To better simulate the playing effect, the virtual instrument model acquired for the physical instrument can likewise include multiple types of virtual components, and different types of virtual components present different sound effects when triggered. For example, virtual component types may be divided according to the type of sound effect they produce; the present disclosure does not limit how the types are divided.
The sound effects corresponding to the different virtual components of the virtual instrument model can be configured in advance and stored locally or in the cloud; once any virtual component is detected as triggered, its sound effect can be fetched and played.
In some embodiments of the disclosure, controlling the AR device to play the sound effect corresponding to the trigger operation may proceed as follows: detect the trigger position of the trigger operation on the displayed virtual instrument model, determine the triggered virtual component on the model based on that position, and control the AR device to play the sound effect corresponding to the triggered component.
If the trigger operation acts on the touch screen of the AR device, detecting the trigger position may mean detecting the touch position on the display interface where the virtual instrument model is shown, and determining from that touch position the image area of the model that was touched, i.e., the trigger position on the virtual instrument model. If the trigger operation is a preset gesture made in the actual real scene, the gesture coordinates can be detected and mapped into the coordinates of the virtual image in which the model is displayed, and the trigger position on the model can then be determined from the mapped coordinates.
For example, a corresponding trigger position range may be preset for each virtual component of the virtual instrument model. After the trigger position of the trigger operation on the model is detected, the component whose trigger position range contains that position may be determined as the triggered virtual component, and the AR device may be controlled to play the corresponding sound effect.
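The per-component trigger ranges just described amount to a point-in-rectangle lookup. A minimal sketch, where the component names and rectangle coordinates (in rendered-model screen space) are illustrative assumptions:

```python
# Component id -> trigger range (x_min, y_min, x_max, y_max) on the rendered model.
TRIGGER_RANGES = {
    "bell_row_top":    (0, 0, 100, 40),
    "bell_row_bottom": (0, 40, 100, 80),
}

def triggered_component(x, y):
    """Return the virtual component whose trigger range contains (x, y)."""
    for component, (x0, y0, x1, y1) in TRIGGER_RANGES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return component
    return None   # the trigger fell outside the virtual instrument model
```

The returned component id would then be passed to the sound-effect lookup so the AR device plays the matching sound.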
Through this implementation, for physical instruments with multiple components, such as a chime bell set, a virtual instrument model with corresponding virtual components can be reconstructed. After a trigger operation is received, the triggered virtual component is determined by identifying the trigger position, and the sound effect corresponding to that component is played. Trigger operations on different virtual components can thus be responded to individually, and the sound effects of different components can combine into music, improving the user experience of visiting the exhibit and making the visit more interactive and engaging.
In the embodiments of the present disclosure, after a physical instrument is identified in the real scene image, a virtual instrument model corresponding to the physical instrument can be presented in the AR device, and a sound effect corresponding to a trigger operation on the virtual instrument model can be played in response to that operation, thereby realizing simulated playing of the virtual instrument. When applied to the cultural tourism industry, for exhibition items of instrument-type cultural relics, the virtual instrument model can be displayed in the AR device and simulated playing of the virtual instrument can be realized on the basis of trigger operations. This helps the user understand the playing effect of the physical instrument more intuitively and clearly while the cultural relic itself remains protected, improves the user experience of visiting the exhibition item, and makes the visit more interactive and interesting.
Based on the content of the foregoing embodiments, the embodiments of the present disclosure further provide an exemplary description of a method for presenting an AR effect. Referring to fig. 2, a specific execution flowchart of this example, the method includes the following steps:
S201, acquiring a real scene image shot by the AR device.
S202, detecting that the physical musical instrument is presented in the real scene image.
S203, acquiring a virtual instrument model corresponding to the physical instrument based on the feature information of the physical instrument presented in the real scene image.
S204, acquiring shooting pose data of the AR device when the AR device shoots the real scene image.
The shooting pose data of the AR device may include the position and/or display angle of the display component used to display the virtual object while the user holds or wears the AR device. For convenience of explanation, the concept of a coordinate system, such as a world coordinate system, is introduced here. The shooting pose data may include the coordinate position of the display component of the AR device in the world coordinate system; or the angle between the display component of the AR device and each coordinate axis of the world coordinate system; or both the coordinate position and those angles. What the shooting pose data specifically includes depends on the display mode set for the virtual object in the augmented reality scene and is not particularly limited here.
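The alternatives just listed (coordinate position only, axis angles only, or both) can be captured in a small optional-field structure; the field names below are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ShootingPoseData:
    """Pose of the AR device's display component in the world coordinate system.

    Depending on the display mode set for the virtual object, either field
    may be absent, matching the three cases described in the text."""
    position: Optional[Tuple[float, float, float]] = None     # coordinate position (x, y, z)
    axis_angles: Optional[Tuple[float, float, float]] = None  # angle to each world axis, degrees

# Third case from the text: both the coordinate position and the axis angles are known.
pose = ShootingPoseData(position=(1.2, 0.0, 3.5), axis_angles=(90.0, 15.0, 75.0))
print(pose.position)  # (1.2, 0.0, 3.5)
```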
S205, determining, based on the shooting pose data, the presentation pose data of the virtual instrument model in the preset three-dimensional scene model.
S206, determining, based on the presentation pose data, the target pose of the virtual instrument model presented in the AR device.
S207, displaying, in the AR device, an AR effect of the real scene image combined with the virtual instrument model conforming to the target pose.
In steps S205 to S207 above, the preset three-dimensional scene model may be used to represent the real scene and is presented in the same coordinate system as the real scene at an equal scale. Taking an exhibition hall containing a plurality of display areas as an example of the real scene, the preset three-dimensional scene model representing it likewise contains the exhibition hall and each display area in it, and the model and the real scene appear at a 1:1 scale in the same coordinate system. That is, if the preset three-dimensional scene model were placed in the world coordinate system of the real scene, it would coincide with the real scene.
The presenting pose data of the virtual instrument model in the preset three-dimensional scene model may include, but is not limited to, at least one of position data, pose data, and appearance data of the virtual instrument model when presented in the preset three-dimensional scene model, such as the above-mentioned position data, pose data, and appearance data of the virtual chime when presented in the real scene.
For example, since the preset three-dimensional scene model and the real scene are presented at a 1:1 scale in the same coordinate system, the presentation pose data of the virtual instrument model in the preset three-dimensional scene model can be set in advance, and the target pose of the virtual object presented in the real scene can then be determined from that presentation pose data.
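A minimal NumPy sketch of this step, under the assumption that the shooting pose is given as a camera rotation matrix and position: because the scene model coincides 1:1 with the real scene, a preset presentation pose in model coordinates is already a world pose, and the target pose shown in the device is that pose expressed in the camera's view coordinates. All values and names are illustrative.

```python
import numpy as np

def view_matrix(cam_position, cam_rotation):
    """World-to-view transform: the inverse of the camera-to-world pose."""
    V = np.eye(4)
    V[:3, :3] = cam_rotation.T
    V[:3, 3] = -cam_rotation.T @ cam_position
    return V

def target_pose_in_device(presentation_pose_world, cam_position, cam_rotation):
    """Map the preset presentation pose (model coords == world coords, 1:1 scale)
    into the AR device's view coordinates."""
    return view_matrix(cam_position, cam_rotation) @ presentation_pose_world

# Toy setup: virtual chime placed 2 m in front of the world origin, camera at
# the origin with no rotation.
presentation_pose = np.eye(4)
presentation_pose[:3, 3] = [0.0, 0.0, 2.0]

pose_in_view = target_pose_in_device(presentation_pose, np.zeros(3), np.eye(3))
print(pose_in_view[:3, 3])  # [0. 0. 2.]
```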
In addition, considering that the virtual instrument model may be blocked by a solid object in the real scene when presented, whether the virtual instrument model is blocked can be determined from the position coordinates of the preset three-dimensional scene model, the shooting pose data of the AR device, and the presentation pose data of the virtual instrument model in the preset three-dimensional scene model. When it is determined that a partial area of the virtual instrument model is blocked by a solid object in the real scene corresponding to the preset three-dimensional scene model, that blocked partial area is not rendered. The preset three-dimensional scene model itself can be processed into a transparent form within the real scene it represents, so that the user does not see the transparent scene model in the AR device but does see the presentation effect in which part of the virtual instrument model is blocked by the solid object in the real scene, which improves the realism of the displayed AR effect.
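The occlusion decision above reduces to a per-point depth comparison between the virtual model and the transparently rendered scene model; the sketch below is a simplified per-pixel version under that assumption, with all names invented for illustration.

```python
def is_occluded(virtual_depth, scene_depth):
    """A point of the virtual model is blocked when the solid geometry of the
    preset 3D scene model lies closer to the camera than the point does."""
    return scene_depth < virtual_depth

def render_mask(virtual_depths, scene_depths):
    """Per-pixel mask: True where the virtual model should be rendered.

    The scene model itself is drawn fully transparent, so the user sees the
    real scene with the virtual model correctly cut away by real objects."""
    return [not is_occluded(v, s) for v, s in zip(virtual_depths, scene_depths)]

# Toy 4-pixel row: scene geometry is nearer than the model for the last two pixels.
mask = render_mask([2.0, 2.0, 2.0, 2.0], [5.0, 5.0, 1.5, 1.0])
print(mask)  # [True, True, False, False]
```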
S208, receiving a trigger operation for the virtual instrument model shown in the AR device.
S209, detecting the trigger position of the trigger operation on the displayed virtual instrument model.
S210, determining, based on the trigger position, the triggered virtual component on the virtual instrument model.
S211, controlling the AR device to play the sound effect corresponding to the triggered virtual component.
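Steps S201 to S211 can be strung together as one pipeline. The sketch below replaces recognition and model lookup with trivial in-memory stand-ins purely to show the control flow; every name is illustrative, and the pose steps S204 to S207 are elided.

```python
# Hypothetical stand-ins for the recognizer and the model store (S202/S203).
def detect_instrument(image):
    """S202: pretend recognition; a real system would run computer vision here."""
    return "chime" if "chime" in image else None

MODEL_STORE = {"chime": {"components": {"bell_a": "tone_a", "bell_b": "tone_b"}}}

def run_flow(image, triggered_components):
    """S201-S211 as one pass (pose steps S204-S207 omitted for brevity)."""
    instrument = detect_instrument(image)               # S201/S202
    if instrument is None:
        return []                                       # nothing to present
    model = MODEL_STORE[instrument]                     # S203: feature-based lookup
    played = []
    for component in triggered_components:              # S208-S210
        effect = model["components"].get(component)
        if effect is not None:
            played.append(effect)                       # S211: play the sound effect
    return played

print(run_flow("photo_of_chime", ["bell_b", "bell_a"]))  # ['tone_b', 'tone_a']
```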
For the features in the above implementation flow that relate to the previous embodiments, reference may be made to the explanations of those features in the previous embodiments; the descriptions are not repeated in this disclosure.
The following is an illustration of a specific application scenario of the disclosed embodiments.
Taking a chime displayed in an exhibition hall as an example, based on computer vision recognition within AR technology, an AR virtual model of a virtual chime can be made to appear in the AR device by scanning the physical chime with the camera of an AR device such as a mobile phone or tablet. The user can click the virtual chime displayed on the screen of the AR device: clicking a particular chime component on the virtual chime triggers the corresponding chime sound effect, realizing an interactive display experience of playing the virtual chime.
To make the AR virtual model of the virtual chime appear in the AR device after the physical chime is scanned with the camera of an AR device such as a mobile phone or tablet, a simultaneous localization and mapping (SLAM) technique may specifically be adopted. A SLAM algorithm can achieve accurate 6DoF spatial localization of the current AR device based on the device's various sensor inputs while performing 3D perception of the surrounding environment, such as point cloud recovery, plane reconstruction, and mesh reconstruction, thereby reconstructing the preset three-dimensional scene model describing the real scene; the AR virtual model of the virtual chime is then configured at a specified position in that preset three-dimensional scene model.
Furthermore, by collecting the features of the physical chime in the real scene and scanning the physical chime with the camera on the AR device, the AR special effect of the virtual chime can be displayed on the screen of the AR device.
Based on computer vision techniques, the AR presentation of the virtual chime can be triggered quickly and conveniently after the physical chime is identified, and a simulated playing display can be given through the virtual chime. This helps the user understand the playing effect of the chime more intuitively and clearly while the cultural relic itself remains protected, improves the user experience of visiting the exhibition item, and makes the visit more interactive and interesting.
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation; the specific order of execution of the steps should be determined by their function and possible inherent logic.
Based on the same technical concept, an AR-based interaction control apparatus corresponding to the AR-based interaction control method is also provided in the embodiments of the present disclosure. Since the principle by which the apparatus solves the problem is similar to that of the AR-based interaction control method described above, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 3, a schematic diagram of an AR-based interaction control apparatus provided in an embodiment of the present disclosure is shown. The apparatus includes: an acquisition module 31, a presentation module 32, a receiving module 33, and a control module 34. Specifically:
the acquiring module 31 is configured to acquire a real scene image captured by the AR device;
a presentation module 32, configured to present, in the AR device, a virtual instrument model corresponding to a physical instrument in the real scene image if the physical instrument is detected to be present in the real scene image;
a receiving module 33, configured to receive a triggering operation for a virtual instrument model shown in the AR device;
and the control module 34 is configured to control the AR device to play a sound effect corresponding to the trigger operation based on the trigger operation.
In some embodiments, the virtual instrument model includes multiple types of virtual parts, and different types of virtual parts are triggered to present different sound effects.
In some embodiments, when the control module 34 controls the AR device to play the sound effect corresponding to the trigger operation based on the trigger operation, specifically, the control module is configured to:
detecting a trigger position of the trigger operation on the displayed virtual instrument model;
determining a triggered virtual component on the virtual instrument model based on the trigger position;
and controlling the AR equipment to play the sound effect corresponding to the triggered virtual component.
In some embodiments, the presentation module 32, when presenting the virtual instrument model corresponding to the physical instrument in the AR device, is specifically configured to:
acquiring a virtual musical instrument model corresponding to the physical musical instrument based on the feature information of the physical musical instrument presented in the real scene image;
and displaying the AR effect of the real scene image combined with the virtual musical instrument model in the AR equipment.
In some embodiments, the obtaining module 31 is further configured to:
acquiring shooting pose data of the AR equipment when the AR equipment shoots the real scene image;
determining presenting pose data of the virtual musical instrument model in a preset three-dimensional scene model based on the shooting pose data;
determining, based on the presentation pose data, a target pose of the virtual instrument model presented in the AR device;
the presentation module 32, when displaying the AR effect of combining the real scene image with the virtual musical instrument model in the AR device, is specifically configured to:
and displaying the AR effect of the real scene image combined with the virtual musical instrument model conforming to the target pose in the AR equipment.
In some embodiments, the functions of the apparatus provided in the embodiments of the present disclosure, or of the modules it includes, may be used to execute the method described in the above method embodiments. For specific implementation, reference may be made to the description of those method embodiments; for brevity, no further description is provided here.
Based on the same technical concept, an embodiment of the present disclosure also provides an electronic device. Referring to fig. 4, a schematic structural diagram of the electronic device provided in the embodiment of the present disclosure, the device includes: a processor 11 and a memory 12; the memory 12 stores machine-readable instructions executable by the processor 11, and when the electronic device runs, the instructions are executed by the processor 11 to perform the following steps:
acquiring a real scene image shot by an augmented reality AR device;
in the case that it is detected that a physical musical instrument is presented in the real scene image, presenting a virtual musical instrument model corresponding to the physical musical instrument in an AR device;
receiving a triggering operation for a virtual instrument model presented in the AR device;
and controlling the AR equipment to play a sound effect corresponding to the triggering operation based on the triggering operation.
For the specific execution process of the instruction, reference may be made to the steps of the AR-based interaction control method described in the embodiments of the present disclosure, and details are not described here.
Furthermore, the embodiments of the present disclosure also provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the AR-based interaction control method described in the above method embodiments.
The computer program product of the AR-based interaction control method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing program code; the instructions included in the program code may be used to execute the steps of the AR-based interaction control method described in the above method embodiments. Details may be found in the above method embodiments and are not repeated here.
According to the method, apparatus, electronic device, storage medium, and computer program product provided by the embodiments of the present disclosure, after a physical instrument is identified in the real scene image, the virtual instrument model corresponding to the physical instrument can be presented in the AR device, and a sound effect corresponding to a trigger operation on the virtual instrument model can be played in response to that operation, thereby realizing simulated playing of the virtual instrument. When applied to the cultural tourism industry, for exhibition items of instrument-type cultural relics, the virtual instrument model can be displayed in the AR device and simulated playing of the virtual instrument can be realized on the basis of trigger operations. This helps the user understand the playing effect of the physical instrument more intuitively and clearly while the cultural relic itself remains protected, improves the user experience of visiting the exhibition item, and makes the visit more interactive and interesting.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not described here again.

In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is only one logical division, and other divisions are possible in an actual implementation: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above are only specific embodiments of the present disclosure, but the scope of the present disclosure is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present disclosure, and shall be covered by the scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. An interaction control method based on AR is characterized by comprising the following steps:
acquiring a real scene image shot by an augmented reality AR device;
in the case that it is detected that a physical musical instrument is presented in the real scene image, presenting a virtual musical instrument model corresponding to the physical musical instrument in an AR device;
receiving a triggering operation for a virtual instrument model presented in the AR device;
and controlling the AR equipment to play a sound effect corresponding to the triggering operation based on the triggering operation.
2. The method of claim 1, wherein the virtual instrument model comprises a plurality of types of virtual components, wherein different types of virtual components are triggered to exhibit different sound effects.
3. The method according to claim 2, wherein the controlling, based on the trigger operation, the AR device to play the sound effect corresponding to the trigger operation comprises:
detecting a trigger position of the trigger operation on the displayed virtual instrument model;
determining a triggered virtual component on the virtual instrument model based on the trigger position;
and controlling the AR equipment to play the sound effect corresponding to the triggered virtual component.
4. The method according to any one of claims 1 to 3, wherein said presenting a virtual instrument model corresponding to said physical instrument in the AR device comprises:
acquiring a virtual musical instrument model corresponding to the entity musical instrument based on the characteristic information of the entity musical instrument presented in the real scene image;
and displaying the AR effect of the real scene image combined with the virtual musical instrument model in the AR equipment.
5. The method of any of claims 1 to 4, further comprising:
acquiring shooting pose data of the AR equipment when the AR equipment shoots the real scene image;
determining presenting pose data of the virtual musical instrument model in a preset three-dimensional scene model based on the shooting pose data;
determining, based on the presentation pose data, a target pose of the virtual instrument model presented in the AR device;
the displaying, in the AR device, the AR effect of the real scene image combined with the virtual instrument model includes:
and displaying the AR effect of the real scene image combined with the virtual musical instrument model conforming to the target pose in the AR equipment.
6. An AR-based interaction control apparatus, comprising:
the acquisition module is used for acquiring a real scene image shot by the AR equipment;
the display module is used for displaying a virtual instrument model corresponding to the physical instrument in the AR equipment under the condition that the physical instrument is detected to be presented in the real scene image;
a receiving module for receiving a triggering operation for a virtual instrument model presented in the AR device;
and the control module is used for controlling the AR equipment to play the sound effect corresponding to the trigger operation based on the trigger operation.
7. The apparatus of claim 6, wherein the virtual instrument model comprises a plurality of types of virtual components, wherein different types of virtual components are triggered to exhibit different sound effects.
8. The apparatus according to claim 7, wherein the control module, when controlling the AR device to play the sound effect corresponding to the trigger operation based on the trigger operation, is specifically configured to:
detecting a trigger position of the trigger operation on the displayed virtual instrument model;
determining a triggered virtual component on the virtual instrument model based on the trigger position;
and controlling the AR equipment to play the sound effect corresponding to the triggered virtual component.
9. An electronic device, comprising: a processor, a memory storing machine readable instructions executable by the processor, the processor for executing the machine readable instructions stored in the memory, the processor performing the steps of the AR based interaction control method of any one of claims 1 to 5 when the machine readable instructions are executed by the processor.
10. A computer-readable storage medium, having stored thereon a computer program, which, when executed by an electronic device, performs the steps of the AR-based interaction control method according to any one of claims 1 to 5.
CN202010528088.XA 2020-06-10 2020-06-10 Interaction control method and device based on AR, electronic equipment and storage medium Pending CN111679742A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010528088.XA CN111679742A (en) 2020-06-10 2020-06-10 Interaction control method and device based on AR, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN111679742A true CN111679742A (en) 2020-09-18

Family

ID=72454593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010528088.XA Pending CN111679742A (en) 2020-06-10 2020-06-10 Interaction control method and device based on AR, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111679742A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105184858A (en) * 2015-09-18 2015-12-23 上海历影数字科技有限公司 Method for augmented reality mobile terminal
CN105844714A (en) * 2016-04-12 2016-08-10 广州凡拓数字创意科技股份有限公司 Augmented reality based scenario display method and system
CN107015655A (en) * 2017-04-11 2017-08-04 苏州和云观博数字科技有限公司 Museum virtual scene AR experiences eyeglass device and its implementation
CN108227921A (en) * 2017-12-30 2018-06-29 北京工业大学 A kind of digital Zeng Houyi ancient Chinese chime with 12 bells interactive system based on immersive VR equipment
CN108520552A (en) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN109213728A (en) * 2017-06-29 2019-01-15 深圳市掌网科技股份有限公司 Cultural relic exhibition method and system based on augmented reality
CN109508090A (en) * 2018-11-06 2019-03-22 燕山大学 A kind of augmented reality display board system having interactivity
CN109685906A (en) * 2017-10-18 2019-04-26 深圳市掌网科技股份有限公司 Scene fusion method and device based on augmented reality
CN111061365A (en) * 2019-11-28 2020-04-24 武汉渲奇数字科技有限公司 Bell set playing guide system based on VR technology


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112148125A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 AR interaction state control method, device, equipment and storage medium
CN112148188A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 Interaction method and device in augmented reality scene, electronic equipment and storage medium
CN114445500A (en) * 2020-10-30 2022-05-06 北京字跳网络技术有限公司 Augmented reality scene construction method and device, terminal equipment and storage medium
CN114445500B (en) * 2020-10-30 2023-11-10 北京字跳网络技术有限公司 Augmented reality scene construction method, device, terminal equipment and storage medium
CN112988007A (en) * 2021-03-12 2021-06-18 深圳市慧鲤科技有限公司 Three-dimensional material interaction method and device
CN112988007B (en) * 2021-03-12 2022-09-09 深圳市慧鲤科技有限公司 Three-dimensional material interaction method and device
WO2022193467A1 (en) * 2021-03-15 2022-09-22 深圳市慧鲤科技有限公司 Sound playing method and apparatus, electronic device, storage medium, and program
CN113066190A (en) * 2021-04-09 2021-07-02 四川虹微技术有限公司 Cultural relic interaction method based on desktop true three-dimension
WO2022252966A1 (en) * 2021-06-03 2022-12-08 腾讯科技(深圳)有限公司 Method and apparatus for processing audio of virtual instrument, electronic device, computer readable storage medium, and computer program product
CN116030228A (en) * 2023-02-22 2023-04-28 杭州原数科技有限公司 Method and device for displaying mr virtual picture based on web

Similar Documents

Publication Publication Date Title
CN111679742A (en) Interaction control method and device based on AR, electronic equipment and storage medium
US10460512B2 (en) 3D skeletonization using truncated epipolar lines
CN110716645A (en) Augmented reality data presentation method and device, electronic equipment and storage medium
JP7079231B2 (en) Information processing equipment, information processing system, control method, program
JP5877219B2 (en) 3D user interface effect on display by using motion characteristics
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
EP2887322B1 (en) Mixed reality holographic object development
US20130135295A1 (en) Method and system for a augmented reality
WO2015102904A1 (en) Augmented reality content adapted to space geometry
KR20140082610A (en) Method and apaaratus for augmented exhibition contents in portable terminal
CN111833457A (en) Image processing method, apparatus and storage medium
CN111625100A (en) Method and device for presenting picture content, computer equipment and storage medium
CN111639613B (en) Augmented reality AR special effect generation method and device and electronic equipment
CN112905014A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN112882576A (en) AR interaction method and device, electronic equipment and storage medium
CN111651052A (en) Virtual sand table display method and device, electronic equipment and storage medium
JP2022507502A (en) Augmented Reality (AR) Imprint Method and System
CN114153548A (en) Display method and device, computer equipment and storage medium
CN111651054A (en) Sound effect control method and device, electronic equipment and storage medium
US11961190B2 (en) Content distribution system, content distribution method, and content distribution program
CN111640195A (en) History scene reproduction method and device, electronic equipment and storage medium
CN112333498A (en) Display control method and device, computer equipment and storage medium
CN111918114A (en) Image display method, image display device, display equipment and computer readable storage medium
CN111599292A (en) Historical scene presenting method and device, electronic equipment and storage medium
WO2023124691A1 (en) Display of augmented reality scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination