CN112148189A - Interaction method and device in AR scene, electronic equipment and storage medium - Google Patents

Interaction method and device in AR scene, electronic equipment and storage medium

Info

Publication number
CN112148189A
Authority
CN
China
Prior art keywords
virtual object
special effect
equipment
picture
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011011073.2A
Other languages
Chinese (zh)
Inventor
王鼎禄
周玉杰
刘旭
栾青
侯欣如
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN202011011073.2A
Publication of CN112148189A
Legal status: Pending (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The disclosure provides an interaction method, an interaction device, electronic equipment and a storage medium in an AR scene, wherein the method comprises the following steps: displaying an AR picture matched with the real scene image on the AR equipment based on the real scene image shot by the AR equipment; a first AR special effect of at least one first virtual object is displayed in the AR picture; and under the condition that target control operation on the AR equipment is detected and any one first virtual object meets the AR special effect switching condition, displaying a second AR special effect after any one first virtual object is switched in the AR picture. The method and the device increase the interactivity between the user and the AR equipment based on the target control operation, can automatically realize special effect switching based on the judgment of the AR special effect switching condition, and improve the interaction efficiency.

Description

Interaction method and device in AR scene, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of Augmented Reality (AR) technologies, and in particular, to an interaction method and apparatus in an AR scene, an electronic device, and a storage medium.
Background
AR technology calculates the position and angle of the camera image in real time and superimposes corresponding images, videos, and 3-Dimensional (3D) models onto the captured video, so as to fuse the virtual world with the real world. It provides users with a new interactive experience and is therefore widely applied in various technical fields such as consumption, medical care, and games.
Therefore, optimization of the effect of the augmented reality scene presented by the AR device and optimization of the interaction mode become more important.
Disclosure of Invention
The embodiment of the disclosure provides at least one interactive scheme in an AR scene.
In a first aspect, an embodiment of the present disclosure provides an interaction method in an AR scene, where the method includes:
displaying an AR picture matched with a real scene image on the AR equipment based on the real scene image shot by the AR equipment; a first AR special effect of at least one first virtual object is displayed in the AR picture;
and under the condition that target control operation on the AR equipment is detected and any one of the first virtual objects meets the AR special effect switching condition, displaying a second AR special effect after any one of the first virtual objects is switched in the AR picture.
In the embodiment of the present disclosure, an AR screen having a first AR special effect of at least one first virtual object may be displayed by an AR device, and a second AR special effect of any first virtual object may be displayed when a target control operation on the AR device is detected and it is detected that any first virtual object satisfies an AR special effect switching condition. Therefore, the embodiment of the present disclosure, in combination with the target control operation for the AR device and the determination result of the AR special effect switching condition for the first virtual object, realizes special effect switching of the relevant virtual object, increases interactivity between the user and the AR device based on the target control operation, and can automatically realize special effect switching based on the determination of the AR special effect switching condition, thereby improving interaction efficiency.
In a possible implementation manner, a second virtual object is also displayed in the AR picture, and a preset relative pose relationship is provided between the second virtual object and the AR device;
the detecting that any one of the first virtual objects meets the AR special effect switching condition includes:
detecting running track information of the second virtual object or of an associated sub-object of the second virtual object;
and detecting that any one first virtual object meets the AR special effect switching condition based on the running track information and the display poses of the at least one first virtual object in the AR picture.
In the embodiment of the present disclosure, a second virtual object that triggers the first virtual object to perform AR special effect switching may be set, so that, when running track information of the second virtual object or of its associated sub-object is detected, whether any first virtual object satisfies the AR special effect switching condition may be determined based on that running track information and the display pose of the first virtual object in the AR picture. In this way, the second virtual object and the first virtual object can be linked based on the running track information and the display pose, forming interaction between the two virtual objects and improving the interactive experience.
In a possible implementation manner, the detecting that any one of the first virtual objects satisfies the AR special effect switching condition based on the moving trajectory information and the display poses of the at least one first virtual object in the AR picture, respectively, includes:
and if the track point in the running track information falls into a preset position range corresponding to any one of the first virtual objects, detecting that any one of the first virtual objects meets the AR special effect switching condition.
In a possible implementation, the detecting running track information of the second virtual object or of an associated sub-object of the second virtual object includes:
acquiring control parameter information corresponding to the target control operation;
and determining the running track information of the second virtual object or the associated sub-object according to the display pose of the second virtual object in the AR picture and the determined control parameter information.
In the embodiment of the present disclosure, the operation track information of the second virtual object or the associated sub-object may be determined based on the display pose of the second virtual object in the AR picture and the control parameter information corresponding to the target control operation, where the control parameter information may be information such as operation strength and operation direction, and the operation track information determined by different operation directions and strengths may also be different. Therefore, in the process of displaying the AR picture on the AR equipment, the user can change the running track information of the second virtual object or the associated sub-object by adjusting the control parameter information, and the AR interaction experience is improved.
In one possible embodiment, the display pose of the second virtual object in the AR picture is determined according to the following manner:
determining an initial positioning pose of the AR equipment based on a real scene image shot by the AR equipment and a pre-constructed three-dimensional scene map;
determining the real-time positioning pose of the AR equipment through simultaneous localization and mapping (SLAM) based on the initial positioning pose of the AR equipment;
and determining the display pose of the second virtual object in the AR picture based on the real-time positioning pose of the AR equipment and the preset relative pose relation between the second virtual object and the AR equipment in the three-dimensional scene map.
The positioning of the AR equipment can be realized by combining the three-dimensional scene map with SLAM, which offers good accuracy and real-time performance; the determined display pose of the second virtual object in the AR picture can then be updated synchronously with the pose of the AR equipment, improving the interactive experience.
In one possible embodiment, the detecting the target control operation on the AR device includes:
detecting a target trigger operation acting on a target virtual object displayed on the AR device;
wherein the target virtual object comprises at least one of:
any of the first virtual objects; the second virtual object; the button is activated.
In one possible implementation, a target trigger action on a target virtual object presented on the AR device is detected according to the following steps:
responding to a trigger operation acted on the screen of the AR equipment, and determining a screen coordinate position corresponding to the trigger operation;
converting the screen coordinate position to a camera coordinate position in a camera coordinate system based on the determined screen coordinate position and a first conversion relationship between the screen coordinate system and the camera coordinate system;
converting the camera coordinate position to a world coordinate system based on the converted camera coordinate position and a second conversion relation between the camera coordinate system and the world coordinate system to obtain a world coordinate position;
and under the condition that the world coordinate position falls into the position range corresponding to the target virtual object, determining that a target trigger operation acting on the target virtual object displayed on the AR equipment is detected.
In the embodiment of the disclosure, the screen coordinate position triggered by the user on the screen of the AR device may be converted into the world coordinate system based on the coordinate system conversion relationship, and then whether the target triggering operation is detected may be determined based on the matching result of the converted world coordinate position and the position range corresponding to the target virtual object. Therefore, through the interactive operation of the user and the AR equipment, the control of the target virtual object is realized, and the interactive experience is increased.
In one possible embodiment, a first AR special effect of the first virtual object is presented in an AR picture according to the following steps:
and under the condition that it is determined, based on the determined real-time positioning pose of the AR equipment and a pre-constructed three-dimensional scene map, that the AR equipment reaches a target interaction area indicated by the three-dimensional scene map, displaying, in an AR picture, a first AR special effect of a first virtual object matched with the real-time positioning pose of the AR equipment.
In one possible embodiment, presenting a first AR special effect of a first virtual object matching a real-time positioning pose of the AR device in an AR picture includes:
determining a target AR special effect data packet corresponding to the current interaction stage according to AR special effect data packets of the first virtual object respectively corresponding to different preset interaction stages;
and displaying a first AR special effect of the first virtual object matched with the real-time positioning pose of the AR equipment in an AR picture based on the target AR special effect data packet.
Here, the target AR special effect data packet corresponding to the current interaction stage may be determined based on the AR special effect data packets of the first virtual object corresponding to the different interaction stages, and then the first AR special effect of the first virtual object is displayed, that is, the first AR special effects displayed in the different interaction stages may be different, which may enrich the display effect and improve the interaction experience.
In a second aspect, an embodiment of the present disclosure further provides an interaction apparatus in an AR scene, where the apparatus includes:
the display module is used for displaying an AR picture matched with a real scene image on the AR equipment based on the real scene image shot by the AR equipment; a first AR special effect of at least one first virtual object is displayed in the AR picture;
and the switching module is used for displaying a second AR special effect after any one first virtual object is switched in the AR picture under the condition that target control operation on the AR equipment is detected and any one first virtual object is detected to meet the AR special effect switching condition.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor being configured to execute the machine-readable instructions stored in the memory, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the interaction method in the AR scenario as described in the first aspect and any of its various embodiments.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by an electronic device, the electronic device performs the steps of the interaction method in the AR scenario according to the first aspect and any one of the various implementation manners thereof.
For the description of the effects of the interaction apparatus, the electronic device, and the computer-readable storage medium in the AR scene, reference is made to the description of the interaction method in the AR scene, and details are not repeated here.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required in the embodiments are briefly described below. The drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 shows a flowchart of an interaction method in an AR scenario provided by an embodiment of the present disclosure;
fig. 2(a) is a scene schematic diagram illustrating an interaction method in an AR scene according to an embodiment of the present disclosure;
fig. 2(b) is a scene schematic diagram illustrating an interaction method in an AR scene according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating an interaction apparatus in an AR scene according to an embodiment of the present disclosure;
fig. 4 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure. It is obvious that the described embodiments are only a part, not all, of the embodiments of the present disclosure. The components of the embodiments of the present disclosure, as generally described and illustrated herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the disclosure as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments obtained by a person skilled in the art based on the embodiments of the disclosure without creative effort shall fall within the protection scope of the disclosure.
The disclosure provides at least an interaction scheme in an AR scene, and realizes special effect switching of a related virtual object by combining target control operation for an AR device and a judgment result of an AR special effect switching condition for a first virtual object.
The problems described above were identified by the inventors through practical and careful study; therefore, both the discovery of these problems and the solutions the present disclosure proposes for them should be regarded as the inventors' contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the present embodiment, an interaction method in an AR scene disclosed in the embodiments of the present disclosure is first described in detail. An execution subject of the interaction method provided in the embodiments of the present disclosure is generally an electronic device with certain computing capability, for example a user terminal, a server, or another processing device. The server may, for example, be a server connected to the user terminal; the user terminal may be a tablet computer, a smart phone, a smart wearable device, an AR device (e.g., AR glasses or an AR helmet), or another device having a display function and data processing capability, and may be connected to the server through an application program. In some possible implementations, the interaction method in the AR scene may be implemented by a processor calling computer-readable instructions stored in a memory.
The following describes an interaction method in an AR scenario provided in the embodiments of the present disclosure.
Referring to fig. 1, which is a flowchart of an interaction method in an AR scene provided in the embodiment of the present disclosure, the method includes steps S101 to S102, where:
s101, displaying an AR picture matched with a real scene image on AR equipment based on the real scene image shot by the AR equipment; a first AR special effect of at least one first virtual object is displayed in the AR picture;
s102, under the condition that target control operation on the AR equipment is detected and any one first virtual object meets the AR special effect switching condition, displaying a second AR special effect after any one first virtual object is switched in an AR picture.
The interaction method under the AR scene may be mainly applied to various scenes in which AR scene interaction is required, for example, may be applied to an AR game scene to implement special effect switching of virtual game characters in the AR game scene, may also be applied to an AR navigation scene to implement special effect switching of virtual characters in the AR navigation scene, and may also be applied to other various application scenes, which is not limited specifically herein.
The interaction method in the AR scene provided in the embodiment of the present disclosure may first detect whether a target control operation is acquired on an AR device displaying an AR picture before performing special effect switching on a first virtual object in the AR picture, and may display a switched second AR special effect in the AR picture through the AR device if it is determined that the first virtual object satisfies an AR special effect switching condition under the condition that the target control operation is detected.
The AR special effect switching may be triggered based on a target control operation; for example, after a single click, a double click, or a sliding operation is performed on the screen of the AR equipment, special effect switching of the first virtual object is performed.
Taking the AR game scene as an example, after performing a double-click operation on the AR device or a first virtual object (e.g., a virtual character) in an AR picture presented by the AR device, switching from a first AR special effect of walking to a second AR special effect of jumping may be achieved.
In addition, in different application scenarios, besides a plurality of AR special effects, a switching sequence between the different special effects can be set, so that the plurality of AR special effects are displayed in order; for example, with 3 AR special effects in total, the special effect can be switched once every time one click operation is executed, as sketched below.
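As a non-limiting illustration, a minimal Python sketch of maintaining such a fixed switching sequence follows; the class and effect names are invented for illustration and are not prescribed by this disclosure.

```python
# Minimal sketch of a preset special-effect switching sequence.
# All names here are illustrative; the patent does not prescribe an API.

class EffectCycler:
    """Cycles a virtual object through a fixed sequence of AR special effects."""

    def __init__(self, effect_ids):
        self.effect_ids = list(effect_ids)  # e.g. ["walk", "jump", "vanish"]
        self.index = 0

    def current_effect(self):
        return self.effect_ids[self.index]

    def on_click(self):
        """Each detected click operation advances to the next special effect."""
        self.index = (self.index + 1) % len(self.effect_ids)
        return self.current_effect()


cycler = EffectCycler(["walk", "jump", "vanish"])
assert cycler.on_click() == "jump"    # first click switches walk -> jump
assert cycler.on_click() == "vanish"  # second click switches jump -> vanish
```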
The interaction method in the AR scene provided by the embodiment of the present disclosure may directly implement the AR special effect switching of the first virtual object according to the above target control operation for the first virtual object, may also indirectly implement the AR special effect switching of the first virtual object through the target control operation for the second virtual object, and in addition, may also implement the AR special effect switching of the first virtual object through the target control operation for the trigger button.
In some embodiments, the first virtual object, the second virtual object, and the trigger button may be all target virtual objects displayed on the AR device, and a target control operation acting on the AR device may be detected by performing a target trigger operation on the target virtual objects.
In the embodiment of the present disclosure, the process of detecting the target trigger operation acting on the target virtual object displayed on the AR device may specifically be implemented by the following steps:
responding to a trigger operation acted on an AR equipment screen, and determining a screen coordinate position corresponding to the trigger operation;
secondly, converting the screen coordinate position to a camera coordinate position under a camera coordinate system based on the determined screen coordinate position and a first conversion relation between the screen coordinate system and the camera coordinate system;
converting the camera coordinate position to a world coordinate system based on the converted camera coordinate position and a second conversion relation between the camera coordinate system and the world coordinate system to obtain a world coordinate position;
and step four, under the condition that the world coordinate position falls into the position range corresponding to the target virtual object, determining that a target trigger operation acting on the target virtual object displayed on the AR equipment is detected.
Here, a trigger operation performed by the user on the screen of the AR device may be responded to first, and here, a screen coordinate position corresponding to the trigger may be determined. Because a first conversion relation exists between the screen coordinate system and the camera coordinate system and a second conversion relation exists between the camera coordinate system and the world coordinate system, at this time, the screen coordinate position can be converted into the camera coordinate position under the camera coordinate system, and then the camera coordinate position obtained by conversion is converted into the world coordinate system, so that the world coordinate position is obtained.
Since the target virtual object is presented at a position in the real world while the AR device displays the AR screen, the world coordinate position indicates an actual physical location; whether the current target trigger operation acts on the target virtual object can therefore be determined based on whether that actual physical location falls into the location range corresponding to the target virtual object. That is, if the actual physical location is determined to fall within the location range corresponding to the target virtual object, the current target trigger operation may be considered to act on the target virtual object.
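The following is a minimal Python sketch of this two-step conversion and hit test. It assumes a pinhole intrinsic matrix for the first (screen-to-camera) conversion, a rigid 4x4 camera-to-world matrix for the second conversion, and an assumed depth to lift the 2D touch point to 3D; none of these specifics are prescribed by this disclosure.

```python
# Hedged sketch of screen -> camera -> world conversion plus hit test.
import numpy as np

def screen_to_world(screen_xy, K, T_cw, depth=1.0):
    """Unproject a screen coordinate to a world coordinate."""
    u, v = screen_xy
    # First conversion: screen -> camera (inverse intrinsics, scaled by depth).
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    p_cam = ray_cam * depth
    # Second conversion: camera -> world (rigid transform).
    p_world = (T_cw @ np.append(p_cam, 1.0))[:3]
    return p_world

def hits_target(p_world, box_min, box_max):
    """Check whether the world coordinate falls into the target object's range."""
    return bool(np.all(p_world >= box_min) and np.all(p_world <= box_max))

K = np.array([[800.0, 0, 360], [0, 800.0, 640], [0, 0, 1]])  # toy intrinsics
T_cw = np.eye(4)  # camera at the world origin for this toy example
p = screen_to_world((360, 640), K, T_cw, depth=2.0)
print(hits_target(p, np.array([-0.5, -0.5, 1.5]), np.array([0.5, 0.5, 2.5])))
```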
Based on the analysis result of the above-described target trigger operation on the target virtual object, it may be determined whether the first virtual object satisfies the AR special effect switching condition. In view of the fact that the AR special effect switching achieved by the indirect trigger operation on the second virtual object has a better interactive experience, the AR special effect switching may also be achieved by using the indirect trigger operation in the embodiment of the present disclosure.
In some embodiments, the second virtual object may be a virtual object having a preset relative pose relationship with the AR device. The preset relative pose relationship may include a preset relative position relationship and may also include a preset relative posture relationship.
The preset relative position relationship may be a position relationship in which the second virtual object is in front of the AR device at a preset distance, so that the second virtual object in the AR screen may serve a guiding function in front of the AR device; in addition, the second virtual object may share the AR device's position and move synchronously with it, which is not limited in this disclosure.
The preset relative posture relationship may be a posture relationship in which the second virtual object and the AR device form a preset included angle, or may be in the same posture.
For example, the pose of the second virtual object can be set in association with that of the AR device, further improving the interactive experience.
Still taking the AR game scene as an example, the second virtual object here may be a gun barrel for aiming at a monster, which is the first virtual object. Based on the above preset relative pose relationship, the barrel can be kept synchronized with the operation of the user holding the AR device; for example, when the user tilts the AR device up by 15°, the barrel is correspondingly raised by 15°, further improving the interactive experience.
In the embodiment of the present disclosure, it may be determined whether the first virtual object satisfies the AR special effect switching condition by:
step one, detecting running track information of the second virtual object or of an associated sub-object of the second virtual object;
and secondly, detecting that any one first virtual object meets the AR special effect switching condition based on the running track information and the display pose of at least one first virtual object in the AR picture.
Here, the running track information of the second virtual object, or of its associated sub-object, may be determined based on the control parameter information corresponding to the target control operation; that is, once the display pose of the second virtual object in the AR screen is determined, the running track information can be derived by combining that display pose with the control parameter information.
In different application scenarios, the control parameter information required for determining the running track information also differs. The control parameter information may be any parameter information capable of controlling the trajectory of the second virtual object or its associated sub-object, for example, the operation strength and operation direction determined based on the target control operation.
Still taking the AR game scene as an example, where the second virtual object is a gun barrel and its associated sub-object is a projectile that can be launched from the barrel, the above control parameter information may be information such as shooting strength and shooting distance determined based on the target control operation. Taking the muzzle of the barrel as the launch start point, this information can be combined with the display pose of the barrel to determine the running track of the projectile. The shooting strength and shooting distance may be obtained by mapping actual operation results, such as the trigger duration and trigger frequency of the target control operation performed on the second virtual object.
Here, when the track point in the trajectory information falls within the preset position range corresponding to any one of the first virtual objects, it may be determined that any one of the first virtual objects satisfies the AR special effect switching condition.
In some embodiments, to further improve the experience of AR scene interaction, multiple target control operations may be performed on the second virtual object, and the AR special effect switching of the first virtual object may be controlled based on the cumulative control effect of the multiple target control operations. In this case, when the track points of the movement track information determined by accumulating the target control operations for a plurality of times (for example, 5 times) all fall within the preset position range corresponding to a certain first virtual object, the first virtual object may be considered to satisfy the AR special effect switching condition.
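A hedged Python sketch of how such a running track and the accumulated switching condition might be computed follows; the simple ballistic model, parameter names, and hit threshold are assumptions for illustration only.

```python
# Illustrative sketch: deriving a projectile track from the barrel pose and
# control parameters, then accumulating hits toward the switching condition.
import numpy as np

def projectile_track(muzzle_pos, direction, strength, steps=50, dt=0.05):
    """Sample track points of a projectile launched from the muzzle."""
    gravity = np.array([0.0, -9.8, 0.0])
    v = strength * np.asarray(direction, float) / np.linalg.norm(direction)
    p = np.asarray(muzzle_pos, float)
    points = []
    for _ in range(steps):
        p = p + v * dt
        v = v + gravity * dt
        points.append(p.copy())
    return points

class SwitchCondition:
    """A first virtual object switches its AR effect after `required` hits."""

    def __init__(self, box_min, box_max, required=5):
        self.box_min, self.box_max = np.asarray(box_min), np.asarray(box_max)
        self.required, self.hits = required, 0

    def register_track(self, track_points):
        # A track counts as a hit if any point falls in the preset range.
        if any(np.all(p >= self.box_min) and np.all(p <= self.box_max)
               for p in track_points):
            self.hits += 1
        return self.hits >= self.required  # True -> show the second AR effect
```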
To facilitate further understanding of the AR special effect switching achieved through multiple executions of the target control operation, an example is described next in conjunction with the AR shooting game scenes shown in fig. 2(a) and 2(b).
In the AR shooting game scene shown in fig. 2(a), there are three first virtual objects in total, namely an octopus, a sea urchin, and a cuttlefish, and a gun barrel as the second virtual object. When 5 shooting operations are performed on the sea urchin, the sea urchin may be switched to a disappearing AR special effect, as shown in fig. 2(b).
In the interaction method in the AR scene provided by the embodiment of the present disclosure, once the positioning pose information of the AR device is determined, the display pose of the second virtual object in the AR picture may be determined based on that positioning pose information and the preset relative pose relationship between the second virtual object and the AR device in the three-dimensional scene map. Still taking the AR game scene as an example, where the second virtual object is a gun barrel, the shooting direction of the AR device may be used as the shooting direction of the barrel, and the two kept in synchronization; a sketch follows.
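A minimal sketch of locking the second virtual object to the AR device through the preset relative pose relationship; the 4x4 homogeneous-matrix convention is an assumption made here for illustration.

```python
# Sketch of keeping the second virtual object (e.g. the barrel) locked to the
# AR device through a preset relative pose; the matrix convention is assumed.
import numpy as np

def barrel_pose(T_device_world, T_barrel_in_device):
    """Compose the device's real-time pose with the preset relative pose."""
    return T_device_world @ T_barrel_in_device

# Preset relative pose: barrel 0.3 m in front of (and rigidly attached to)
# the device, so tilting the device up by 15 degrees tilts the barrel too.
T_rel = np.eye(4)
T_rel[2, 3] = 0.3
T_device = np.eye(4)  # would come from the real-time positioning pose
print(barrel_pose(T_device, T_rel)[:3, 3])  # barrel position in the world
```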
Since determining the real-time positioning pose of the AR device is a key step in determining the display pose of the second virtual object, the determination process of the real-time positioning pose of the AR device is described in detail below.
In the embodiment of the disclosure, the positioning pose information of the AR device may be determined based on a real scene image shot by the AR device and a three-dimensional scene map constructed in advance.
The real scene image may be an image captured by a camera provided on the AR equipment while a user wearing the AR equipment performs an AR experience in the AR scene. Based on the captured images, positioning may be performed locally on the AR device, determining the positioning pose information of the AR device based on the pre-constructed three-dimensional scene map; alternatively, remote positioning may be performed by a server, that is, after the captured real scene images are uploaded to the server, the server determines the positioning pose information of the AR device based on the pre-constructed three-dimensional scene map.
The three-dimensional scene map in the embodiment of the present disclosure may be a high-precision map constructed based on point cloud data. Here, a large number of photos or videos may be collected for a specific location, for example, a set of photos taken at different times, from different angles, and at different positions; a sparse feature point cloud of the specific location may be recovered from the collected photo set, and a high-precision map corresponding to the specific location may be constructed based on the recovered sparse feature point cloud. In some embodiments, this may be implemented based on the Structure from Motion (SFM) three-dimensional reconstruction technique.
Therefore, after the real scene image uploaded by the AR equipment is obtained, the feature points in the image can first be extracted; the real scene image can then be matched against the sparse feature point cloud corresponding to the high-precision map, and the position and posture of the AR equipment when shooting the real scene image can be determined based on the matching result.
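A hedged sketch of this relocalization step follows, using OpenCV's solvePnP; the feature extraction and matching steps are abstracted away, and the function and parameter names are illustrative rather than anything this disclosure specifies.

```python
# Sketch of relocalizing against the pre-built sparse point cloud: 2D image
# features matched to 3D map points, then a PnP solve for the camera pose.
import numpy as np
import cv2

def relocalize(points_3d, points_2d, K):
    """points_3d: Nx3 matched map points; points_2d: Nx2 image features."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(points_3d, np.float32),
        np.asarray(points_2d, np.float32),
        K, distCoeffs=None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)  # camera rotation in map coordinates
    return R, tvec              # pose of the AR device when shooting
```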
Therefore, the three-dimensional scene map can be used for confirming the pose of the AR equipment with high precision and high accuracy.
The embodiment of the disclosure can also combine the simultaneous localization and mapping (SLAM) technique to realize the positioning of the AR device. The disclosed embodiment can realize joint positioning according to the following steps:
determining an initial positioning pose of the AR equipment based on a real scene image shot by the AR equipment and a pre-constructed three-dimensional scene map;
and step two, determining the real-time positioning pose of the AR equipment through simultaneous localization and mapping (SLAM) based on the initial positioning pose of the AR equipment.
Here, the initial positioning pose of the AR device may be determined based on the three-dimensional scene map, and the real-time positioning pose of the AR device may then be determined by SLAM starting from that initial positioning pose. SLAM builds an incremental map while the AR equipment localizes itself, so once the initial positioning pose is determined, the position and direction of the device as it moves through space can be tracked, realizing real-time positioning of the AR equipment. This ensures high precision and high accuracy of the positioning while reducing positioning delay, achieving good real-time performance.
In some embodiments, in the real-time positioning based on SLAM, the positioning calibration may be performed in combination with the high-precision positioning of the three-dimensional scene map at every preset time interval (e.g., 5 seconds), so as to further improve the accuracy and precision of the positioning.
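A schematic sketch of this joint positioning loop follows; `camera`, `scene_map`, and `slam` are hypothetical stand-ins for components this disclosure does not detail, and the interval value is only an example.

```python
# Schematic of the joint positioning loop: an initial pose from matching
# against the 3D scene map, incremental SLAM updates, and a map-based
# recalibration every few seconds.
import time

def localize_loop(camera, scene_map, slam, recalib_interval=5.0):
    pose = scene_map.match(camera.capture())  # initial positioning pose
    last_recalib = time.monotonic()
    while True:
        frame = camera.capture()
        pose = slam.track(frame, pose)        # real-time incremental pose
        if time.monotonic() - last_recalib >= recalib_interval:
            pose = scene_map.match(frame)     # high-precision calibration
            last_recalib = time.monotonic()
        yield pose
```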
After the real-time positioning pose of the AR equipment is determined in the above positioning manner, whether the AR equipment is located in the target interaction area can be determined based on the positioning pose information. The target interaction area may be an area preset based on the three-dimensional scene map: after the spatial position range for entering AR interaction is determined, the target interaction area corresponding to that spatial position range can be marked on the three-dimensional scene map.
In some embodiments, considering that the Unity engine tool provides a very complete set of graphical interfaces including a text window, an input box, a drag box, and the like, the embodiments of the present disclosure may implement the setting of the target interaction area in the three-dimensional scene map based on the Unity engine tool.
Here, a target interaction area may be framed in the Unity display map through the Unity graphical interface, and the framed target interaction area may then be mapped onto the three-dimensional scene map using a preset mapping relationship between the Unity display map and the three-dimensional scene map, thereby setting the target interaction area in the three-dimensional scene map.
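A toy sketch of such a mapping follows, modeling the preset mapping relationship as a uniform scale plus offset; this model and all values are assumptions for illustration, not the form the disclosure specifies.

```python
# Toy sketch of mapping a region framed in a Unity display map onto the
# three-dimensional scene map via a preset mapping relationship, modeled
# here as a uniform scale plus offset (an assumption).
def map_region(framed_rect, scale, offset):
    """framed_rect: (x_min, z_min, x_max, z_max) in display-map units."""
    x0, z0, x1, z1 = framed_rect
    ox, oz = offset
    return (x0 * scale + ox, z0 * scale + oz,
            x1 * scale + ox, z1 * scale + oz)

target_area = map_region((10, 10, 20, 30), scale=0.5, offset=(100.0, 200.0))
print(target_area)  # region of the 3D scene map marked as interaction area
```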
Taking an AR game scene as an example, the target interaction area may correspond to different game tasks of a game, and the corresponding target AR interaction scene may be an AR game scene that depends on corresponding different game tasks; the target interaction area may also correspond to different games, and the corresponding target AR interaction scene may be an AR game scene on which the corresponding different games depend.
In the embodiment of the disclosure, under the condition that it is determined that the AR device reaches the target interaction area, the first AR special effect of the first virtual object matched with the real-time positioning pose of the AR device may be displayed in the AR picture.
In the embodiment of the present disclosure, in different target interaction areas, different AR special effect packets (first AR special effects) may be configured in advance for a first virtual object, and the special effect packets may include special effect data of the first virtual object, for example, the special effect data may be a voice broadcast special effect for a virtual character. In the process of displaying the first AR special effect, the corresponding display effect may be determined based on the real-time positioning pose of the AR device, for example, when the shooting direction of the AR device has a certain included angle with respect to the front surface of the first virtual object, the virtual character in the preset first AR special effect may be rotated by a corresponding angle and then displayed.
In order to further improve the interactive experience, the interaction method in the AR scene provided by the embodiment of the present disclosure may set different AR special effects for different interaction stages, and specifically may include the following steps:
determining a target AR special effect data packet corresponding to a current interaction stage according to AR special effect data packets of first virtual objects respectively corresponding to different preset interaction stages;
and secondly, displaying a first AR special effect of the first virtual object matched with the real-time positioning pose of the AR equipment in the AR picture based on the target AR special effect data packet.
In the embodiment of the present disclosure, a target AR special effect data packet corresponding to a current interaction stage may be determined based on AR special effect data packets of a first virtual object corresponding to different interaction stages, and then a first AR special effect of the first virtual object may be displayed.
For example, the interaction stage may be determined according to the AR experience level of the user wearing the AR device, where the AR experience level may be determined based on information such as the number of times and the duration that the user has performed AR experiences; for example, a user who has experienced a specific application scene more often has a higher AR experience level.
For users with different AR experience levels, different interaction phases can be correspondingly set.
In addition, the interaction stage in the embodiment of the present disclosure may also be determined by combining specific scene information of the interaction scene, which is not described herein again.
Here, still taking an AR game scene as an example, three different interaction stages may be preset: the higher the interaction stage, the greater the corresponding game difficulty, and different game difficulties may correspond to different AR special effect data packets. For instance, the movement speed special effect of the first virtual object indicated by an AR special effect data packet of greater game difficulty may be set faster, so that the AR interaction experience is further improved while the display effect is ensured.
In some embodiments, for different interaction stages, not only the movement speed special effect but also special effects such as the number of generated first virtual objects may be set; a sketch of the stage-to-packet mapping follows.
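An illustrative Python sketch of selecting the target AR special effect data packet by interaction stage; the stage numbers, packet fields, and values are invented for illustration.

```python
# Sketch of selecting the target AR special effect data packet for the
# current interaction stage; higher stages map to faster movement and more
# generated first virtual objects.
EFFECT_PACKETS = {
    1: {"move_speed": 1.0, "spawn_count": 3, "effect": "easy_pack"},
    2: {"move_speed": 1.5, "spawn_count": 5, "effect": "normal_pack"},
    3: {"move_speed": 2.2, "spawn_count": 8, "effect": "hard_pack"},
}

def target_packet(interaction_stage):
    """Look up the AR special effect data packet for the current stage."""
    return EFFECT_PACKETS[interaction_stage]

print(target_packet(3)["move_speed"])  # 2.2: fastest at the hardest stage
```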
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict execution order or impose any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible internal logic.
Based on the same inventive concept, the embodiment of the present disclosure further provides an interaction apparatus in an AR scene corresponding to the interaction method in the AR scene, and since the principle of the apparatus in the embodiment of the present disclosure for solving the problem is similar to the interaction method in the AR scene described above in the embodiment of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 3, which is a schematic diagram of an interaction device in an AR scene provided in an embodiment of the present disclosure, the device includes: a display module 301 and a switching module 302; wherein,
the display module 301 is configured to display, on the basis of a real scene image captured by the AR device, an AR picture matched with the real scene image on the AR device; a first AR special effect of at least one first virtual object is displayed in the AR picture;
the switching module 302 is configured to, when detecting a target control operation on the AR device and detecting that any one of the first virtual objects satisfies an AR special effect switching condition, show a second AR special effect after switching of any one of the first virtual objects in the AR screen.
With the above apparatus, the AR device can display an AR picture having a first AR special effect of at least one first virtual object, and a second AR special effect of any first virtual object can be displayed under the condition that a target control operation on the AR device is detected and that the first virtual object is detected to satisfy the AR special effect switching condition. Special effect switching of the relevant virtual objects is thus achieved based on control operations on the AR device, making the process of operating the virtual objects more convenient and improving the user's experience while participating in AR interaction.
In a possible implementation manner, a second virtual object is also displayed in the AR picture, and a preset relative pose relationship is formed between the second virtual object and the AR device;
a switching module 302, configured to detect that any of the first virtual objects satisfies the AR special effect switching condition according to the following steps:
detecting running track information of the second virtual object or of an associated sub-object of the second virtual object;
and detecting that any first virtual object meets the AR special effect switching condition based on the running track information and the display pose of at least one first virtual object in the AR picture.
In a possible implementation manner, the switching module 302 is configured to detect that any first virtual object satisfies the AR special effect switching condition based on the trajectory information and the display poses of the at least one first virtual object in the AR picture, respectively, according to the following steps:
and if the track point in the running track information falls into the preset position range corresponding to any first virtual object, detecting that any first virtual object meets the AR special effect switching condition.
In a possible implementation, the switching module 302 is configured to detect running track information of the second virtual object or of an associated sub-object of the second virtual object according to the following steps:
acquiring control parameter information corresponding to target control operation;
and determining the running track information of the second virtual object or the associated sub-object according to the display pose of the second virtual object in the AR picture and the determined control parameter information.
In one possible implementation, the presentation module 301 is configured to determine a presentation pose of the second virtual object in the AR picture according to the following manner:
determining an initial positioning pose of the AR equipment based on a real scene image shot by the AR equipment and a pre-constructed three-dimensional scene map;
determining the real-time positioning pose of the AR equipment through real-time positioning and map building SLAM based on the initial positioning pose of the AR equipment;
and determining the display pose of the second virtual object in the AR picture based on the real-time positioning pose of the AR equipment and the preset relative pose relationship between the second virtual object and the AR equipment in the three-dimensional scene map.
In one possible implementation, the switching module 302 is configured to detect a target control operation on the AR device according to the following steps:
detecting a target trigger operation acting on a target virtual object displayed on the AR device;
wherein the target virtual object comprises at least one of:
any first virtual object; the second virtual object; a trigger button.
In one possible implementation, the switching module 302 is configured to detect a target trigger operation acting on a target virtual object presented on the AR device according to the following steps:
responding to a trigger operation acted on the screen of the AR equipment, and determining a screen coordinate position corresponding to the trigger operation;
converting the screen coordinate position to a camera coordinate position in a camera coordinate system based on the determined screen coordinate position and a first conversion relationship between the screen coordinate system and the camera coordinate system;
converting the camera coordinate position to a world coordinate system based on the converted camera coordinate position and a second conversion relation between the camera coordinate system and the world coordinate system to obtain a world coordinate position;
and under the condition that the world coordinate position falls into the position range corresponding to the target virtual object, determining that the target trigger operation acting on the target virtual object displayed on the AR equipment is detected.
In one possible implementation, the presentation module 301 is configured to present a first AR special effect of a first virtual object in an AR picture according to the following steps:
and under the condition that the AR equipment reaches the target interaction area indicated by the three-dimensional scene map based on the determined real-time positioning pose of the AR equipment and the pre-constructed three-dimensional scene map, displaying a first AR special effect of the first virtual object matched with the real-time positioning pose of the AR equipment in an AR picture.
In one possible implementation, the presentation module 301 is configured to present a first AR special effect of a first virtual object matching a real-time positioning pose of an AR device in an AR picture according to the following steps:
determining a target AR special effect data packet corresponding to the current interaction stage according to AR special effect data packets of the first virtual object respectively corresponding to different preset interaction stages;
and displaying a first AR special effect of the first virtual object matched with the real-time positioning pose of the AR equipment in the AR picture based on the target AR special effect data packet.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
An embodiment of the present disclosure further provides an electronic device, as shown in fig. 4, which is a schematic structural diagram of the electronic device provided in the embodiment of the present disclosure, and the electronic device includes: a processor 401, a memory 402, and a bus 403. The memory 402 stores machine-readable instructions executable by the processor 401 (for example, execution instructions corresponding to the presentation module 301 and the switching module 302 in the interactive apparatus in the AR scenario in fig. 3, and the like), when the electronic device is operated, the processor 401 and the memory 402 communicate through the bus 403, and when the machine-readable instructions are executed by the processor 401, the following processes are performed:
displaying an AR picture matched with the real scene image on the AR equipment based on the real scene image shot by the AR equipment; a first AR special effect of at least one first virtual object is displayed in the AR picture;
and under the condition that target control operation on the AR equipment is detected and any one first virtual object meets the AR special effect switching condition, displaying a second AR special effect after any one first virtual object is switched in the AR picture.
In a possible implementation manner, a second virtual object is also displayed in the AR picture, and a preset relative pose relationship is formed between the second virtual object and the AR device; the instructions executed by the processor 401, detecting that any one of the first virtual objects satisfies the AR special effect switching condition, includes:
detecting running track information of the second virtual object or of an associated sub-object of the second virtual object;
and detecting that any first virtual object meets the AR special effect switching condition based on the running track information and the display pose of at least one first virtual object in the AR picture.
In a possible implementation manner, the instructions executed by the processor 401, detecting that any first virtual object satisfies the AR special effect switching condition based on the running track information and the display poses of the at least one first virtual object in the AR pictures, includes:
and if the track point in the running track information falls into the preset position range corresponding to any first virtual object, detecting that any first virtual object meets the AR special effect switching condition.
In a possible implementation manner, the instructions executed by the processor 401 for detecting running track information of the second virtual object or of an associated sub-object of the second virtual object include:
acquiring control parameter information corresponding to target control operation;
and determining the running track information of the second virtual object or the associated sub-object according to the display pose of the second virtual object in the AR picture and the determined control parameter information.
In a possible implementation manner, the processor 401 executes instructions to determine the display pose of the second virtual object in the AR picture according to the following manner:
determining an initial positioning pose of the AR equipment based on a real scene image shot by the AR equipment and a pre-constructed three-dimensional scene map;
determining the real-time positioning pose of the AR equipment through real-time positioning and map building SLAM based on the initial positioning pose of the AR equipment;
and determining the display pose of the second virtual object in the AR picture based on the real-time positioning pose of the AR equipment and the preset relative pose relationship between the second virtual object and the AR equipment in the three-dimensional scene map.
In a possible implementation, the instructions executed by the processor 401 to detect a target control operation on the AR device includes:
detecting a target trigger operation acting on a target virtual object displayed on the AR device;
wherein the target virtual object comprises at least one of:
any first virtual object; the second virtual object; a trigger button.
In one possible implementation, the processor 401 executes instructions to detect a target trigger operation acting on a target virtual object presented on the AR device according to the following steps:
responding to a trigger operation acted on the screen of the AR equipment, and determining a screen coordinate position corresponding to the trigger operation;
converting the screen coordinate position to a camera coordinate position in a camera coordinate system based on the determined screen coordinate position and a first conversion relationship between the screen coordinate system and the camera coordinate system;
converting the camera coordinate position to a world coordinate system based on the converted camera coordinate position and a second conversion relation between the camera coordinate system and the world coordinate system to obtain a world coordinate position;
and under the condition that the world coordinate position falls into the position range corresponding to the target virtual object, determining that the target trigger operation acting on the target virtual object displayed on the AR equipment is detected.
In one possible implementation, the processor 401 executes instructions to display a first AR special effect of a first virtual object in the AR picture according to the following steps:
when it is determined, based on the real-time positioning pose of the AR device and the pre-constructed three-dimensional scene map, that the AR device has reached the target interaction area indicated by the three-dimensional scene map, displaying in the AR picture a first AR special effect of the first virtual object matching the real-time positioning pose of the AR device.
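As a non-limiting sketch of this area check, assuming the target interaction area is stored in the three-dimensional scene map as an axis-aligned box (illustrative names):

    import numpy as np

    def reached_target_area(device_pose_world, area_min, area_max):
        # device_pose_world: 4x4 real-time positioning pose; its translation
        # column gives the device position in the scene-map frame.
        position = device_pose_world[:3, 3]
        return bool(np.all(position >= area_min) and np.all(position <= area_max))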
In a possible implementation, the instructions executed by the processor 401 for displaying, in the AR picture, a first AR special effect of the first virtual object matching the real-time positioning pose of the AR device include:
determining a target AR special effect data packet corresponding to the current interaction stage from the AR special effect data packets of the first virtual object that correspond to different preset interaction stages;
displaying, based on the target AR special effect data packet, the first AR special effect of the first virtual object matching the real-time positioning pose of the AR device in the AR picture.
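The stage-to-packet selection is, in effect, a keyed lookup. A hedged sketch, assuming the packets are pre-loaded into a mapping keyed by interaction stage (names are illustrative):

    def select_effect_packet(stage_packets, current_stage):
        # stage_packets: dict mapping each preset interaction stage to the AR
        # special effect data packet of the first virtual object for that stage.
        packet = stage_packets.get(current_stage)
        if packet is None:
            raise KeyError(f"no AR special effect data packet for stage {current_stage!r}")
        return packet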
For the specific execution process of these instructions, reference may be made to the steps of the interaction method in the AR scene described in the embodiments of the present disclosure; details are not repeated here.
The embodiments of the present disclosure also provide a computer-readable storage medium storing a computer program that, when executed by a processor, performs the steps of the interaction method in the AR scene described in the foregoing method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the interaction method in the AR scene provided in the embodiments of the present disclosure includes a computer-readable storage medium storing program code. The instructions included in the program code may be used to perform the steps of the interaction method in the AR scene described in the foregoing method embodiments; for details, reference may be made to those embodiments, which are not repeated here.
The embodiments of the present disclosure also provide a computer program which, when executed by a processor, implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium; in another alternative embodiment, it is embodied in a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, for the specific working processes of the system and apparatus described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division of the units is only a logical division, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through communication interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such an understanding, the technical solutions of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present disclosure. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that anyone familiar with the art may, within the technical scope of the present disclosure, modify the technical solutions described in the foregoing embodiments or readily conceive of changes to them, or make equivalent substitutions for some of their technical features; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present disclosure, and shall all be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (12)

1. An interaction method in an Augmented Reality (AR) scene, the method comprising:
displaying, on an AR device, an AR picture matching a real scene image based on the real scene image captured by the AR device, wherein a first AR special effect of at least one first virtual object is displayed in the AR picture;
when a target control operation on the AR device is detected and any one of the first virtual objects is detected to meet an AR special effect switching condition, displaying, in the AR picture, a second AR special effect to which that first virtual object is switched.
2. The method according to claim 1, wherein a second virtual object is also displayed in the AR picture, and the second virtual object has a preset relative pose relationship with the AR device;
the detecting that any one of the first virtual objects meets the AR special effect switching condition includes:
detecting running track information of the second virtual object or of an associated sub-object of the second virtual object;
detecting, based on the running track information and the respective display poses of the at least one first virtual object in the AR picture, that any one of the first virtual objects meets the AR special effect switching condition.
3. The method according to claim 2, wherein the detecting, based on the running track information and the respective display poses of the at least one first virtual object in the AR picture, that any one of the first virtual objects satisfies the AR special effect switching condition comprises:
if a track point in the running track information falls within a preset position range corresponding to any one of the first virtual objects, determining that this first virtual object meets the AR special effect switching condition.
4. The method according to claim 2 or 3, wherein the detecting running track information of the second virtual object or of an associated sub-object of the second virtual object comprises:
acquiring control parameter information corresponding to the target control operation;
determining the running track information of the second virtual object or of the associated sub-object according to the display pose of the second virtual object in the AR picture and the acquired control parameter information.
5. The method according to claim 4, wherein the display pose of the second virtual object in the AR picture is determined as follows:
determining an initial positioning pose of the AR device based on a real scene image captured by the AR device and a pre-constructed three-dimensional scene map;
determining the real-time positioning pose of the AR device through simultaneous localization and mapping (SLAM), starting from the initial positioning pose of the AR device;
determining the display pose of the second virtual object in the AR picture based on the real-time positioning pose of the AR device and the preset relative pose relationship between the second virtual object and the AR device in the three-dimensional scene map.
6. The interaction method according to any one of claims 2 to 5, wherein the detecting a target control operation on the AR device comprises:
detecting a target trigger operation acting on a target virtual object displayed on the AR device;
wherein the target virtual object comprises at least one of:
any one of the first virtual objects; the second virtual object; a trigger button.
7. The interaction method according to claim 6, wherein a target trigger operation acting on a target virtual object displayed on the AR device is detected according to the following steps:
in response to a trigger operation acting on the screen of the AR device, determining a screen coordinate position corresponding to the trigger operation;
converting the screen coordinate position to a camera coordinate position in a camera coordinate system based on the determined screen coordinate position and a first conversion relationship between the screen coordinate system and the camera coordinate system;
converting the camera coordinate position to a world coordinate system based on the converted camera coordinate position and a second conversion relation between the camera coordinate system and the world coordinate system to obtain a world coordinate position;
when the world coordinate position falls within the position range corresponding to the target virtual object, determining that the target trigger operation acting on the target virtual object displayed on the AR device is detected.
8. The interaction method according to any one of claims 1 to 7, wherein the first AR special effect of the first virtual object is displayed in the AR picture according to the following steps:
when it is determined, based on the real-time positioning pose of the AR device and the pre-constructed three-dimensional scene map, that the AR device has reached a target interaction area indicated by the three-dimensional scene map, displaying, in the AR picture, a first AR special effect of a first virtual object matching the real-time positioning pose of the AR device.
9. The interaction method according to claim 8, wherein displaying, in the AR picture, a first AR special effect of a first virtual object matching the real-time positioning pose of the AR device comprises:
determining a target AR special effect data packet corresponding to the current interaction stage from the AR special effect data packets of the first virtual object that correspond to different preset interaction stages;
displaying, based on the target AR special effect data packet, the first AR special effect of the first virtual object matching the real-time positioning pose of the AR device in the AR picture.
10. An interaction apparatus in an Augmented Reality (AR) scene, the apparatus comprising:
a display module configured to display, on an AR device, an AR picture matching a real scene image based on the real scene image captured by the AR device, wherein a first AR special effect of at least one first virtual object is displayed in the AR picture;
a switching module configured to display, in the AR picture, a second AR special effect to which any one first virtual object is switched when a target control operation on the AR device is detected and that first virtual object is detected to meet an AR special effect switching condition.
11. An electronic device, comprising: a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the interaction method in the Augmented Reality (AR) scene according to any one of claims 1 to 9.
12. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by an electronic device, causes the electronic device to perform the steps of the interaction method in the Augmented Reality (AR) scene according to any one of claims 1 to 9.
CN202011011073.2A 2020-09-23 2020-09-23 Interaction method and device in AR scene, electronic equipment and storage medium Pending CN112148189A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011011073.2A CN112148189A (en) 2020-09-23 2020-09-23 Interaction method and device in AR scene, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011011073.2A CN112148189A (en) 2020-09-23 2020-09-23 Interaction method and device in AR scene, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112148189A true CN112148189A (en) 2020-12-29

Family

ID=73897881

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011011073.2A Pending CN112148189A (en) 2020-09-23 2020-09-23 Interaction method and device in AR scene, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112148189A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107025662A (en) * 2016-01-29 2017-08-08 成都理想境界科技有限公司 A kind of method for realizing augmented reality, server, terminal and system
CN109345581A (en) * 2018-07-30 2019-02-15 中国科学院自动化研究所 Augmented reality method, apparatus and system based on more mesh cameras
US20200126313A1 (en) * 2018-10-23 2020-04-23 Disney Enterprises, Inc. Distorted view augmented reality
CN111610998A (en) * 2020-05-26 2020-09-01 北京市商汤科技开发有限公司 AR scene content generation method, display method, device and storage medium
CN111638793A (en) * 2020-06-04 2020-09-08 浙江商汤科技开发有限公司 Aircraft display method and device, electronic equipment and storage medium
CN111617471A (en) * 2020-06-08 2020-09-04 浙江商汤科技开发有限公司 Virtual shooting display method and device, electronic equipment and storage medium
CN111640197A (en) * 2020-06-09 2020-09-08 上海商汤智能科技有限公司 Augmented reality AR special effect control method, device and equipment

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022156367A1 (en) * 2021-01-21 2022-07-28 北京字跳网络技术有限公司 Data generation control method and apparatus, and electronic device and storage medium
CN112905014A (en) * 2021-02-26 2021-06-04 北京市商汤科技开发有限公司 Interaction method and device in AR scene, electronic equipment and storage medium
CN112882576A (en) * 2021-02-26 2021-06-01 北京市商汤科技开发有限公司 AR interaction method and device, electronic equipment and storage medium
CN113407267B (en) * 2021-05-07 2023-01-06 上海纽盾科技股份有限公司 AR auxiliary data processing method, device and system in equal insurance evaluation
CN113407267A (en) * 2021-05-07 2021-09-17 上海纽盾科技股份有限公司 AR auxiliary data processing method, device and system in equal insurance evaluation
CN113194329A (en) * 2021-05-10 2021-07-30 广州繁星互娱信息科技有限公司 Live broadcast interaction method, device, terminal and storage medium
CN113473019A (en) * 2021-07-01 2021-10-01 北京字跳网络技术有限公司 Image processing method, device, equipment and storage medium
CN113721804A (en) * 2021-08-20 2021-11-30 北京市商汤科技开发有限公司 Display method, display device, electronic equipment and computer readable storage medium
CN114155605A (en) * 2021-12-03 2022-03-08 北京字跳网络技术有限公司 Control method, control device and computer storage medium
CN114155605B (en) * 2021-12-03 2023-09-15 北京字跳网络技术有限公司 Control method, device and computer storage medium
WO2023124691A1 (en) * 2021-12-31 2023-07-06 上海商汤智能科技有限公司 Display of augmented reality scene
WO2023138559A1 (en) * 2022-01-21 2023-07-27 北京字跳网络技术有限公司 Virtual reality interaction method and apparatus, and device and storage medium
WO2023174097A1 (en) * 2022-03-15 2023-09-21 北京字跳网络技术有限公司 Interaction method and apparatus, device and computer-readable storage medium
CN116020122A (en) * 2023-03-24 2023-04-28 深圳游禧科技有限公司 Game attack recommendation method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN112148189A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN111551188B (en) Navigation route generation method and device
CN111638793B (en) Display method and device of aircraft, electronic equipment and storage medium
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
CN111694430A (en) AR scene picture presentation method and device, electronic equipment and storage medium
CN109671141B (en) Image rendering method and device, storage medium and electronic device
CN112198959A (en) Virtual reality interaction method, device and system
CN112148125A (en) AR interaction state control method, device, equipment and storage medium
CN111679742A (en) Interaction control method and device based on AR, electronic equipment and storage medium
CN111638797A (en) Display control method and device
CN110545442A (en) live broadcast interaction method and device, electronic equipment and readable storage medium
CN114363689B (en) Live broadcast control method and device, storage medium and electronic equipment
CN112905014A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN112637665B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
CN111882674A (en) Virtual object adjusting method and device, electronic equipment and storage medium
CN111651057A (en) Data display method and device, electronic equipment and storage medium
CN106536004B (en) enhanced gaming platform
CN111667588A (en) Person image processing method, person image processing device, AR device and storage medium
CN112882576A (en) AR interaction method and device, electronic equipment and storage medium
CN110544315B (en) Virtual object control method and related equipment
CN111651052A (en) Virtual sand table display method and device, electronic equipment and storage medium
Bikos et al. An interactive augmented reality chess game using bare-hand pinch gestures
Lee et al. A development of virtual reality game utilizing kinect, oculus rift and smartphone
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
CN113359983A (en) Augmented reality data presentation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication (application publication date: 2020-12-29)
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication