CN113325946A - Virtual gift interaction method based on augmented reality and related device


Info

Publication number
CN113325946A
CN113325946A
Authority
CN
China
Prior art keywords
interface
client
image
virtual gift
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010129769.9A
Other languages
Chinese (zh)
Inventor
何碧莹
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010129769.9A
Publication of CN113325946A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 — Multiprogramming arrangements
    • G06F 9/54 — Interprogram communication
    • G06F 9/542 — Event management; Broadcasting; Multicasting; Notifications
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 — Manipulating 3D models or images for computer graphics
    • G06T 19/006 — Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses an augmented reality-based virtual gift interaction method and a related device. The method comprises the following steps: a first client receives an object scanning instruction; the first client displays a prompt message and an image to be recognized on an object recognition interface according to the object scanning instruction; if the image to be recognized is successfully matched with the prompt message, the first client generates an augmented reality animation corresponding to the virtual gift according to the image to be recognized and an animation playing effect; and the first client plays the augmented reality animation corresponding to the virtual gift on the image to be recognized. The application fuses the virtual gift with the real scene, which increases the interest and flexibility of gift interaction; it further fuses the to-be-played media information set by the gift giver with the real scene to generate the augmented reality animation, and plays that animation on the client used by the gift receiver, thereby improving the interactivity between the gift giver and the gift receiver.

Description

Virtual gift interaction method based on augmented reality and related device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a virtual gift interaction method and a related apparatus based on augmented reality.
Background
Augmented Reality (AR) technology calculates the position and angle of a camera image in real time and superimposes corresponding images, videos, and 3D models on it. As the computing capability of the Central Processing Unit (CPU) of terminal devices improves, AR technology is used more and more widely.
Currently, in an instant messaging application, a user may present a virtual gift to a friend. For example, user A enters a gift presenting interface and then selects a virtual gift to be presented (e.g., gift a) and a user to receive it (e.g., user B); after receiving a prompt message, user B clicks the prompt message to receive gift a.
However, the current manner of receiving a virtual gift is monotonous: the gift receiver can obtain the virtual gift only by interacting with the terminal interface, which does little to improve interactivity between users.
Disclosure of Invention
The embodiment of the application provides a virtual gift interaction method and a related device based on augmented reality, which can fuse a virtual gift with a real scene, increase the interestingness and flexibility of gift interaction, further fuse to-be-played media information set by a gift giver with the real scene, generate an augmented reality animation, and play the augmented reality animation on a client used by a gift receiver, thereby improving the interactivity between the gift giver and the gift receiver.
In view of this, an embodiment of the present application provides an augmented reality-based virtual gift interaction method, including:
the method comprises the steps that a first client receives an object scanning instruction, wherein the object scanning instruction is used for starting an image acquisition device;
the method comprises the steps that a first client displays a prompt message and an image to be recognized on an object recognition interface according to an object scanning instruction, wherein the prompt message is generated based on a virtual gift, and the image to be recognized is an image collected through an image collecting device;
if the image to be recognized is successfully matched with the prompt message, the first client generates an augmented reality animation corresponding to the virtual gift according to the image to be recognized and an animation playing effect;
and the first client plays the augmented reality animation corresponding to the virtual gift on the image to be recognized.
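The four receiver-side steps above can be sketched minimally as follows. This is an illustration under assumptions, not the patent's implementation; `FirstClient`, `FakeCamera`, and all field names are hypothetical.

```python
class FakeCamera:
    """Stand-in for the image acquisition device (assumed interface)."""
    def __init__(self):
        self.started = False

    def start(self):
        self.started = True

    def capture(self):
        return "frame-001"


class FirstClient:
    def __init__(self, camera):
        self.camera = camera
        self.interface = {}

    def receive_scan_instruction(self):
        # Step 1: the object scanning instruction starts the camera.
        self.camera.start()

    def show_recognition_interface(self, prompt):
        # Step 2: display the prompt message and the image to be recognized.
        self.interface = {"prompt": prompt, "image": self.camera.capture()}

    def play_if_matched(self, recognized_object, effect):
        # Steps 3-4: on a successful match, generate the AR animation from
        # the image and the animation playing effect, then play it over the
        # image to be recognized.
        if recognized_object != self.interface["prompt"]["gift"]:
            return None
        return {"base": self.interface["image"], "effect": effect, "state": "playing"}


client = FirstClient(FakeCamera())
client.receive_scan_instruction()
client.show_recognition_interface({"gift": "cola", "text": "find the cola"})
print(client.play_if_matched("cola", "fireworks"))
```

A mismatched object (e.g. passing `"pizza"`) returns `None`, modeling the case where the image does not match the prompt message.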
The embodiment of the application provides another virtual gift interaction method based on augmented reality, which comprises the following steps:
the second client receives an object selection instruction, wherein the object selection instruction carries an identifier of the virtual gift;
the second client displays the virtual gift on the object setting interface according to the object selection instruction;
the second client receives an object selection instruction through an object selection interface provided by the object setting interface, wherein the object selection instruction carries a receiver identifier of the virtual gift;
the second client receives the object transmission instruction and sends the virtual gift to the first client according to the object transmission instruction, wherein the object transmission instruction at least carries an identifier of the virtual gift and an identifier of a receiver of the virtual gift, and the identifier of the receiver of the virtual gift has a corresponding relation with the first client.
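The sender-side steps just listed can likewise be sketched. The dict keys and the routing table are assumptions for illustration only, not the patent's data format.

```python
def handle_object_selection(gift_id):
    # The object selection instruction carries the identifier of the virtual
    # gift, which the client then displays on the object setting interface.
    return {"gift_id": gift_id, "displayed": True}

def build_transmission_instruction(gift_id, recipient_id):
    # The object transmission instruction carries at least the gift identifier
    # and the recipient identifier.
    return {"gift_id": gift_id, "recipient_id": recipient_id}

def send_to_first_client(instruction, route):
    # route models the claimed correspondence between the recipient identifier
    # and the first client.
    return route[instruction["recipient_id"]], instruction["gift_id"]

setting_interface = handle_object_selection("gift-cola")
instruction = build_transmission_instruction("gift-cola", "user-b")
print(send_to_first_client(instruction, {"user-b": "first-client-01"}))
```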
An embodiment of the present application provides an augmented reality-based virtual gift interaction apparatus, comprising:
the receiving module is used for receiving an object scanning instruction, wherein the object scanning instruction is used for starting the image acquisition device;
the display module is used for displaying a prompt message and an image to be recognized on an object recognition interface according to an object scanning instruction, wherein the prompt message is a message generated based on a virtual gift, and the image to be recognized is an image acquired by an image acquisition device;
the generating module is used for generating an augmented reality animation corresponding to the virtual gift according to the image to be recognized and the animation playing effect if the image to be recognized is successfully matched with the prompt message;
and the playing module is used for playing the augmented reality animation corresponding to the virtual gift on the image to be recognized.
In one possible design, the embodiments of the present application further include,
the receiving module is specifically used for displaying a conversation prompting interface, wherein the conversation prompting interface is used for providing a scanning interface;
and receiving an object scanning command through a scanning interface.
In one possible design, the embodiments of the present application further include,
the display module is specifically used for displaying the prompt message and the identification area on the object identification interface according to the object scanning instruction;
and displaying a target image on the object recognition interface according to the object scanning instruction, and highlighting the image to be recognized in the recognition area, wherein the image to be recognized belongs to one part of the target image.
In one possible design, the client in the embodiment of the present application further includes a sending module, a determining module, and a shooting module;
the device comprises a sending module, a recognition module and a processing module, wherein the sending module is used for sending an image to be recognized to a server so as to enable the server to recognize the image to be recognized based on a stored image set to obtain a recognition result, and the stored image set comprises at least one stored image;
the determining module is used for determining that the image to be recognized is successfully matched with the prompt message by the first client if the virtual gift is recognized in the recognition result;
and the shooting module is used for continuously shooting the real-time image through the image acquisition device by the first client side if the virtual gift is not identified in the identification result, and sending the real-time image to the server so as to enable the server to identify the real-time image.
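The recognize-or-keep-shooting behavior described in this design can be sketched as a retry loop: the client submits the first image to the server, and if the gift is not recognized, continues submitting live frames. The fake server and frame names are assumptions.

```python
def recognize_with_retry(first_frame, live_frames, server_recognize, gift):
    """Return (matched_frame, attempts), or (None, attempts) if no frame matches."""
    attempts = 0
    for frame in [first_frame, *live_frames]:
        attempts += 1
        if server_recognize(frame) == gift:
            return frame, attempts
    return None, attempts

# A stand-in server that matches against a stored image set, as described above.
stored_image_set = {"frame-cola": "cola"}
recognize = stored_image_set.get

print(recognize_with_retry("frame-x", ["frame-y", "frame-cola"], recognize, "cola"))
```

In a real deployment the loop would be bounded by the user closing the camera interface rather than by a fixed frame list.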
In one possible design, the embodiments of the present application further include,
the generating module is specifically used for acquiring the media information to be played, wherein the media information to be played is set by the second client;
and generating the augmented reality animation according to the image to be recognized, the animation playing effect and the media information to be played.
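The generating module above combines three inputs into one animation: the recognized image, the animation playing effect, and the to-be-played media information set by the second client. A minimal sketch, with an assumed dict shape:

```python
def build_ar_animation(image, effect, media_to_play=None):
    """Compose an AR animation descriptor from the three inputs named above."""
    animation = {"base_image": image, "effect": effect}
    if media_to_play is not None:
        # e.g. a picture, video, or text chosen by the gift sender (UGC)
        animation["media"] = media_to_play
    return animation

print(build_ar_animation("frame-001", "fireworks", {"type": "video", "id": "v-42"}))
```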
In one possible design, the embodiments of the present application further include,
the receiving module is also used for receiving an interface return instruction;
the display module is also used for displaying a conversation prompt interface according to the interface return instruction, wherein the conversation prompt interface is used for providing an information viewing interface and an animation reproduction interface;
and the display module is also used for displaying the image corresponding to the virtual gift on the conversation prompt interface.
In one possible design, the embodiments of the present application further include,
the receiving module is also used for receiving an information viewing instruction through the information viewing interface;
and the display module is also used for displaying a gift detail interface according to the information viewing instruction, wherein the gift detail interface is used for displaying at least one of sender information, receiver information and blessing message information of the virtual gift.
In one possible design, the embodiments of the present application further include,
the receiving module is also used for receiving the animation reproduction instruction through the animation reproduction interface;
the display module is also used for displaying the object identification interface according to the animation reproduction instruction;
and the display module is also used for displaying the prompt message and the real-time image shot by the image acquisition device on the object identification interface.
An embodiment of the present application provides another augmented reality-based virtual gift interaction apparatus, comprising:
the receiving module is used for receiving an object selection instruction, wherein the object selection instruction carries an identifier of the virtual gift;
the display module is used for displaying the virtual gift on the object setting interface according to the object selection instruction;
the receiving module is further used for receiving an object selection instruction through an object selection interface provided by the object setting interface, wherein the object selection instruction carries a receiver identifier of the virtual gift;
the receiving module is further used for receiving an object transmission instruction and sending the virtual gift to the first client according to the object transmission instruction, wherein the object transmission instruction at least carries an identifier of the virtual gift and an identifier of a receiver of the virtual gift, and the identifier of the receiver of the virtual gift has a corresponding relation with the first client.
In a possible design, the client in the embodiment of the present application further includes an obtaining module;
the object setting interface comprises a date setting interface, when the object transmission instruction carries target date information,
the acquisition module is used for acquiring target date information through a date setting interface, wherein the target date information is the date when the first client side receives the virtual gift;
or the object setting interface comprises a blessing words filling area, when the object transmission instruction carries the blessing words information,
the obtaining module is further used for obtaining blessing message through the blessing message filling area, wherein the blessing message is displayed on the gift detail interface by the first client;
or the object setting interface comprises a package selection interface, when the object transmission instruction carries the identification of the target package,
the receiving module is further used for receiving a package selection instruction through the package selection interface, wherein the package selection instruction carries an identifier of a target package, the identifier of the target package and the target package have a unique corresponding relation, and the target package is a package image displayed by the first client when receiving the virtual gift;
or the object setting interface also comprises an effect selection interface, when the object transmission instruction also carries an identification of the animation playing effect,
the receiving module is further used for receiving an effect selection instruction through the effect selection interface, wherein the effect selection instruction carries an identification of an animation playing effect, the identification of the animation playing effect corresponds to the animation playing effect, and the animation playing effect is used for the first client to generate the augmented reality animation.
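The optional settings enumerated in this design suggest a data model in which the gift and recipient identifiers are mandatory while date, blessing, package, and animation effect may each be absent. The field names below are assumptions, not the patent's schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TransmissionInstruction:
    gift_id: str
    recipient_id: str
    target_date: Optional[str] = None   # date the first client receives the gift
    blessing: Optional[str] = None      # shown on the gift detail interface
    package_id: Optional[str] = None    # package image shown on receipt
    effect_id: Optional[str] = None     # effect used to generate the AR animation

inst = TransmissionInstruction("gift-cola", "user-b", blessing="Happy birthday!")
print(inst.effect_id is None, inst.blessing)
```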
In one possible design, the client according to the embodiment of the present application further includes a determining module and a sending module;
the object setting interface also comprises a media information adding interface;
the receiving module is also used for receiving a media adding instruction through the media information adding interface;
the display module is also used for displaying a media information selection interface according to the media adding instruction, wherein the media information selection interface displays at least one piece of media information;
the receiving module is also used for receiving a media uploading instruction;
the determining module is used for determining the media information to be played from at least one piece of media information according to the media uploading instruction;
and the sending module is used for sending the media information to be played to the server when the second client sends the object transmission instruction to the server, so that the server stores the media information to be played, and the media information to be played is used for the first client to generate the augmented reality animation.
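The media-adding flow just described can be sketched as: the user picks one item from the media information selection interface, and the client sends it to the server alongside the object transmission instruction. All names here are hypothetical.

```python
def choose_media_to_play(media_list, upload_index):
    # The media uploading instruction determines the media information to be
    # played from the at least one displayed item.
    return media_list[upload_index]

def send_with_media(instruction, media):
    # The server stores the media so the first client can later use it when
    # generating the augmented reality animation.
    return {"instruction": instruction, "media_to_play": media}

media = choose_media_to_play([{"id": "pic-1"}, {"id": "vid-2"}], 1)
print(send_with_media({"gift_id": "gift-cola"}, media))
```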
An embodiment of the present application provides a terminal device, including: a memory, a transceiver, a processor, and a bus system;
wherein, the memory is used for storing programs;
a processor for executing the program in the memory, the processor for performing the above-described aspects of the method according to instructions in the program code;
the bus system is used for connecting the memory and the processor so as to enable the memory and the processor to communicate.
The present application provides a computer-readable storage medium having stored therein instructions, which when executed on a computer, cause the computer to perform the methods of the above-described aspects.
According to the technical scheme, the embodiment of the application has the following advantages:
In an embodiment of the application, an augmented reality-based virtual gift interaction method is provided. A first client receives an object scanning instruction, which is used for starting an image acquisition device. The first client then displays a prompt message and an image to be recognized on an object recognition interface according to the object scanning instruction, where the prompt message is a message generated based on a virtual gift and the image to be recognized is an image acquired through the image acquisition device. When the image to be recognized is successfully matched with the prompt message, the first client generates an augmented reality animation corresponding to the virtual gift according to the image to be recognized and an animation playing effect. Finally, the first client plays the augmented reality animation corresponding to the virtual gift on the image to be recognized. In this manner, the virtual gift is fused with the real scene, which increases the interest and flexibility of gift interaction; meanwhile, the to-be-played media information set by the gift giver is fused with the real scene to generate the augmented reality animation, which is played on the client used by the gift receiver, thereby improving the interactivity between the gift giver and the gift receiver.
Drawings
Fig. 1 is a schematic diagram of an architecture of a virtual gift interactive system in an embodiment of the present application;
FIG. 2 is a schematic diagram of an embodiment of an augmented reality-based virtual gift interaction method in an embodiment of the present application;
FIG. 3 is a schematic diagram of an interface for a virtual gift notification message in an embodiment of the present application;
FIG. 4 is a schematic view of an object recognition interface in an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating AR animation displayed through an object recognition interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a session prompt interface in an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating an identification area and a prompt message displayed via an object identification interface in an embodiment of the application;
FIG. 8 is a schematic flow chart illustrating a method for virtual gift interaction based on augmented reality according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an interface for generating an AR animation from the media information to be played in an embodiment of the present application;
FIG. 10 is another illustration of a session prompt interface in an embodiment of the present application;
FIG. 11 is a schematic illustration of a gift details interface in an embodiment of the present application;
FIG. 12 is a schematic flow chart illustrating a method for virtual gift interaction based on augmented reality according to an embodiment of the present application;
FIG. 13 is a schematic illustration of a reproduced object recognition interface in an embodiment of the present application;
FIG. 14 is a schematic diagram of another embodiment of an augmented reality-based virtual gift interaction method in an embodiment of the present application;
FIG. 15 is a schematic illustration of an interface for selecting a virtual gift in an embodiment of the present application;
fig. 16 is a schematic diagram of an interface for selecting a recipient of a virtual gift in an embodiment of the present application;
FIG. 17 is a schematic flow chart illustrating a method for virtual gift interaction based on augmented reality according to an embodiment of the present application;
FIG. 18 is a schematic illustration of an interface after triggering an object transfer command in an embodiment of the present application;
FIG. 19 is a schematic view of an object placement interface in accordance with an embodiment of the present application;
FIG. 20 is a schematic diagram of an interface for adding media information to be played according to an embodiment of the present application;
FIG. 21 is a schematic diagram of an embodiment of an augmented reality based virtual gift interaction apparatus according to an embodiment of the present application;
FIG. 22 is a schematic diagram of an embodiment of another augmented reality-based virtual gift interaction apparatus according to an embodiment of the present application;
fig. 23 is a schematic structural diagram of a terminal device in the embodiment of the present application.
Detailed Description
The embodiment of the application provides a virtual gift interaction method based on augmented reality and a related device, which are used for fusing a virtual gift with a real scene, increasing the interestingness and flexibility of gift interaction, fusing to-be-played media information set by a gift giver with the real scene, generating an augmented reality animation, and playing the augmented reality animation on a client used by a gift receiver, thereby improving the interactivity between the gift giver and the gift receiver.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "corresponding" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that the present application may be applied to scenes in which virtual gift interaction is performed based on Augmented Reality (AR) technology. AR technology determines the position and angle of a camera image in real time and applies image analysis: by means of multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, sensing, and other techniques, computer-generated virtual information such as text, images, three-dimensional models, music, and video is simulated and applied to the real world, with the two kinds of information complementing each other to achieve "augmentation" of the real world. In other words, AR technology combines the virtual world on a screen with a real-world scene and lets them interact. As the CPU computing capability of terminal devices improves, AR technology is used more and more widely.
Specifically, taking an instant messaging application as an example, the client of a virtual gift sender may enter a virtual gift mall to select an AR gift to be sent, choose a prompt and an object to scan, and add additional User Generated Content (UGC), where UGC denotes original content selected or generated by the sender, such as the sender's pictures, videos, and text. After the virtual gift receiver is notified of the gift through the client, the image acquisition device collects images; once the corresponding object is recognized, a preset animation is played in the scene in real time, combined with the scanned object, and the UGC is displayed. For the receiver, the virtual gift is fused with the real scene, which increases the interest and flexibility of the interaction between sender and receiver; the UGC set by the sender is fused with the real scene to generate an augmented reality animation that is played on the receiver's client, thereby further improving the interactivity between the gift giver and the gift receiver.
In order to improve the interactivity between the gift-giving party and the gift-receiving party, the present application provides an augmented reality-based virtual gift interaction method applied to the virtual gift interaction system shown in fig. 1. Referring to fig. 1, which is an architecture diagram of the virtual gift interaction system in an embodiment of the present application, the system includes a server, a first client, and a second client, where the first client is used by the virtual gift receiver and the second client by the virtual gift sender. Specifically, user A clicks an AR gift in the mall list through the second client, and the second client generates and renders the detail page according to the transmitted information. User A clicks "gift friend", whereupon the second client requests friend list data from the server; the server queries the database and returns the friend list data, and the second client generates and renders the friend list. User A then selects the friend who will receive the gift, and the second client destroys the friend list page and updates the detail page data. User A continues to add media information such as pictures or videos; the second client invokes the camera and album functions of the native system, user A selects or shoots pictures or videos, and the second client stores the selected multimedia in a file object. In addition, user A may enter related text information, and the second client updates the detail page.
Finally, user A clicks the "give away" button; the second client sends the information and files uploaded or recorded by user A to the server, the server receives the data and writes it into the database, the server returns a transmission-success message to the second client, and the second client automatically jumps to and renders a success prompt page.
Correspondingly, user B is the virtual gift receiver. User B clicks the notification information in the chat dialog box through the first client; the first client then invokes the camera and transmits the image stream to the server in real time, and the server receives the stream, analyzes the images, and marks out the detected object. The server recognizes the detected object image against the image recognition library to obtain a recognition result and feeds it back to the first client, which draws and renders prompt information in the camera picture according to the result and plays the corresponding AR animation. When user B exits the camera interface, the first client closes it and returns to the chat dialog box interface. When user B clicks the notification information again, the first client opens the gift page and requests page data; the server receives the request and returns the detail page data, and the first client renders the page accordingly.
The server in fig. 1 may be one server or a server cluster composed of multiple servers, or a cloud computing center, and the like, which are not limited herein. The first client and the second client may be respectively deployed on different terminal devices, or may be deployed on the same terminal device. The terminal device includes, but is not limited to, a tablet computer, a notebook computer, a palm computer, a mobile phone, a Personal Computer (PC), and a voice interaction device, and is not limited herein.
With reference to the above description, an augmented reality-based virtual gift interaction method in the present application will be described below from the perspective of a first client used by a virtual gift receiver, please refer to fig. 2, where fig. 2 is a schematic diagram of an embodiment of the augmented reality-based virtual gift interaction method in the embodiment of the present application, and as shown in the drawing, an embodiment of the augmented reality-based virtual gift interaction method in the embodiment of the present application includes:
101. the method comprises the steps that a first client receives an object scanning instruction, wherein the object scanning instruction is used for starting an image acquisition device;
In this embodiment, the first client receives the object scanning instruction and starts the image acquisition device according to it; the first client may be the client used by the receiver of the virtual gift. It should be noted that the first client is deployed on a terminal device, and the image acquisition device may be an internal camera of that terminal device or an external camera connected to it, such as an external wide-angle lens, telephoto lens, teleconverter, or fisheye lens. In practical applications, any image acquisition device that enables the terminal device to capture images may be used; the specific device should be determined flexibly according to the actual situation and is not limited here.
For convenience of understanding, referring to fig. 3, fig. 3 is an interface schematic diagram of a virtual gift notification message in an embodiment of the present application. As shown in the figure, the first client may receive an object scanning instruction triggered by the virtual gift receiver, that is, the virtual gift receiver triggers the object scanning instruction by clicking the "start scanning" module shown as A1 in fig. 3, and the object scanning instruction may then start the image acquisition device to acquire an image. It should be understood that the present embodiment takes the "start scanning" text shown at A1 as an example; in practical applications, the text content shown at A1 includes but is not limited to "click to scan", "please scan", and "click here to scan". In addition, the virtual gift notification message displayed by the first client further includes the name and the avatar of the sender of the virtual gift, such as the name "Xiaoming" and his avatar, and displays a related prompt, such as "Xiaoming sends you a gift, find and scan the corresponding item according to the prompt", thereby prompting the receiver of the virtual gift to trigger the object scanning instruction.
102. The method comprises the steps that a first client displays a prompt message and an image to be recognized on an object recognition interface according to an object scanning instruction, wherein the prompt message is generated based on a virtual gift, and the image to be recognized is an image collected through an image collecting device;
in this embodiment, after the first client receives the object scanning instruction through step 101, the first client may jump to an object identification interface according to the object scanning instruction, and display a prompt message and an image to be identified on the object identification interface, where the prompt message is a message generated based on the virtual gift, and the image to be identified is an image acquired by the image acquisition device started according to the object scanning instruction. The prompt message may include, but is not limited to, "find favorite fat house water", "find my favorite fat house happy chicken", and "find the fat house happy cake that I eat frequently". For example, when the virtual gift is a cola, a prompt message "find favorite fat house water" may be generated according to the cola. When the virtual gift is a fried chicken, a prompt message "find my favorite fat house happy chicken" may be generated according to the fried chicken. When the virtual gift is a pizza, a prompt message "find the fat house happy cake that I eat frequently" may be generated according to the pizza. It should be understood that the above is only illustrative and should not be construed as limiting the present application.
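As an illustrative sketch only (the gift names, prompt texts, and function name below are assumptions for demonstration, not part of the claimed embodiment), the generation of a prompt message from a virtual gift described above can be modeled as a simple lookup:

```python
# Hypothetical mapping from virtual gift type to its prompt message.
# The entries are illustrative; a real deployment would maintain this
# mapping on the server side per gift catalog.
PROMPT_BY_GIFT = {
    "cola": "find favorite fat house water",
    "fried chicken": "find my favorite fat house happy chicken",
}

def generate_prompt_message(virtual_gift: str) -> str:
    """Return the prompt message generated based on the virtual gift."""
    # Fall back to a generic prompt for gifts without a dedicated text.
    return PROMPT_BY_GIFT.get(
        virtual_gift, f"find the item matching your gift: {virtual_gift}"
    )
```

The first client would display the returned string on the object identification interface alongside the image to be recognized.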
For convenience of understanding, a prompt message is taken as "find favorite fat home water" as an example. Please refer to fig. 4; fig. 4 is a schematic diagram of an object identification interface in an embodiment of the present application. As shown in fig. 4 (A), the virtual gift receiver triggers an object scanning instruction by clicking the "start scanning" module shown as B1 in fig. 4 (A); the first client receives the object scanning instruction and then starts the image capture device, so as to jump to the object identification interface B2 shown in fig. 4 (B). The object identification interface B2 may display a prompt message B3 and an image to be recognized B4, where the prompt message B3 is "find favorite fat home water", and the image to be recognized B4 is a real-time image captured by the image capture device started by the object scanning instruction.
103. If the image to be recognized is successfully matched with the prompt message, the first client generates an augmented reality animation corresponding to the virtual gift according to the image to be recognized and an animation playing effect;
in this embodiment, after the first client displays the image to be recognized and the prompt message through step 102, the server may further obtain a recognition result of matching the image to be recognized and the prompt message, and if the image to be recognized and the prompt message are successfully matched, the first client may generate the augmented reality animation corresponding to the virtual gift according to the image to be recognized and the animation playing effect. Specifically, if the prompt message is "find favorite fat home water", the image to be recognized includes the identified cola image, the prompt message is successfully matched with the image to be recognized, and then the AR animation corresponding to the cola can be generated according to the image to be recognized and the animation playing effect. For example, the prompt message is "find a good house happy cake that i eat frequently", and the image to be recognized includes the recognized pizza image, then the prompt message is successfully matched with the image to be recognized, and then the augmented reality animation corresponding to the pizza can be generated according to the pizza in the image to be recognized and the animation playing effect.
For convenience of understanding, the prompt message is "find favorite fat home water", and the image to be recognized includes a cola image as an example for explanation. Please refer to fig. 5; fig. 5 is a schematic diagram illustrating AR animation displayed through an object recognition interface in the embodiment of the present application. As shown in fig. 5 (A), a prompt message C2 and an image to be recognized C3 are displayed through an object recognition interface C1, where the prompt message C2 is "find favorite fat home water", and the image to be recognized C3 includes a cola image, that is, the prompt message and the image to be recognized match successfully. A matching success message C4 and the image to be recognized C3 are shown on the object recognition interface C1 shown in fig. 5 (B), and then an AR animation as shown in fig. 5 (C) is generated from the image to be recognized C3 and the animation playing effect. It should be noted that the matching success message shown in fig. 5 is "successfully found: cola"; in practical applications, the text content of the matching success message includes, but is not limited to, "cola, successfully found" and "congratulations, found: cola". The example in fig. 5 is only used for understanding the present solution, and the image to be recognized, the prompt message, the animation playing effect, and the matching success message should be flexibly determined according to the actual situation.
104. And the first client plays the augmented reality animation corresponding to the virtual gift on the image to be identified.
In this embodiment, after the first client generates the AR animation according to the image to be recognized and the animation playing effect through step 103, the first client may play the AR animation on the image to be recognized. For convenience of understanding, the example of including the cola image in the image to be recognized is taken as an example for explanation, please refer to fig. 5 again, as shown in the figure, after the AR animation shown in fig. 5 (C) is generated, the first client may play the AR animation on the image to be recognized.
In the embodiment of the application, a virtual gift interaction method based on augmented reality is provided. In the above manner, the virtual gift is fused with a real scene, which increases the interestingness and flexibility of gift interaction. The media information to be played, which is set by the gift giver, is fused with the real scene to generate an augmented reality animation, and the augmented reality animation is played on the client used by the gift receiver, thereby improving the interactivity between the gift giver and the gift receiver.
Optionally, on the basis of the embodiment corresponding to fig. 2, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application, the receiving, by the first client, an object scanning instruction may include:
the first client displays a session prompt interface, wherein the session prompt interface is used for providing a scanning interface;
the first client receives an object scanning instruction through the scanning interface.
In this embodiment, a method for receiving an object scanning instruction is introduced, where the first client may further display a session prompt interface, the session prompt interface provides a scanning interface, and then the first client receives, through the scanning interface, an object scanning instruction triggered by the virtual gift receiver.
Specifically, for convenience of understanding, please refer to fig. 6; fig. 6 is a schematic diagram of a session prompt interface in the embodiment of the present application. As shown in the figure, the first client displays a session prompt interface D1, and the session prompt interface D1 provides a scanning interface D2. As in the case described in the foregoing embodiment, the text content of the scanning interface D2 may be "start scanning", and the first client receives, through the scanning interface D2, an object scanning instruction triggered by the virtual gift receiver. It should be understood that the example in fig. 6 is only used for understanding the present solution, and the position, shape, and size of the specific scanning interface should be flexibly determined according to the actual situation.
Secondly, in the embodiment of the application, a method for receiving an object scanning instruction is provided, and by the above manner, based on a scanning interface provided by a session prompt interface, the object scanning instruction can be received through the scanning interface, so that the feasibility and operability of the scheme are improved.
Optionally, on the basis of the embodiment corresponding to fig. 2, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application, the displaying, by the first client, the prompt message and the image to be recognized on the object recognition interface according to the object scanning instruction may include:
the first client displays a prompt message and an identification area on an object identification interface according to the object scanning instruction;
the first client displays a target image on the object recognition interface according to the object scanning instruction, and highlights an image to be recognized in the recognition area, wherein the image to be recognized belongs to one part of the target image.
In this embodiment, a method for displaying a prompt message and an image to be recognized is introduced. The first client may further display the prompt message and a recognition area on the object recognition interface at the same time according to the object scanning instruction, and the recognition area may prominently display the image captured in real time by the image capture device. Further, the first client may display a target image on the object recognition interface according to the object scanning instruction and prominently display the image to be recognized in the recognition area. The target image is the captured panoramic image, and the image presented in the recognition area is a part of the panoramic image. It should be noted that the recognition area may be located at different positions of the object recognition interface, for example, at the center of the object recognition interface, directly above the object recognition interface, or directly below the object recognition interface, and the like, which is not limited herein. It should also be noted that the recognition area may be in different shapes; for example, the recognition area may be a circle, an ellipse, a square, a triangle, or a pentagram.
For convenience of understanding, the prompt message is "find favorite fat home water", and the image to be recognized includes cola as an example. Please refer to fig. 7; fig. 7 is a schematic diagram illustrating the recognition area and the prompt message through the object recognition interface in the embodiment of the present application. As shown in the figure, the object recognition interface E1 may display the prompt message E2 and the recognition area E3, where the recognition area E3 may highlight the image captured in real time by the image capturing device, such as the image in the highlighted recognition area E3. The first client presents the target image E4 on the object recognition interface E1 according to the object scanning instruction and presents the image to be recognized E5 in a highlighted manner within the recognition area E3; it can be seen that the image to be recognized E5 belongs to a part of the target image E4. It is understood that the example in fig. 7 is only used for understanding the present solution, and the specific position, shape, and size of the recognition area should be flexibly determined according to the actual situation.
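To illustrate the relationship described above, where the image to be recognized is the part of the target (panoramic) image bounded by the recognition area, a minimal sketch is given below; the rectangular region model and the coordinates are assumptions for demonstration only:

```python
# Minimal sketch: the identification area is modeled as a rectangle, and the
# image to be recognized is cropped out of the full target image. Images are
# simplified to 2D lists of pixel values.
def crop_recognition_region(target_image, top, left, height, width):
    """Return the sub-image inside the identification area."""
    return [row[left:left + width] for row in target_image[top:top + height]]

# A 4x4 "target image"; the identification area covers its 2x2 center.
target = [[r * 10 + c for c in range(4)] for r in range(4)]
region = crop_recognition_region(target, top=1, left=1, height=2, width=2)
# region is a part of target, as the embodiment requires.
```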
In addition, in the embodiment of the application, a method for displaying the prompt message and the image to be recognized is provided, and through the mode, the first client highlights the displayed image to be recognized according to the recognition area in the recognition area displayed on the object recognition interface, and meanwhile the virtual gift receiver can see the matching condition of the shot image in real time through displaying the image to be recognized and the prompt message, so that the matching intuitiveness is improved.
Optionally, on the basis of the embodiment corresponding to fig. 2, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application, after the first client displays the prompt message and the image to be recognized on the object recognition interface according to the object scanning instruction, the virtual gift interaction method may further include:
the method comprises the steps that a first client sends an image to be identified to a server, so that the server identifies the image to be identified based on a stored image set to obtain an identification result, wherein the stored image set comprises at least one stored image;
if the identification result is that the virtual gift is identified, the first client determines that the image to be identified is successfully matched with the prompt message;
and if the virtual gift is not identified in the identification result, the first client continuously shoots the real-time image through the image acquisition device and sends the real-time image to the server so that the server identifies the real-time image.
In this embodiment, a method for identifying an image to be identified is introduced, where the first client displays a prompt message and the image to be identified on an object identification interface according to an object scanning instruction, and then may send the image to be identified to the server. The server can extract a stored image set from the image database, and identify the image to be identified based on the stored image set to obtain an identification result, wherein the stored image set comprises at least one stored image. If the virtual gift is recognized as a result of the recognition, the first client may determine that the image to be recognized matches the prompt message successfully. On the contrary, if the virtual gift is not identified in the identification result, the first client continuously shoots the real-time image through the image acquisition device and sends the real-time image to the server, so that the server identifies the real-time image, and the identification of the image stream is further realized.
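A hedged sketch of the server-side matching step described above is given below. A real system would compare visual features of the image against the stored image set; here images are simplified to labels and matching is exact, purely for illustration:

```python
# Illustrative server-side recognition against a stored image set.
# "Images" are simplified to string labels; the result payload shape is
# an assumption, not defined by this application.
def recognize(image_to_recognize: str, stored_image_set: set) -> dict:
    """Match the image to be recognized against the stored image set."""
    if image_to_recognize in stored_image_set:
        return {"recognized": True,
                "result": f"recognized virtual gift: {image_to_recognize}"}
    return {"recognized": False, "result": "virtual gift not recognized"}

# The stored image set extracted from the image database (sample values).
stored = {"cola", "pizza", "fried chicken"}
```

On a positive result, the first client determines that the image to be recognized matches the prompt message; otherwise it keeps sending real-time images.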
For convenience of understanding, please refer to fig. 8, and fig. 8 is a schematic flowchart of a virtual gift interaction method based on augmented reality according to an embodiment of the present application, and as shown in the figure, specifically:
in step F1, the virtual gift receiver (such as user A) triggers an object scanning instruction through the first client in use.
In step F2, the first client activates the image capturing device based on the object scanning instruction, and presents the prompting message and the image to be recognized.
In step F3, the first client may send the image to be recognized acquired by the image acquisition device to the server, and then the server recognizes the image to be recognized based on the stored image set, where the recognition may be performed in such a manner that the server locally has the stored image set, and therefore, the server performs local recognition. Alternatively, the server may call a set of stored images in other databases for identification.
In step F4, the server determines whether the virtual gift is recognized, that is, whether the image to be recognized and the stored image are successfully matched, if yes, step F5 is performed, and if not, execution jumps to F3 and continues recognition.
In step F5, the first client may determine that the image to be recognized matches the prompting message successfully.
In step F6, the first client generates an AR animation according to the image to be recognized and the animation playback effect.
In step F7, the first client plays an augmented reality animation on the image to be recognized.
For convenience of understanding, the prompt message is "find favorite fat home water", and the image to be recognized includes a cola image as an example for explanation. The recognition result in this embodiment may be "recognized virtual gift". It is understood that, in practical applications, the recognition result may further include but is not limited to "the image to be recognized is cola", "the image to be recognized includes cola", and "the image to be recognized is a virtual gift". The server may determine that the virtual gift is recognized through the recognition result, and the specific recognition result should be determined flexibly according to practical situations.
If the server determines that the virtual gift is not identified according to the identification result in step F4, the first client continues to capture the real-time image through the image capturing device and sends the real-time image to the server, so that the server jumps to step F3 to continue to identify the real-time image. It is understood that the image capturing device captures images in real time, and thus the server recognizes an image to be recognized or a real-time image separately, but the server needs to recognize each image in the image stream because the first client transmits the real-time image captured by the image capturing device to the server in real time.
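The client-side loop described in steps F2 to F5, in which real-time frames captured by the image acquisition device are sent to the server until the virtual gift is recognized, can be sketched as follows; the frame values and the stub recognizer are assumptions for demonstration:

```python
# Illustrative client loop over the image stream: each real-time frame is
# sent to the server; the loop stops at the first frame the server matches.
def stream_until_recognized(frames, server_recognize):
    """Send each real-time frame to the server; return the first match."""
    for frame in frames:
        if server_recognize(frame):   # steps F3/F4: server identifies frame
            return frame              # step F5: matching succeeded
    return None                       # no frame matched yet

# Simulated frames from the image acquisition device (assumed values).
frames = ["desk", "window", "cola"]
matched = stream_until_recognized(frames, lambda f: f == "cola")
```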
Secondly, in the embodiment of the application, a method for recognizing an image to be recognized is provided, and in the above manner, the server recognizes the image to be recognized based on the stored image set to obtain a recognition result, and determines that the image to be recognized is successfully matched with the prompt message based on the recognition result, so that the first client can improve the efficiency of generating the augmented reality animation, and thus the playing efficiency of the augmented reality animation is improved. In addition, the server can identify the image stream formed by the real-time images, so that the first client can generate the augmented reality animation according to the real-time images, and the real-time performance of playing the augmented reality animation is improved.
Optionally, on the basis of the embodiment corresponding to fig. 2, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application, the generating, by the first client, an augmented reality animation according to the image to be recognized and the animation playing effect may include:
the method comprises the steps that a first client side obtains media information to be played, wherein the media information to be played is set by a second client side;
and the first client generates the augmented reality animation according to the image to be identified, the animation playing effect and the media information to be played.
In this embodiment, a method for generating an augmented reality animation is introduced, where a first client may obtain to-be-played media information set by a second client, where the second client may be a client used by a sender of a virtual gift, and then the first client generates an AR animation according to an image to be identified, an animation playing effect, and the to-be-played media information.
It should be noted that the media information to be played may include, but is not limited to, images, videos, and audios. The images include, but are not limited to, Bitmap (BMP) format, Picture Exchange (PCX) format, Tagged Image File Format (TIFF), Graphics Interchange Format (GIF), and Joint Photographic Experts Group (JPEG) format. The videos include, but are not limited to, Moving Picture Experts Group (MPEG) format, Audio Video Interleaved (AVI) format, Advanced Streaming Format (ASF), and Windows Media Video (WMV) format. The audios include, but are not limited to, Compact Disc (CD) format, Audio Interchange File Format (AIFF), Moving Picture Experts Group (MPEG) format, and Moving Picture Experts Group Audio Layer III (MP3) format.
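As a small illustrative sketch (the extension lists are a simplified assumption covering only the formats named above), the media information to be played can be classified by file extension:

```python
# Classify media information to be played by file extension. The sets below
# are an illustrative subset of the formats listed in this embodiment.
IMAGE_EXTS = {"bmp", "pcx", "tiff", "gif", "jpeg", "jpg"}
VIDEO_EXTS = {"mpeg", "avi", "asf", "wmv"}
AUDIO_EXTS = {"aiff", "mp3"}

def media_kind(filename: str) -> str:
    """Return 'image', 'video', 'audio', or 'unknown' for a media file."""
    ext = filename.rsplit(".", 1)[-1].lower()
    if ext in IMAGE_EXTS:
        return "image"
    if ext in VIDEO_EXTS:
        return "video"
    if ext in AUDIO_EXTS:
        return "audio"
    return "unknown"
```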
For convenience of understanding, the prompt message is "find favorite fat home water", the image to be recognized includes a cola image, and the media information to be played is an image as an example for explanation. Please refer to fig. 9; fig. 9 is an interface schematic diagram of generating an AR animation according to the embodiment of the present application, where the media information to be played is an image. As shown in fig. 9, the object recognition interface G1 shown in (A) of fig. 9 displays an image to be recognized G2, and (B) of fig. 9 shows the media information to be played in the form of an image (for example, a kitten image). The media information to be played is combined with the image to be recognized G2 and with an animation playing effect to generate the AR animation G3 shown in (C) of fig. 9. It is understood that the example in fig. 9 is only used for understanding the present solution, and the specific image to be recognized, animation playing effect, and media information to be played should be flexibly determined according to the actual situation.
Secondly, in the embodiment of the application, a method for generating an augmented reality animation is provided, by the above manner, the media information to be played set by the gift-giver can be fused with the real scene, the augmented reality animation is generated, and then the augmented reality animation is played on the client used by the gift-receiver, so that the interactivity between the gift-giver and the gift-receiver is improved. And the virtual gifts and the real scenes can be fused, so that the diversity and the flexibility of gift interaction are increased.
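A minimal sketch of combining the image to be recognized, the media information to be played, and the animation playing effect into augmented reality animation frames is given below; frames are simplified to strings and the composition rule is an assumption for demonstration, not a real rendering pipeline:

```python
# Illustrative composition: one output frame per effect frame, each layering
# the recognized base image, the sender-set media overlay, and the effect.
def generate_ar_animation(recognized_image, media_overlay, effect_frames):
    """Compose AR animation frames: base image + media overlay + effect."""
    return [f"{recognized_image}+{media_overlay}+{effect}"
            for effect in effect_frames]

# Sample inputs (assumed): a recognized cola image, a kitten image set by
# the second client, and a two-frame animation playing effect.
animation = generate_ar_animation("cola", "kitten", ["fade-in", "bounce"])
```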
Optionally, on the basis of the embodiment corresponding to fig. 2, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application, the virtual gift interaction method may further include:
the first client receives an interface return instruction;
the first client displays a session prompt interface according to the interface return instruction, wherein the session prompt interface is used for providing an information viewing interface and an animation reproduction interface;
the first client displays an image corresponding to the virtual gift on the session prompt interface.
In this embodiment, a method for displaying the virtual gift again on the session prompt interface is introduced. The first client may further receive an interface return instruction triggered by the virtual gift receiver and display the session prompt interface according to the received interface return instruction. The session prompt interface may provide an information viewing interface and an animation reproduction interface, and the first client may further display an image corresponding to the virtual gift on the session prompt interface. Specifically, after the first client plays the AR animation on the image to be recognized, it may return to the session prompt interface; at this time, the session prompt interface may display a specific image of the virtual gift, and in addition, blessing information corresponding to the virtual gift may be displayed on the session prompt interface. For example, taking the virtual gift as a cola as an example, after playing the AR animation on the image to be recognized including the cola image, the first client may display the session prompt interface according to an interface return instruction, display the cola image corresponding to the virtual gift on the session prompt interface, and also display blessing information corresponding to the virtual gift cola, such as "send you a bottle of fat house water, may happiness fly up". Alternatively, taking the virtual gift as a pizza as an example, after playing the AR animation on the image to be recognized including the pizza, the first client may display the session prompt interface according to an interface return instruction, display the pizza image corresponding to the virtual gift on the session prompt interface, and also display blessing information corresponding to the virtual gift pizza, such as "send you a fat house happy cake, may it bring you satisfaction".
For convenience of understanding, the virtual gift is illustrated as a cola as an example. Please refer to fig. 10; fig. 10 is another schematic diagram of the session prompt interface in the embodiment of the present application. As shown in fig. 10 (A), the session prompt interface may be displayed at the first client and may provide a scanning interface, such as the "start scanning" interface shown in fig. 10 (A). The session prompt interface may also display the virtual gift image H1 before the virtual gift is opened, that is, before the virtual gift receiver starts the image capturing device, the receiver cannot know from the virtual gift image H1 what the virtual gift given by the virtual gift sender specifically is. After the first client receives the object scanning instruction through the scanning interface, the AR animation is generated and played; the step of generating the AR animation is similar to that in the foregoing embodiment and is not described herein again. Further, after the AR animation is played, an interface return interface H2 may be displayed at the first client as shown in fig. 10 (B). When the first client receives an interface return instruction through the interface return interface H2, the first client may display a session prompt interface H3 as shown in fig. 10 (C) according to the interface return instruction. The session prompt interface H3 may provide an information viewing interface H4 and an animation reproduction interface H5, and the first client may display an image H6 of the opened virtual gift on the session prompt interface H3. Comparing the opened-gift image H6 shown in fig. 10 (C) with the unopened-gift image H1 shown in fig. 10 (A), it can be seen that after the AR animation is played, the image H6 reveals that the virtual gift may be a cola. It is to be understood that the example of fig. 10 is only for understanding the present solution, and the specific information viewing interface, animation reproduction interface, and virtual gift should be flexibly determined in combination with the actual situation.
In the embodiment of the application, a method for displaying the virtual gift again on the session prompt interface is provided. In the above manner, after the augmented reality animation is generated and played, the first client can return to the session prompt interface through an interface return instruction and display the virtual gift, so that when the virtual gift receiver and the virtual gift sender perform session interaction, the received virtual gift can be seen, thereby improving the interactivity between the gift sender and the gift receiver.
Optionally, on the basis of the embodiment corresponding to fig. 2, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application, after the first client displays the session prompt interface according to the interface return instruction, the virtual gift interaction method further includes:
the first client receives an information viewing instruction through an information viewing interface;
the first client displays a gift detail interface according to the information viewing instruction, wherein the gift detail interface is used for displaying at least one of sender information, receiver information and blessing message information of the virtual gift.
In this embodiment, a method for displaying a gift detail interface is provided. After the first client displays the session prompt interface according to the interface return instruction, the first client may further receive an information viewing instruction through the information viewing interface, and then display the gift detail interface according to the received information viewing instruction, where the gift detail interface may display at least one of the sender information of the virtual gift, the receiver information of the virtual gift, and the blessing message information. Specifically, the sender information of the virtual gift may include, but is not limited to, a virtual name, a virtual account number, and a virtual avatar of the sender of the virtual gift; the receiver information of the virtual gift may include, but is not limited to, a virtual name, a virtual account number, and a virtual avatar of the receiver of the virtual gift; and the blessing message may include, but is not limited to, "send you a bottle of fat house water, may happiness fly up", "send you a fat house happy cake, may it bring you satisfaction", and "send you a fat house happy chicken, good luck eating chicken all night".
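The information displayed on the gift detail interface can be illustrated with a simple data structure; the field names and sample values below are assumptions for demonstration, not part of this application:

```python
# Illustrative record of what the gift detail interface may display:
# sender information, receiver information, and the blessing message.
from dataclasses import dataclass

@dataclass
class GiftDetails:
    sender_name: str       # virtual name of the sender
    sender_avatar: str     # virtual avatar of the sender
    receiver_name: str     # virtual name of the receiver
    blessing: str          # blessing message attached to the gift

details = GiftDetails(
    sender_name="small lovely",
    sender_avatar="avatar.png",
    receiver_name="large lovely",
    blessing="send you a bottle of fat house water, may happiness fly up",
)
```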
For convenience of understanding, the virtual gift is a cola, and the gift detail interface displays the sender information, receiver information, and blessing message information of the virtual gift as an example. Please refer to fig. 11; fig. 11 is a schematic diagram of the gift detail interface in the embodiment of the present application. As shown in fig. 11 (A), a session prompt interface I1 is displayed on the first client; the session prompt interface I1 may provide an information viewing interface I2 and a gift-returning interface I3, and the first client may further display an image I4 of the opened virtual gift on the session prompt interface I1. The first client may receive an information viewing instruction through the information viewing interface I2 and display a gift detail interface I5 as shown in fig. 11 (B) according to the information viewing instruction. It can be seen that the gift detail interface I5 may display the sender information I6, the receiver information I7, and the blessing message information I8 of the virtual gift. Specifically, the sender information I6 of the virtual gift shown in fig. 11 (B) includes the virtual name "small lovely" and a virtual avatar, the receiver information I7 of the virtual gift includes the virtual name "large lovely" of the receiver of the virtual gift, and the blessing information I8 is the blessing message "send you a bottle of fat house water, may happiness fly up" corresponding to the virtual gift cola. It should be understood that the example in fig. 11 is only used for understanding the present solution, and the specific sender information, receiver information, and blessing message should be flexibly determined according to the actual situation.
With continued reference to fig. 11, the gift detail interface I5 shown in fig. 11 (B) may further display a gift sharing interface I9 and a gift loopback interface I10. The receiver of the virtual gift may operate the gift sharing interface I9, so that the first client shares the virtual gift details with users other than the receiver of the virtual gift. Likewise, the receiver may operate the gift loopback interface I10, so that the first client returns a gift to the sender of the virtual gift according to the operation.
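The gift detail interface described above can be thought of as rendering a small record of sender, receiver and blessing fields. A minimal sketch of such a record is shown below; the class and field names are illustrative assumptions, not part of the patent's implementation.

```python
from dataclasses import dataclass


@dataclass
class GiftDetail:
    """Hypothetical record backing the gift detail interface (fig. 11)."""
    sender_name: str    # virtual name of the gift sender, e.g. "Little Cutie"
    receiver_name: str  # virtual name of the gift receiver, e.g. "Big Cutie"
    blessing: str       # blessing message attached to the virtual gift


def render_gift_detail(detail: GiftDetail) -> list[str]:
    """Return the lines the gift detail interface would display."""
    return [
        f"From: {detail.sender_name}",
        f"To: {detail.receiver_name}",
        f"Blessing: {detail.blessing}",
    ]


detail = GiftDetail("Little Cutie", "Big Cutie",
                    "a bottle of happy homebody water for you")
lines = render_gift_detail(detail)
```

Each of the three displayed fields is optional in the embodiment ("at least one of"), so a real interface would omit lines whose fields are absent.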
For convenience of understanding, please refer to fig. 12, and fig. 12 is another schematic flow chart of the virtual gift interaction method based on augmented reality according to the embodiment of the present application, as shown in the figure, specifically:
in step J1, the receiver of the virtual gift triggers an information viewing instruction on the first client.
In step J2, the first client displays the gift detail interface. Specifically, the first client may open a web view and then start to create the gift detail interface; the web view requests the gift-detail-interface related information from the server; after the related information is obtained from the server, the page is rendered according to it; and once the page rendering is completed, the first client displays the gift detail interface. Here, the web view is a control that may be used to display a page; it functions inside the client much like a browser. Optionally, the gift detail interface may further display a gift sharing interface and a gift loopback interface, which the receiver of the virtual gift may operate.
In step J3, the virtual gift receiver triggers a gift sharing instruction to the server through the first client.
In step J4, the server may assemble the gift-detail-interface related information according to the gift sharing instruction.
In step J5, the first client displays a gift detail interface including interface-related information, the interface-related information being at least one of the sender information, receiver information and blessing message information of the virtual gift. Specifically, the first client may open a web view through the gift sharing instruction, and the web view may invoke a related Software Development Kit (SDK), so that the SDK opens the related client and the receiver of the virtual gift can perform the sharing operation. When the receiver returns to the first client, the first client acquires the sharing result and feeds it back to the web view through a JS Bridge (a bridge through which JavaScript calls native Android and iOS system interfaces), and the web view then draws and renders the related prompts.
In step J6, after the gift detail interface displays the gift loopback interface, the receiver of the virtual gift operates the gift loopback interface, that is, the receiver causes the first client to send a gift loopback instruction to the server.
In step J7, the server may assemble the gift-detail-interface related information according to the gift loopback instruction.
In step J8, the gift loopback page is presented by the first client. Specifically, the receiver of the virtual gift may click the gift loopback interface; the first client opens a web view and starts to create the gift detail interface; the web view requests the gift-detail-interface related information from the server; after acquiring the related information, the server assembles it and page rendering begins; and the first client then displays the gift loopback page. It is understood that the example in fig. 12 is only used for understanding the present solution, and the specific interface display and flow should be flexibly determined according to the actual situation.
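The steps J2 and J8 both follow the same pattern: the client opens a web view, the web view fetches the interface-related information from the server, and the page is rendered from that information. A minimal sketch of that round trip is below; the in-memory "server", the gift identifier, and the rendered markup are all illustrative assumptions.

```python
# Hypothetical gift store standing in for the server-side database.
GIFT_DB = {
    "gift-001": {
        "sender": "Little Cutie",
        "receiver": "Big Cutie",
        "blessing": "a bottle of happy homebody water for you",
    },
}


def server_assemble(gift_id: str) -> dict:
    """Server side: assemble the gift-detail-interface related information."""
    return GIFT_DB[gift_id]


def webview_open_gift_detail(gift_id: str) -> str:
    """First client: open a web view, request the detail info, render the page."""
    info = server_assemble(gift_id)  # the web view's request to the server
    return (f"<div>{info['sender']} -> {info['receiver']}: "
            f"{info['blessing']}</div>")


page = webview_open_gift_detail("gift-001")
```

In the real flow the request is asynchronous and the web view renders incrementally; the synchronous call here only illustrates the data dependency between the steps.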
Furthermore, in the embodiment of the application, a method for displaying a gift detail interface is provided. In the above manner, at least one of the sender information, receiver information and blessing message information can be displayed in the gift detail interface according to the information viewing instruction, so that the receiver of the virtual gift can learn the specific information of the sender, which improves the accuracy of virtual interaction; and displaying the blessing message information improves the interactivity and flexibility between the sender and the receiver of the virtual gift.
Optionally, on the basis of the embodiment corresponding to fig. 2, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application, after the first client displays the session prompt interface according to the interface return instruction, the virtual gift interaction method further includes:
the first client receives an animation reproduction instruction through an animation reproduction interface;
the first client displays an object identification interface according to the animation reproduction instruction;
the first client displays the prompt message and the real-time image shot by the image acquisition device on the object recognition interface.
In this embodiment, a method for redisplaying the object recognition interface is introduced. After the first client displays the session prompt interface according to the interface return instruction, the first client may further receive an animation reproduction instruction through the animation reproduction interface, and then display the object recognition interface according to the received animation reproduction instruction; that is, the receiver of the virtual gift may have the virtual gift recognized multiple times through the image capture device. As before, the first client displays the prompt message and the real-time image captured by the image capture device on the object recognition interface.
For ease of understanding, take the virtual gift cola as an example. Please refer to fig. 13, which is a schematic diagram of the redisplayed object recognition interface in the embodiment of the present application. As shown in fig. 13 (A), a session prompt interface K1 may be displayed on the first client, and an information viewing interface K2 and a gift-returning interface K3 are provided on the session prompt interface K1. In addition, the first client may also display an image K4 of the opened virtual gift and an animation reproduction interface K5 on the session prompt interface K1. After the receiver of the virtual gift triggers an animation reproduction instruction through the animation reproduction interface K5, the first client may display, according to the instruction, the object recognition interface K6 shown in fig. 13 (B), and display on it a prompt message K7 and a real-time image K8 captured by the image capture device. Assuming the virtual gift is cola, the prompt message corresponding to the virtual gift may be "find my favorite happy homebody water". It is understood that the example in fig. 13 is only used for understanding the present solution, and the specific text of the animation reproduction interface, the content of the object recognition interface, the text of the prompt message, and the real-time image captured by the image capture device should be flexibly determined according to the actual situation.
Further, in the embodiment of the application, a method for redisplaying the object recognition interface is provided. In the above manner, the prompt message and the real-time image captured by the image capture device can be displayed on the redisplayed object recognition interface according to the animation reproduction instruction. The receiver of the virtual gift can thus replay the augmented reality animation and see the prompt message and the real-time image in real time, which improves the real-time performance of virtual interaction as well as the flexibility with which the receiver plays the augmented reality animation.
Referring to fig. 14, which is a schematic diagram of another embodiment of the augmented reality-based virtual gift interaction method in the embodiment of the present application, the embodiment includes:
201. the second client receives an object selection instruction, wherein the object selection instruction carries an identifier of the virtual gift;
in this embodiment, before performing the virtual gift interaction, the second client may receive an object selection instruction triggered by the sender of the virtual gift, where the object selection instruction carries an identifier of the virtual gift. For example, if the identifier corresponding to the virtual gift "cola" is 1 and the object selection instruction carries identifier 1, the virtual gift "cola" can be selected for gifting through that identifier. For another example, if the identifier corresponding to the virtual gift "rice" is 2 and the object selection instruction carries identifier 2, the virtual gift "rice" can be gifted through that identifier.
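The identifier-to-gift correspondence described above (cola = 1, rice = 2) can be sketched as a simple lookup from the instruction payload. The mapping and field name below are illustrative; real identifiers would be assigned by the service.

```python
# Hypothetical catalog mapping gift identifiers to virtual gifts.
GIFT_CATALOG = {1: "cola", 2: "rice"}


def resolve_gift(object_selection_instruction: dict) -> str:
    """Resolve the virtual gift named by the identifier carried in the
    object selection instruction."""
    return GIFT_CATALOG[object_selection_instruction["gift_id"]]


selected = resolve_gift({"gift_id": 1})
```

An unknown identifier would raise `KeyError` here; a real client would instead surface a "gift unavailable" prompt.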
202. The second client displays the virtual gift on the object setting interface according to the object selection instruction;
in this embodiment, after receiving the object selection instruction in step 201, the second client may display the virtual gift on the object setting interface according to the object selection instruction, and may also display a prompt message corresponding to the virtual gift on the object setting interface, where the object setting interface is used to provide an object selection interface, and the prompt message is a message generated based on the virtual gift. For example, when the virtual gift is cola, the cola may be displayed on the object setting interface together with the prompt message "my favorite happy homebody water" generated based on the cola. When the virtual gift is fried chicken, the fried chicken may be displayed on the object setting interface together with the prompt message "my favorite happy homebody chicken". When the virtual gift is pizza, the pizza may be displayed on the object setting interface together with the prompt message "the happy homebody pie I often eat".
For ease of understanding, the description takes as an example an object selection instruction carrying the identifier of the virtual gift cola. Please refer to fig. 15, which is a schematic diagram of the interface for selecting a virtual gift in the embodiment of the present application. Fig. 15 (A) shows the interface on which the second client receives the object selection instruction: the sender of the virtual gift operates the object selection interface L1, for example clicks it, and the virtual gift L3 and its corresponding prompt message L4 are then displayed on the object setting interface L2 shown in fig. 15 (B); that is, when the virtual gift L3 is cola, the prompt message L4 corresponding to the cola is "my favorite happy homebody water". Further, the sender may operate the prompt message interface at the position of the prompt message L4, for example click it, and the virtual gift replacement interface shown in fig. 15 (C) is then displayed on the second client. For example, if the virtual gift L3 is cola, the replacement interface L5 corresponding to the cola can be selected; the interface L5 includes an icon of the virtual gift cola, the prompt message "my favorite happy homebody water", and the text "cola". If the virtual gift L3 is rice, the replacement interface L6 corresponding to the rice can be selected; the interface L6 includes the virtual gift rice, the prompt message "essential for every meal", and the text "rice". It should be understood that the example in fig. 15 is only used for understanding the present solution, and both the specific virtual gift and the prompt message corresponding to the virtual gift should be flexibly determined according to the actual situation.
203. The second client receives an object selection instruction through an object selection interface provided by the object setting interface, wherein the object selection instruction carries a receiver identifier of the virtual gift;
in this embodiment, the second client may receive an object selection instruction through the object selection interface provided by the object setting interface, where the object selection instruction carries the identifier of the receiver of the virtual gift. For example, if the identifier corresponding to the receiving user A is 11 and the object selection instruction carries identifier 11, it can be indicated, based on that identifier, that the sender has selected user A as the receiver of the virtual gift. For another example, if the identifier corresponding to the receiving user B is 12 and the object selection instruction carries identifier 12, it can be indicated, based on that identifier, that the sender has selected user B as the receiver of the virtual gift.
For ease of understanding, please refer to fig. 16, which is a schematic diagram of the interface for selecting the receiver of a virtual gift in the embodiment of the present application. As shown in the figure, a list opening instruction triggered by the sender of the virtual gift may be received through the object selection interface M1; for example, after the object selection interface M1 is clicked, a buddy list may be displayed, and a buddy in the buddy list is selected as the receiver of the virtual gift, so as to generate an object selection instruction carrying the receiver identifier of the virtual gift. It should be understood that the example in fig. 16 is only used for understanding the present solution, and the shape, position and size of the specific object selection interface should be flexibly determined according to the actual situation.
204. The second client receives the object transmission instruction and sends the virtual gift to the first client according to the object transmission instruction, wherein the object transmission instruction at least carries an identifier of the virtual gift and an identifier of a receiver of the virtual gift, and the identifier of the receiver of the virtual gift has a corresponding relation with the first client.
In this embodiment, after receiving the object selection instruction through the object selection interface in step 203, the second client may further receive an object transmission instruction, where the object transmission instruction carries at least the identifier of the virtual gift and the identifier of the receiver of the virtual gift, and the receiver identifier has a corresponding relationship with the first client. The second client then sends the virtual gift to the first client according to the object transmission instruction. For example, the identifier corresponding to the virtual gift cola is 1 and the identifier corresponding to the receiving user A is 11; when the second client receives an object transmission instruction carrying identifier 1 of the cola and receiver identifier 11, the virtual gift cola is sent to user A, and in the general case the receiver identifier 11 corresponds to the first client used by user A. For another example, the identifier corresponding to the virtual gift rice is 2 and the identifier corresponding to the receiving user B is 12; when the second client receives an object transmission instruction carrying identifier 2 of the rice and receiver identifier 12, the virtual gift rice is sent to user B, where the receiver identifier 12 usually corresponds to the first client used by user B. It should be understood that the foregoing examples are only for understanding the present embodiment, and both the identifier of the specific virtual gift and the identifier of the receiver of the virtual gift should be flexibly determined in combination with the actual situation.
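Step 204 can be sketched as routing on the two identifiers the object transmission instruction carries: the gift identifier selects the virtual gift, and the receiver identifier selects the first client it is delivered to. The catalogs and field names below are illustrative assumptions matching the examples above (cola = 1, rice = 2; user A = 11, user B = 12).

```python
# Hypothetical mappings standing in for server-side lookup tables.
GIFT_CATALOG = {1: "cola", 2: "rice"}
RECEIVER_CLIENTS = {11: "first-client-of-user-A", 12: "first-client-of-user-B"}


def dispatch(object_transmission_instruction: dict) -> tuple[str, str]:
    """Return (virtual gift, destination first client) for an object
    transmission instruction carrying a gift id and a receiver id."""
    gift = GIFT_CATALOG[object_transmission_instruction["gift_id"]]
    client = RECEIVER_CLIENTS[object_transmission_instruction["receiver_id"]]
    return gift, client


gift, client = dispatch({"gift_id": 1, "receiver_id": 11})
```

The instruction "at least" carries these two fields; optional fields such as a target date or blessing message (discussed in the later embodiment) ride alongside them.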
For further understanding of the embodiment, the description takes as an example an object selection instruction carrying the identifier of the virtual gift cola. Please refer to fig. 17, which is another schematic flow chart of the augmented reality-based virtual gift interaction method in the embodiment of the present application. As shown in the figure, specifically:
in step N1, the virtual gift sender triggers an object selection instruction by the second client, the object selection instruction carrying an identification of the virtual gift.
In step N2, the second client displays the virtual gift and the prompt message corresponding to the virtual gift on the object setting interface according to the object selection instruction, where the object setting interface may provide an object selection interface, and the prompt message is a message generated based on the virtual gift. That is, after the sender selects the virtual gift, the second client may generate and draw an object setting interface populated with default information according to the identifier of the virtual gift, using Uniform Resource Locator (URL) parameters. This process need not request the server: the default information may be obtained from the list page and then passed to the object setting interface in the form of URL parameters.
In step N3, the second client displays an object selection interface on the object setting interface and receives, through the object selection interface, an object selection instruction triggered by the sender, where the object selection instruction carries the identifier of the receiver of the virtual gift. Specifically, when the sender selects the receiver of the virtual gift, the buddy list page of the second client is opened first; the web view then requests the sender's buddy list data from the server; when the server returns that data, the web view renders the sender's buddy list page; and after the page rendering is completed, other interactive operations such as clicking or sliding may be performed.
In step N4, the second client may further receive an object transmission instruction triggered by the virtual gift sending party, where the object transmission instruction carries at least an identifier of the virtual gift and an identifier of a receiving party of the virtual gift, and the identifier of the receiving party of the virtual gift has a corresponding relationship with the first client.
In step N5, the second client sends the object transmission instruction to the server, and the server receives it, sends the virtual gift to the first client according to the object transmission instruction, performs database entry processing on the server side, and informs the web view that the data transmission succeeded; the web view can then automatically jump to and render a success prompt page. The method by which the server sends the virtual gift to the first client according to the object transmission instruction has been described in the foregoing embodiment and is not repeated here.
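Step N2 notes that the object setting interface is drawn from default information passed as URL parameters, with no server round trip. A minimal sketch using standard URL encoding is below; the `app://object-setting` scheme and parameter names are hypothetical.

```python
from urllib.parse import parse_qs, urlencode


def build_setting_url(gift_id: int, prompt: str) -> str:
    """List page: pass the default info to the object setting interface
    as URL parameters (no server request needed)."""
    return "app://object-setting?" + urlencode(
        {"gift_id": gift_id, "prompt": prompt})


def read_defaults(url: str) -> dict:
    """Object setting interface: recover the defaults from the URL."""
    qs = parse_qs(url.split("?", 1)[1])
    return {"gift_id": int(qs["gift_id"][0]), "prompt": qs["prompt"][0]}


url = build_setting_url(1, "my favorite happy homebody water")
defaults = read_defaults(url)
```

Because the defaults travel in the URL itself, the object setting interface can render immediately; only the later steps (buddy list in N3, transmission in N5) touch the server.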
For ease of understanding, please refer to fig. 18, which is a schematic diagram of the interface after the object transmission instruction is triggered in the embodiment of the present application. Fig. 18 (A) shows the object setting interface, which may further include a confirmation interface O1. It should be noted that the text on the confirmation interface O1 may be a prompt such as "give away" or "please send"; this is only an illustration and should not be construed as a limitation of the present application. The second client receives the object transmission instruction through the confirmation interface O1, and then presents the gift-delivery interface shown in fig. 18 (B) according to the object transmission instruction. It should be noted that the text on the gift-delivery interface may be "the space courier has taken the order; rain or shine, we will deliver the gift on time", or may be any other wording, which is not limited herein.
In the embodiment of the application, an augmented reality-based virtual gift interaction method is provided. In the above manner, a designated virtual gift can be presented to a designated object through the object selection instructions carrying the identifier of the virtual gift and the identifier of its receiver, which improves the accuracy with which the receiver receives the virtual gift and also improves the interactivity between the gift-giving party and the gift-receiving party.
Optionally, on the basis of the embodiment corresponding to fig. 14, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application,
the object setting interface comprises a date setting interface, when the object transmission instruction carries target date information,
the virtual gift interaction method further comprises the following steps:
the second client side obtains target date information through a date setting interface, wherein the target date information is the date when the first client side receives the virtual gift;
or the object setting interface comprises a blessing words filling area, when the object transmission instruction carries the blessing words information,
the virtual gift interaction method further comprises the following steps:
the second client side obtains blessing message through the blessing message filling area, wherein the blessing message is displayed on the gift detail interface of the first client side;
or the object setting interface comprises a package selection interface, when the object transmission instruction carries the identification of the target package,
the virtual gift interaction method may further include:
the second client receives a package selection instruction through a package selection interface, wherein the package selection instruction carries an identifier of a target package, the identifier of the target package and the target package have a unique corresponding relation, and the target package is a package image displayed when the first client receives the virtual gift;
or the object setting interface further comprises an effect selection interface, and when the object transmission instruction further carries an identification of an animation playing effect, the virtual gift interaction method further comprises the following steps:
and the second client receives the effect selection instruction through the effect selection interface, wherein the effect selection instruction carries an identification of the animation playing effect, the identification of the animation playing effect corresponds to the animation playing effect, and the animation playing effect is used for the first client to generate the augmented reality animation.
In this embodiment, ways of setting multiple related contents on the object setting interface are introduced. The object setting interface displayed by the second client may further include a date setting interface, with the object transmission instruction further carrying target date information; after displaying the object setting interface according to the object selection instruction, the second client may obtain the target date information through the date setting interface, where the target date information is the date on which the first client receives the virtual gift. Optionally, the object setting interface may further include a blessing message filling area, with the object transmission instruction further carrying blessing message information; the second client may obtain the blessing message information through the blessing message filling area, where the blessing message information is displayed by the first client on the gift detail interface. Optionally, the object setting interface may further include a package selection interface, with the object transmission instruction further carrying the identifier of the target package; the second client may receive a package selection instruction through the package selection interface, where the package selection instruction carries the identifier of the target package, the identifier has a unique corresponding relationship with the target package, and the target package is the package image displayed by the first client when receiving the virtual gift.
Optionally, the object setting interface may further include an effect selection interface, with the object transmission instruction further carrying the identifier of an animation playing effect; the second client receives an effect selection instruction through the effect selection interface, where the effect selection instruction carries the identifier of the animation playing effect, the identifier has a unique corresponding relationship with the animation playing effect, and the animation playing effect is used by the first client to generate and display the AR animation.
For ease of understanding, referring to fig. 19, which is a schematic diagram of the object setting interface in the embodiment of the present application, the object setting interface P1 displayed by the second client may further include a date setting interface P2, through which the target date information set by the sender may be received, the target date information being the date on which the first client receives the virtual gift. For example, the date 2020.1.1 on which the first client is to receive the virtual gift may be acquired through the date setting interface P2.
Optionally, the object setting interface P1 displayed by the second client may further include a blessing message filling area P3, in which the sender may input blessing message information, the blessing message information being displayed by the first client on the gift detail interface. For example, with the virtual gift cola, the default blessing message "a bottle for you, drink it and soar" may be displayed in the blessing message filling area P3. Further, with the virtual gift pizza, the default blessing message "a happy homebody pie for you, eat to your heart's content" may be displayed. It is understood that, in practical applications, the sender may also fill in a blessing message of their own, such as "a houseful of happiness for you, let's be homebodies together", which improves the interactivity between the gift-giving party and the gift-receiving party; the blessing message should therefore not be construed as a limitation of the present application.
Optionally, the object setting interface P1 displayed by the second client may further include a package selection interface P4, through which a package selection instruction may be received, where the package selection instruction carries the identifier of the target package, the identifier has a unique corresponding relationship with the target package, and the target package is the package image displayed by the first client when receiving the virtual gift. Optionally, the object setting interface P1 may further include an effect selection interface P5, through which an effect selection instruction may be received, where the effect selection instruction carries the identifier of an animation playing effect; because the identifier has a unique corresponding relationship with the animation playing effect, the animation playing effect can be used by the first client to generate the AR animation. The method by which the first client generates the augmented reality animation has been introduced in the foregoing embodiment and is not repeated here.
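All four optional settings (target date, blessing message, package, playing effect) travel as extra fields on the same object transmission instruction. A minimal sketch of assembling such an instruction is below; the field names are illustrative assumptions, not the patent's wire format.

```python
def build_transmission_instruction(gift_id: int, receiver_id: int,
                                   **options) -> dict:
    """Assemble an object transmission instruction; only the optional
    settings the sender actually chose are attached."""
    allowed = {"target_date", "blessing", "package_id", "effect_id"}
    unknown = set(options) - allowed
    if unknown:
        raise ValueError(f"unsupported options: {sorted(unknown)}")
    return {"gift_id": gift_id, "receiver_id": receiver_id, **options}


# Sender set a delivery date and a blessing, but kept the default
# package and playing effect, so those fields are simply absent.
instr = build_transmission_instruction(
    1, 11, target_date="2020.1.1", blessing="drink it and soar")
```

Leaving unset options out of the instruction (rather than sending nulls) matches the "further carries" wording: each field is present only when the corresponding interface was used.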
In the embodiment of the application, ways of setting multiple related contents on the object setting interface are provided. In the above manner, the date setting interface, the blessing message filling area, the package selection interface and the effect selection interface of the object setting interface offer multiple choices of date, blessing message, package and effect, which increases the diversity and flexibility of virtual gift interaction.
Optionally, on the basis of the embodiment corresponding to fig. 14, in an optional embodiment of the virtual gift interaction method based on augmented reality provided in the embodiment of the present application, the object setting interface further includes a media information adding interface;
the virtual gift interaction method may further include:
the second client receives a media adding instruction through the media information adding interface;
the second client displays a media information selection interface according to the media adding instruction, wherein the media information selection interface displays at least one piece of media information;
the second client receives a media uploading instruction;
the second client determines the media information to be played from the at least one piece of media information according to the media uploading instruction;
the virtual gift interaction method may further include:
and when the second client sends the object transmission instruction to the server, sending the to-be-played media information to the server so that the server stores the to-be-played media information, wherein the to-be-played media information is used for the first client to generate the augmented reality animation.
In this embodiment, a method for uploading media information to be played is introduced. The object setting interface displayed by the second client further includes a media information adding interface, so the second client may receive a media adding instruction through the media information adding interface; the second client then displays a media information selection interface according to the media adding instruction, the media information selection interface displaying at least one piece of addable media information; the second client further receives a media uploading instruction triggered by the sender of the virtual gift and, according to that instruction, determines the media information to be played from the at least one piece of addable media information. When the second client sends the object transmission instruction to the server, it may also send the media information to be played to the server, so that the server stores it; the media information to be played can then be used by the first client to generate the AR animation.
For ease of understanding, please refer to fig. 20, which is a schematic diagram of an interface for adding media information to be played according to the embodiment of the present application. As shown in (A) of fig. 20, the object setting interface Q1 displayed by the second client further includes a media information adding interface Q2, and the virtual gift sender may trigger a media adding instruction through the media information adding interface Q2; the second client then displays a media information selection interface Q3 displaying at least one piece of addable media information, as shown in (B) of fig. 20. The virtual gift sender triggers a media uploading instruction through the media information selection interface Q3, and the second client determines the media information to be played Q4 from the at least one piece of addable media information according to the media uploading instruction. The object setting interface Q1 shown in (C) of fig. 20 is then obtained; the object setting interface Q1 further includes a thumbnail Q5 of the added media information, which indicates that the media information has been added. In addition, when the second client sends the object transmission instruction to the server, the second client may also send the media information to be played to the server, so that the server stores the media information to be played, which may be used by the first client to generate the AR animation. It should be noted that the media information to be played may include, but is not limited to, images, videos, and audio.
In the embodiment of the application, a method for uploading to-be-played media information is provided, and in the above manner, the to-be-played media information is determined from at least one piece of addable media information through a media information adding interface, that is, a virtual gift sending party can freely add the to-be-played media information according to the preference, so that the virtual gift receiving party can see the added to-be-played media information, and the diversity and flexibility of information interaction are improved.
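The upload flow described above can be sketched as follows. This is a minimal illustration only; the class and method names (`SecondClient`, `send_gift`, and so on) are assumptions made for the sketch and do not appear in the patent, and the "server" is a plain in-memory store standing in for the real backend.

```python
class Server:
    """Stores media uploaded alongside an object transmission instruction."""
    def __init__(self):
        self.stored_media = {}          # gift identifier -> media to be played

    def receive(self, instruction, media):
        self.stored_media[instruction["gift_id"]] = media


class SecondClient:
    def __init__(self, server):
        self.server = server
        self.selectable_media = []      # shown on the media information selection interface
        self.media_to_play = None

    def open_selection_interface(self, available):
        # Triggered by the media adding instruction: show the addable media.
        self.selectable_media = list(available)

    def upload(self, index):
        # Media uploading instruction: pick the media information to be played.
        self.media_to_play = self.selectable_media[index]

    def send_gift(self, gift_id, receiver_id):
        # The object transmission instruction carries the gift and receiver
        # identifiers; the selected media travels with it so the server can
        # store it for the first client to build the AR animation.
        instruction = {"gift_id": gift_id, "receiver_id": receiver_id}
        self.server.receive(instruction, self.media_to_play)
        return instruction


server = Server()
client = SecondClient(server)
client.open_selection_interface(["birthday.mp4", "song.mp3", "photo.jpg"])
client.upload(0)
client.send_gift("gift-42", "user-7")
print(server.stored_media["gift-42"])   # birthday.mp4
```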
Referring to fig. 21, fig. 21 is a schematic view of an embodiment of an augmented reality-based virtual gift interaction apparatus 30 according to the embodiment of the present application. As shown in the figure, the apparatus is described in detail below and includes:
a receiving module 301, configured to receive an object scanning instruction, where the object scanning instruction is used to start an image acquisition device;
a display module 302, configured to display, according to an object scanning instruction, a prompt message and an image to be recognized on an object recognition interface, where the prompt message is a message generated based on a virtual gift, and the image to be recognized is an image acquired by an image acquisition device;
a generating module 303, configured to generate an augmented reality animation corresponding to the virtual gift according to the image to be recognized and an animation playing effect if the image to be recognized is successfully matched with the prompt message;
and the playing module 304 is configured to play the augmented reality animation corresponding to the virtual gift on the image to be recognized.
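The cooperation of the four modules above can be sketched as a single client class; one method per module. All names here are illustrative assumptions, and the animation is represented by a plain string rather than a rendered AR scene.

```python
class FirstClient:
    def __init__(self):
        self.camera_on = False
        self.played = None

    def receive_scan_instruction(self):           # receiving module 301
        # The object scanning instruction starts the image acquisition device.
        self.camera_on = True

    def display(self, prompt, image):             # display module 302
        # Show the prompt message and the image to be recognized together
        # on the object recognition interface.
        return {"prompt": prompt, "image": image}

    def generate_animation(self, image, effect):  # generating module 303
        # Called only after the image is successfully matched with the prompt.
        return f"AR({image}+{effect})"

    def play(self, animation):                    # playing module 304
        self.played = animation
        return animation


client = FirstClient()
client.receive_scan_instruction()
view = client.display("find the cake", "frame_001")
anim = client.generate_animation(view["image"], "sparkle")
print(client.play(anim))   # AR(frame_001+sparkle)
```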
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the virtual gift interactive device 30 based on augmented reality provided by the embodiment of the present application,
the receiving module 301 is specifically configured to display a session prompt interface, where the session prompt interface is used to provide a scanning interface;
and receiving an object scanning command through a scanning interface.
In the embodiment of the application, a method for receiving an object scanning instruction is provided. In the above manner, the virtual gift receiver can trigger the object scanning instruction through the scanning interface provided by the displayed session prompt interface, so that the efficiency of receiving the object scanning instruction is improved, the time delay of virtual gift interaction is reduced, and the interactivity between the gift giver and the gift receiver is improved.
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the virtual gift interactive device 30 based on augmented reality provided by the embodiment of the present application,
a display module 302, configured to display a prompt message and an identification area on an object identification interface according to an object scanning instruction;
and displaying a target image on the object recognition interface according to the object scanning instruction, and highlighting the image to be recognized in the recognition area, wherein the image to be recognized belongs to one part of the target image.
In the embodiment of the application, another virtual gift interaction method based on augmented reality is provided, and in the above manner, in the identification area displayed on the object identification interface, the image to be identified is highlighted according to the identification area, so that the matching speed of the image to be identified and the prompt message can be increased, the speed of generating the augmented reality animation is increased, the client used by the gift receiver can play the augmented reality animation faster, and the real-time performance and the feasibility of virtual gift interaction are improved.
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the virtual gift interaction device 30 based on augmented reality provided in the embodiment of the present application, the virtual gift interaction device 30 based on augmented reality further includes a sending module 305, a determining module 306, and a shooting module 307;
a sending module 305, configured to send an image to be recognized to a server, so that the server recognizes the image to be recognized based on a stored image set, to obtain a recognition result, where the stored image set includes at least one stored image;
the determining module 306 is configured to determine that the image to be recognized is successfully matched with the prompt message if the recognition result indicates that the virtual gift is recognized;
and the shooting module 307 is configured to, if the recognition result indicates that the virtual gift is not recognized, continue shooting a real-time image through the image acquisition device and send the real-time image to the server, so that the server recognizes the real-time image.
In the embodiment of the application, another virtual gift interaction method based on augmented reality is provided, and in the above manner, the image to be recognized can be recognized by the server based on the stored image set to obtain the recognition result, and the image to be recognized is determined to be successfully matched with the prompt message based on the recognition result, so that the efficiency of generating augmented reality animation can be improved, and the playing efficiency of the augmented reality animation is improved. Secondly, the server can identify the image stream formed by the real-time images, so that the augmented reality animation can be generated according to the real-time images, and the real-time performance of playing the augmented reality animation is improved.
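The recognition round trip in the paragraph above can be sketched as a loop: the client keeps submitting frames until the server matches one against its stored image set. This is a hedged sketch; the equality comparison is a deliberate placeholder for a real image-matching model, and all function names are assumptions.

```python
def recognize(frame, stored_images):
    """Server side: return the matched gift identifier, or None."""
    for gift_id, stored in stored_images.items():
        if frame == stored:              # placeholder for real image matching
            return gift_id
    return None


def scan_until_matched(frames, stored_images):
    """Client side: send real-time frames until one is recognized."""
    for frame in frames:
        result = recognize(frame, stored_images)
        if result is not None:
            return result                # image matched the prompt message
    return None                          # nothing recognized in this stream


stored = {"gift-1": "cake_pattern"}
frames = ["wall", "table", "cake_pattern"]   # real-time images from the camera
print(scan_until_matched(frames, stored))    # gift-1
```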
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the virtual gift interactive device 30 based on augmented reality provided by the embodiment of the present application,
a generating module 303, specifically configured to acquire media information to be played, where the media information to be played is set by the second client;
and generating the augmented reality animation according to the image to be recognized, the animation playing effect and the media information to be played.
In the embodiment of the application, a method for generating an augmented reality animation is provided, by the above method, the media information to be played set by the gift-giver can be fused with the real scene, the augmented reality animation is generated, and then the augmented reality animation is played on the client used by the gift-receiver, so that the interactivity between the gift-giver and the gift-receiver is improved. And the virtual gift can be fused with the real scene, so that the interest and flexibility of gift interaction are increased.
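The generation step described above combines three ingredients: the recognized real-scene image, the playing effect chosen by the sender, and the uploaded media. A minimal sketch follows; the dictionary layout and function names are assumptions for illustration, and a real implementation would composite these onto the live camera feed.

```python
def fetch_media(server_store, gift_id):
    # The first client asks the server for the media the second client set.
    return server_store.get(gift_id)


def generate_ar_animation(recognized_image, effect, media):
    # Bundle the three ingredients named in the embodiment; a real system
    # would render them together as an augmented reality animation.
    return {
        "background": recognized_image,  # the image to be recognized (real scene)
        "effect": effect,                # the animation playing effect
        "media": media,                  # sender-supplied image/video/audio
    }


store = {"gift-9": "greeting.mp4"}
media = fetch_media(store, "gift-9")
animation = generate_ar_animation("frame_with_gift", "fireworks", media)
print(animation["media"])   # greeting.mp4
```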
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the virtual gift interactive device 30 based on augmented reality provided by the embodiment of the present application,
the receiving module 301 is further configured to receive an interface return instruction;
the display module 302 is further configured to display a session prompt interface according to the interface return instruction, where the session prompt interface is used to provide an information viewing interface and an animation reproduction interface;
the display module 302 is further configured to display an image corresponding to the virtual gift on the session prompt interface.
In the embodiment of the application, another virtual gift interaction method based on augmented reality is provided. In the above manner, after the augmented reality animation is generated and played, the session prompt interface can be returned to through the interface return instruction, and the virtual gift is displayed, so that both the virtual gift receiver and the virtual gift sender can see the received virtual gift during session interaction, thereby improving the interactivity between the gift giver and the gift receiver.
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the virtual gift interactive device 30 based on augmented reality provided by the embodiment of the present application,
the receiving module 301 is further configured to receive an information viewing instruction through an information viewing interface;
the display module 302 is further configured to display a gift detail interface according to the information viewing instruction, where the gift detail interface is configured to display at least one of sender information, receiver information, and blessing information of the virtual gift.
Optionally, on the basis of the embodiment corresponding to fig. 21, in another embodiment of the virtual gift interactive device 30 based on augmented reality provided by the embodiment of the present application,
the receiving module 301 is further configured to receive an animation playback instruction through an animation playback interface;
the display module 302 is further configured to display an object identification interface according to the animation reproduction instruction;
the display module 302 is further configured to display the prompt message and the real-time image captured by the image capturing device on the object recognition interface.
Referring to fig. 22, fig. 22 is a schematic view of another embodiment of an augmented reality-based virtual gift interaction device according to an embodiment of the present application, and as shown in the drawing, the augmented reality-based virtual gift interaction device 40 includes:
a receiving module 401, configured to receive an object selection instruction, where the object selection instruction carries an identifier of a virtual gift;
a display module 402, configured to display a virtual gift on an object setting interface according to an object selection instruction;
the receiving module 401 is further configured to receive an object selection instruction through an object selection interface provided by the object setting interface, where the object selection instruction carries a receiver identifier of the virtual gift;
the receiving module 401 is further configured to receive an object transmission instruction, and send the virtual gift to the first client according to the object transmission instruction, where the object transmission instruction at least carries an identifier of the virtual gift and an identifier of a receiver of the virtual gift, and the identifier of the receiver of the virtual gift has a corresponding relationship with the first client.
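The sending flow above boils down to building an object transmission instruction that carries at least the gift identifier and the receiver identifier, with the optional settings (date, blessing, package, effect) riding along. The helper below is an illustrative assumption, not a name from the patent.

```python
def build_transmission_instruction(gift_id, receiver_id, **extras):
    """Assemble an object transmission instruction.

    It must at least carry the identifier of the virtual gift and the
    identifier of the receiver; optional settings are attached as extras.
    """
    instruction = {"gift_id": gift_id, "receiver_id": receiver_id}
    instruction.update(extras)
    return instruction


instr = build_transmission_instruction(
    "gift-3", "user-11",
    target_date="2020-06-01",        # from the date setting interface
    blessing="Happy birthday!",      # from the blessing filling area
    effect_id="fireworks",           # from the effect selection interface
)
print(sorted(instr))   # ['blessing', 'effect_id', 'gift_id', 'receiver_id', 'target_date']
```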
Optionally, on the basis of the embodiment corresponding to fig. 22, in another embodiment of the virtual gift interaction device 40 based on augmented reality provided in the embodiment of the present application, the virtual gift interaction device 40 based on augmented reality further includes an obtaining module 403;
the object setting interface comprises a date setting interface, when the object transmission instruction carries target date information,
an obtaining module 403, configured to obtain target date information through a date setting interface, where the target date information is a date when the first client receives the virtual gift;
or the object setting interface comprises a blessing information filling area, and when the object transmission instruction carries the blessing information,
the obtaining module 403 is further configured to obtain the blessing information through the blessing information filling area, where the blessing information is displayed on the gift detail interface by the first client;
or the object setting interface comprises a package selection interface, when the object transmission instruction carries the identification of the target package,
the receiving module 401 is further configured to receive a package selection instruction through a package selection interface, where the package selection instruction carries an identifier of a target package, the identifier of the target package and the target package have a unique corresponding relationship, and the target package is a package image displayed when the first client receives the virtual gift;
or the object setting interface also comprises an effect selection interface, when the object transmission instruction also carries an identification of the animation playing effect,
the receiving module 401 is further configured to receive an effect selection instruction through the effect selection interface, where the effect selection instruction carries an identifier of an animation playing effect, the identifier of the animation playing effect corresponds to the animation playing effect, and the animation playing effect is used by the first client to generate the augmented reality animation.
Optionally, on the basis of the embodiment corresponding to fig. 22, in another embodiment of the virtual gift interaction device 40 based on augmented reality provided in the embodiment of the present application, the virtual gift interaction device 40 based on augmented reality further includes a determining module 404 and a sending module 405;
the object setting interface also comprises a media information adding interface;
the receiving module 401 is further configured to receive a media adding instruction through a media information adding interface;
the display module 402 is further configured to display a media information selection interface according to the media addition instruction, where the media information selection interface displays at least one piece of media information;
the receiving module 401 is further configured to receive a media uploading instruction;
a determining module 404, configured to determine, according to the media uploading instruction, media information to be played from the at least one piece of media information;
the sending module 405 is configured to send to-be-played media information to the server when the second client sends an object transmission instruction to the server, so that the server stores the to-be-played media information, and the to-be-played media information is used by the first client to generate the augmented reality animation.
Referring to fig. 23, for convenience of description, only the portions related to the embodiments of the present application are shown; for specific technical details that are not disclosed, please refer to the method portion of the embodiments of the present application. The client is deployed in a terminal device, and the terminal device may be any terminal device including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, and the like. In this embodiment, the terminal device is taken to be a mobile phone as an example:
fig. 23 is a block diagram illustrating a partial structure of a mobile phone related to a terminal provided in an embodiment of the present application. Referring to fig. 23, the cellular phone includes: radio Frequency (RF) circuitry 1110, memory 1120, input unit 1130, display unit 1140, sensors 1150, audio circuitry 1160, wireless fidelity (WiFi) module 1170, processor 1180, and power supply 1190. Those skilled in the art will appreciate that the handset configuration shown in fig. 23 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 23:
The RF circuit 1110 may be used for receiving and transmitting signals during a message transmission or a call. In particular, it receives downlink information from a base station and forwards it to the processor 1180 for processing, and transmits uplink data to the base station. In general, the RF circuit 1110 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1110 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 1120 may be used to store software programs and modules, and the processor 1180 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1120. The memory 1120 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 1120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 1130 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1130 may include a touch panel 1131 and other input devices 1132. Touch panel 1131, also referred to as a touch screen, can collect touch operations of a user on or near the touch panel 1131 (for example, operations of the user on or near touch panel 1131 by using any suitable object or accessory such as a finger or a stylus pen), and drive corresponding connection devices according to a preset program. Alternatively, the touch panel 1131 may include two parts, namely, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1180, and can receive and execute commands sent by the processor 1180. In addition, the touch panel 1131 can be implemented by using various types, such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 1130 may include other input devices 1132 in addition to the touch panel 1131. In particular, other input devices 1132 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 1140 may be used to display information input by the user or information provided to the user, as well as various menus of the mobile phone. The display unit 1140 may include a display panel 1141; optionally, the display panel 1141 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, the touch panel 1131 may cover the display panel 1141; when the touch panel 1131 detects a touch operation on or near it, the touch operation is transmitted to the processor 1180 to determine the type of the touch event, and the processor 1180 then provides a corresponding visual output on the display panel 1141 according to the type of the touch event. Although in fig. 23 the touch panel 1131 and the display panel 1141 are shown as two independent components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 1131 and the display panel 1141 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1141 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1141 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 1160, the speaker 1161, and the microphone 1162 may provide an audio interface between the user and the mobile phone. The audio circuit 1160 may transmit the electrical signal converted from received audio data to the speaker 1161, which converts it into a sound signal for output; on the other hand, the microphone 1162 converts collected sound signals into electrical signals, which are received by the audio circuit 1160 and converted into audio data. The audio data is then output to the processor 1180 for processing and subsequently transmitted to, for example, another mobile phone via the RF circuit 1110, or output to the memory 1120 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 1170, the mobile phone can help the user receive and send e-mails, browse webpages, access streaming media, and the like, providing wireless broadband internet access for the user. Although fig. 23 shows the WiFi module 1170, it is understood that it is not an essential component of the mobile phone and may be omitted as needed.
The processor 1180 is a control center of the mobile phone, and is connected to various parts of the whole mobile phone through various interfaces and lines, and executes various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1120 and calling data stored in the memory 1120, thereby performing overall monitoring of the mobile phone. Optionally, processor 1180 may include one or more processing units; preferably, the processor 1180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated within processor 1180.
The mobile phone also includes a power supply 1190 (e.g., a battery) for powering the various components. Preferably, the power supply may be logically connected to the processor 1180 via a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In this embodiment of the application, the processor 1180 included in the terminal may execute the function of the first client in the embodiment shown in fig. 2 or may execute the function of the second client in the embodiment shown in fig. 14, which is not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (15)

1. An augmented reality-based virtual gift interaction method, comprising:
a first client receives an object scanning instruction, wherein the object scanning instruction is used for starting an image acquisition device;
the first client displays a prompt message and an image to be recognized on an object recognition interface according to the object scanning instruction, wherein the prompt message is a message generated based on a virtual gift, and the image to be recognized is an image acquired by the image acquisition device;
if the image to be recognized is successfully matched with the prompt message, the first client generates an augmented reality animation corresponding to the virtual gift according to the image to be recognized and an animation playing effect;
and the first client plays the augmented reality animation corresponding to the virtual gift on the image to be identified.
2. The method of claim 1, wherein the first client receives object scanning instructions, comprising:
the first client displays a session prompt interface, wherein the session prompt interface is used for providing a scanning interface;
and the first client receives the object scanning instruction through the scanning interface.
3. The method according to claim 1, wherein the first client displays a prompt message and an image to be recognized on an object recognition interface according to the object scanning instruction, and the method comprises:
the first client displays the prompt message and the identification area on the object identification interface according to the object scanning instruction;
and the first client displays a target image on the object recognition interface according to the object scanning instruction, and displays the image to be recognized in a highlighted mode in the recognition area, wherein the image to be recognized belongs to one part of the target image.
4. The method of claim 1, further comprising:
the first client sends the image to be identified to a server so that the server identifies the image to be identified based on a stored image set to obtain an identification result, wherein the stored image set comprises at least one stored image;
if the virtual gift is identified as the identification result, the first client determines that the image to be identified is successfully matched with the prompt message;
if the virtual gift is not identified in the identification result, the first client side continues to shoot a real-time image through the image acquisition device and sends the real-time image to the server, so that the server identifies the real-time image.
5. The method according to claim 1, wherein the first client generating an augmented reality animation according to the image to be recognized and an animation playing effect comprises:
the first client obtains media information to be played, wherein the media information to be played is set by a second client;
and the first client generates the augmented reality animation according to the image to be recognized, the animation playing effect and the media information to be played.
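The composition step in claim 5 amounts to combining three inputs into one renderable animation description. A minimal sketch, with all field names assumed for illustration:

```python
def generate_ar_animation(image_to_recognize: str,
                          playing_effect: str,
                          media_to_play: str) -> dict:
    """Compose the AR animation the first client renders over the recognized image."""
    return {
        "anchor_image": image_to_recognize,  # real-world image the animation overlays
        "effect": playing_effect,            # animation playing effect chosen by the sender
        "media": media_to_play,              # media information set by the second client
    }
```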
6. The method according to any one of claims 1 to 5, further comprising:
the first client receives an interface return instruction;
the first client displays a session prompt interface according to the interface return instruction, wherein the session prompt interface is used for providing an information viewing interface and an animation reproduction interface;
and the first client displays the image corresponding to the virtual gift on the session prompt interface.
7. The method of claim 6, wherein after the first client displays the session prompt interface according to the interface return instruction, the method further comprises:
the first client receives an information viewing instruction through the information viewing interface;
and the first client displays a gift detail interface according to the information viewing instruction, wherein the gift detail interface is used for displaying at least one of sender information, receiver information and blessing message information of the virtual gift.
8. The method of claim 6, wherein after the first client displays the session prompt interface according to the interface return instruction, the method further comprises:
the first client receives an animation reproduction instruction through the animation reproduction interface;
the first client displays the object recognition interface according to the animation reproduction instruction;
and the first client displays the prompt message and a real-time image captured by the image acquisition device on the object recognition interface.
9. An augmented reality-based virtual gift interaction method, comprising:
a second client receives an object selection instruction, wherein the object selection instruction carries an identifier of a virtual gift;
the second client displays the virtual gift on an object setting interface according to the object selection instruction;
the second client receives an object selection instruction through an object selection interface provided by the object setting interface, wherein the object selection instruction carries a receiver identifier of the virtual gift;
the second client receives an object transmission instruction and sends the virtual gift to a first client according to the object transmission instruction, wherein the object transmission instruction at least carries the identifier of the virtual gift and an identifier of a receiver of the virtual gift, and the identifier of the receiver of the virtual gift corresponds to the first client.
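The sender-side delivery in claim 9 reduces to resolving the receiver identifier to its client. The sketch below is illustrative only; the dataclass fields and the routing table are assumptions, not terms from the patent:

```python
from dataclasses import dataclass


@dataclass
class ObjectTransmissionInstruction:
    gift_id: str      # identifier of the virtual gift
    receiver_id: str  # identifier of the gift's receiver, mapped to the first client


# Server-side mapping from receiver identifier to the receiver's client.
RECEIVER_TO_CLIENT = {"receiver_001": "first_client"}


def send_virtual_gift(instruction: ObjectTransmissionInstruction) -> str:
    """Resolve the receiver identifier to its client and deliver the gift there."""
    client = RECEIVER_TO_CLIENT[instruction.receiver_id]
    return f"{instruction.gift_id} -> {client}"
```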
10. The method of claim 9, wherein the object setting interface comprises a date setting interface, and when the object transmission instruction carries target date information, the method further comprises:
the second client acquires the target date information through the date setting interface, wherein the target date information indicates the date on which the first client receives the virtual gift;
or the object setting interface comprises a blessing message filling area, and when the object transmission instruction carries blessing message information, the method further comprises:
the second client acquires the blessing message through the blessing message filling area, wherein the blessing message is displayed on a gift detail interface of the first client;
or the object setting interface comprises a package selection interface, and when the object transmission instruction carries an identifier of a target package, the method further comprises:
the second client receives a package selection instruction through the package selection interface, wherein the package selection instruction carries the identifier of the target package, the identifier of the target package uniquely corresponds to the target package, and the target package is a package image displayed by the first client when the virtual gift is received;
or when the object setting interface further comprises an effect selection interface and the object transmission instruction further carries an identifier of an animation playing effect, the method further comprises:
and the second client receives an effect selection instruction through the effect selection interface, wherein the effect selection instruction carries the identifier of the animation playing effect, the identifier of the animation playing effect corresponds to the animation playing effect, and the animation playing effect is used by the first client to generate the augmented reality animation.
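Claim 10's alternatives all attach optional fields to the object transmission instruction. One way to sketch this, with every field name assumed for illustration, is a builder that includes only the options the sender actually set:

```python
def build_transmission_instruction(gift_id: str, receiver_id: str, **options) -> dict:
    """Assemble the instruction; only the optional fields the sender set are attached."""
    instruction = {"gift_id": gift_id, "receiver_id": receiver_id}
    # Optional fields from the date, blessing, package, and effect sub-interfaces.
    for key in ("target_date", "blessing_message", "package_id", "effect_id"):
        if key in options:
            instruction[key] = options[key]
    return instruction
```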
11. The method of claim 9 or 10, wherein the object setting interface further comprises a media information addition interface, and the method further comprises:
the second client receives a media adding instruction through the media information addition interface;
the second client displays a media information selection interface according to the media adding instruction, wherein the media information selection interface displays at least one piece of media information;
the second client receives a media uploading instruction;
the second client determines media information to be played from the at least one piece of media information according to the media uploading instruction;
and when the second client sends the object transmission instruction to a server, the second client sends the media information to be played to the server, so that the server stores the media information to be played, wherein the media information to be played is used by the first client to generate the augmented reality animation.
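The media flow in claim 11 can be sketched as a select-then-upload pair, with the server storing the media for the first client's later animation generation. All names here are illustrative assumptions:

```python
# Server-side storage, keyed by virtual-gift identifier (illustrative).
SERVER_MEDIA_STORE = {}


def select_media(available_media, index):
    """Second client: pick the media information to play from the list displayed."""
    return available_media[index]


def upload_media(gift_id, media):
    """Upload the chosen media alongside the object transmission instruction;
    the server stores it for the first client's AR animation generation."""
    SERVER_MEDIA_STORE[gift_id] = media
    return SERVER_MEDIA_STORE[gift_id]
```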
12. An augmented reality-based virtual gift interaction device, comprising:
a receiving module, configured to receive an object scanning instruction, wherein the object scanning instruction is used for starting an image acquisition device;
a display module, configured to display a prompt message and an image to be recognized on an object recognition interface according to the object scanning instruction, wherein the prompt message is a message generated based on a virtual gift, and the image to be recognized is an image acquired by the image acquisition device;
a generating module, configured to generate an augmented reality animation corresponding to the virtual gift according to the image to be recognized and an animation playing effect if the image to be recognized successfully matches the prompt message;
and a playing module, configured to play the augmented reality animation corresponding to the virtual gift on the image to be recognized.
13. An augmented reality-based virtual gift interaction device, comprising:
a receiving module, configured to receive an object selection instruction, wherein the object selection instruction carries an identifier of a virtual gift;
a display module, configured to display the virtual gift on an object setting interface according to the object selection instruction;
wherein the receiving module is further configured to receive an object selection instruction through an object selection interface provided by the object setting interface, the object selection instruction carrying a receiver identifier of the virtual gift;
and the receiving module is further configured to receive an object transmission instruction and send the virtual gift to a first client according to the object transmission instruction, wherein the object transmission instruction at least carries the identifier of the virtual gift and an identifier of a receiver of the virtual gift, and the identifier of the receiver of the virtual gift corresponds to the first client.
14. A terminal device, comprising: a memory and a processor;
wherein the memory is used for storing programs;
the processor is configured to execute the program in the memory and, according to instructions in the program, perform the method of any one of claims 1 to 8 or the method of any one of claims 9 to 11.
15. A computer-readable storage medium comprising instructions which, when executed on a computer, cause the computer to perform the method of any of claims 1 to 8, or perform the method of any of claims 9 to 11.
CN202010129769.9A 2020-02-28 2020-02-28 Virtual gift interaction method based on augmented reality and related device Pending CN113325946A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010129769.9A CN113325946A (en) 2020-02-28 2020-02-28 Virtual gift interaction method based on augmented reality and related device

Publications (1)

Publication Number Publication Date
CN113325946A (en) 2021-08-31

Family

ID=77412729

Country Status (1)

Country Link
CN (1) CN113325946A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115016688A (en) * 2022-06-28 2022-09-06 维沃移动通信有限公司 Virtual information display method and device and electronic equipment
WO2024099278A1 (en) * 2022-11-07 2024-05-16 北京有竹居网络技术有限公司 Interaction method and apparatus, electronic device, and computer readable medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140164159A1 (en) * 2012-12-11 2014-06-12 Christine Lovelace Customizable virtual gift wrapping & presentation
CN106686393A (en) * 2016-12-19 2017-05-17 广州华多网络科技有限公司 Virtual gift giving method and device
CN107820132A (en) * 2017-11-21 2018-03-20 广州华多网络科技有限公司 Living broadcast interactive method, apparatus and system
CN109194973A (en) * 2018-09-26 2019-01-11 广州华多网络科技有限公司 A kind of more main broadcaster's direct broadcasting rooms give the methods of exhibiting, device and equipment of virtual present
CN110781421A (en) * 2019-08-13 2020-02-11 腾讯科技(深圳)有限公司 Virtual resource display method and related device



Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40050665

Country of ref document: HK

SE01 Entry into force of request for substantive examination