CN112827171A - Interaction method, interaction device, electronic equipment and storage medium - Google Patents

Interaction method, interaction device, electronic equipment and storage medium

Info

Publication number
CN112827171A
Authority
CN
China
Prior art keywords
user
image
game resource
resource
game
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110152534.6A
Other languages
Chinese (zh)
Inventor
黄亦辰
吴彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202110152534.6A priority Critical patent/CN112827171A/en
Publication of CN112827171A publication Critical patent/CN112827171A/en
Priority to PCT/CN2022/071705 priority patent/WO2022166551A1/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiments of the present disclosure relate to an interaction method, an interaction device, an electronic device and a storage medium, wherein the method includes: responding to a game resource extraction request of a user, and displaying a game resource extraction scene, the game resource extraction scene including a drawing area; receiving an image drawn by the user in the drawing area; and acquiring a target game resource matched with the image based on the drawn image, taking the target game resource as a game resource extraction result of the user, and displaying the target game resource. The method and device can improve the probability that the user extracts a specific game resource, enrich the extraction mechanisms of resource-extraction games, and improve the interest and interactivity of the game.

Description

Interaction method, interaction device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interaction method, an interaction apparatus, an electronic device, and a storage medium.
Background
With the development of computer technology, more and more applications provide virtual resources that can be used within the application, and users unlock these virtual resources by drawing cards, spinning a roulette wheel, and similar mechanisms.
Card drawing is a common way of obtaining virtual resources in card games, and different game resources correspond to different drawing probabilities. In existing card-drawing schemes, the probability that a user draws a given game resource is usually fixed, and the probability of drawing a high-level game resource is usually low, which harms the user's game experience; in addition, the single card-drawing mechanism does little to increase game interactivity.
Disclosure of Invention
In order to solve, or at least partially solve, the above technical problem, embodiments of the present disclosure provide an interaction method, an interaction apparatus, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides an interaction method, including:
responding to a game resource extraction request of a user, and displaying a game resource extraction scene; the game resource extraction scene comprises a drawing area;
receiving an image drawn by the user in the drawing area;
and acquiring a target game resource matched with the image based on the drawn image, taking the target game resource as a game resource extraction result of the user, and displaying the target game resource.
In a second aspect, an embodiment of the present disclosure further provides an interaction apparatus, including:
the resource extraction scene display module is used for responding to a game resource extraction request of a user and displaying a game resource extraction scene; the game resource extraction scene comprises a drawing area;
the drawing image receiving module is used for receiving the image drawn in the drawing area by the user;
and the target game resource acquisition module is used for acquiring the target game resource matched with the image based on the drawn image, taking the target game resource as the game resource extraction result of the user, and displaying the target game resource.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including a memory and a processor, where the memory stores a computer program, and when the computer program is executed by the processor, the electronic device is enabled to implement any of the interaction methods provided in the embodiments of the present disclosure.
In a fourth aspect, the present disclosure also provides a computer-readable storage medium, where a computer program is stored in the storage medium, and when the computer program is executed by a computing device, the computing device is caused to implement any one of the interaction methods provided by the embodiments of the present disclosure.
Compared with the prior art, the technical solutions provided by the embodiments of the present disclosure have at least the following advantages:
in the embodiments of the present disclosure, for a resource-extraction game, the image drawn by the user in the drawing area is acquired, and a target game resource is matched based on the image and used as the user's resource extraction result. This solves the problems in existing schemes that the probability of a user extracting a specific game resource is low and that the resource extraction mechanism is single: it improves the probability that the user extracts the specific game resource, enriches the extraction mechanisms of resource-extraction games, and improves the interest and interactivity of the game. For example, for high-level resources in some games, the probability of a user drawing such a resource is usually extremely low; with the technical solution provided by the embodiments of the present disclosure, the user can draw an image related to the high-level resource and have that resource matched based on the image, so the efficiency with which the user obtains high-level game resources is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; it is obvious that other drawings can be obtained by those skilled in the art from these drawings without inventive effort.
Fig. 1 is a flowchart of an interaction method provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a game resource extraction scenario provided by an embodiment of the present disclosure;
FIG. 3 is a flow chart of another interaction method provided by the embodiments of the present disclosure;
FIG. 4 is a schematic diagram of another game resource extraction scenario provided by an embodiment of the present disclosure;
fig. 5 is a schematic diagram illustrating a display effect of a user virtual character or a specific virtual character according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a display effect of another user virtual character or a specific virtual character provided in the embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an interaction apparatus according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced in other ways than those described herein; it is to be understood that the embodiments disclosed in the specification are only a few embodiments of the present disclosure, and not all embodiments.
Fig. 1 is a flowchart of an interaction method provided in an embodiment of the present disclosure, which is applicable to implementing resource extraction in a resource-extraction game; the interaction method may be performed by an interaction apparatus. The interaction apparatus may be implemented in software and/or hardware and may be integrated on any electronic device with computing capability, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, or another terminal.
As shown in fig. 1, an interaction method provided by the embodiment of the present disclosure may include:
s101, responding to a game resource extraction request of a user, and displaying a game resource extraction scene; the game resource drawing scene includes a drawing area.
Illustratively, while playing a game, a user may trigger a game resource extraction request by touching a resource extraction control or a resource extraction prop icon on the game interface. In response to the request, the electronic device displays a game resource extraction scene, which is the overall interface shown to the user while resources are being extracted; the scene includes, but is not limited to, a drawing area that serves as the response area for the image the user draws. The specific layout of the game resource extraction scene can be designed according to game development requirements and is not specifically limited by the embodiments of the present disclosure.
Fig. 2 is a schematic diagram of a game resource extraction scene provided in an embodiment of the present disclosure, which is used for illustrating the embodiments of the present disclosure by way of example and should not be construed as a specific limitation. As shown in fig. 2, the drawing area may be displayed in the middle of the game resource extraction scene. After finishing the drawing, the user may submit the drawn image by touching a "submit" control (not shown in fig. 2) in the drawing area. The drawing operations available to the user may include, but are not limited to, drawing graphics, adding colors, and the like, and selectable drawing tools such as brushes of different line thicknesses and an eraser are provided for the user in the game resource extraction scene.
S102, receiving an image drawn in the drawing area by the user.
For example, the electronic device may determine an image drawn by the user in the drawing area in response to a drawing image submission operation of the user, and may further determine a target game resource matching the image in the game resource library.
In an alternative embodiment, receiving the image drawn by the user in the drawing area may include:
monitoring the sliding operation of a user in the drawing area, and acquiring the sliding track of the sliding operation;
and determining an image drawn in the drawing area by the user based on the sliding track in response to the fact that the touch object for generating the sliding operation leaves the drawing area or no new sliding operation is monitored within a preset time interval after the sliding operation is stopped.
The user may slide in the drawing area with a fingertip or any other available touch tool (such as a stylus) to generate a sliding track. If the touch object (i.e., the user's fingertip or the touch tool) leaves the drawing area, for example, the touch object moves out of the drawing area or leaves the game interface, the drawn image may be determined based on the sliding track already generated. Alternatively, if no new sliding operation is detected within a preset time interval after the user's sliding operation stops (for example, x seconds, which can be set according to actual requirements), the sliding operation is considered finished and the drawn image is determined based on the generated sliding track. In the embodiments of the present disclosure, the sliding track may include a track defining the outline of the graphic, a sliding track filling in colors, and the like; that is, when the drawn image is determined based on the sliding track, the colors of the drawn image may also be determined based on the user's color-filling operations, so that the final image effect drawn by the user is obtained.
By monitoring the user's sliding operation, the user's valid drawing can be determined in time. This is especially suitable when the user is granted only a single stroke (that is, the user must finish the drawing in one stroke, without interrupting the sliding track), which improves the efficiency of determining the user's valid image and therefore the efficiency of resource extraction. Moreover, when only one stroke is allowed, the image drawn by the user is usually a simple graphic, so the backend can perform compliance checks on the drawn image more efficiently and prevent the user from drawing non-compliant, sensitive images.
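For illustration only, the following Python sketch shows one possible implementation of the single-stroke capture described above (a stroke ends when the touch object leaves the drawing area or when no new sliding is detected within a preset interval). The class name, the callback names and the timeout value are assumptions made for this sketch and are not part of the disclosed implementation.

```python
import time

IDLE_TIMEOUT = 2.0  # hypothetical "x seconds" with no new sliding operation


class StrokeCollector:
    """Collects a single sliding track inside the drawing area (illustrative only)."""

    def __init__(self, drawing_area_rect):
        self.rect = drawing_area_rect      # (x, y, width, height) of the drawing area
        self.track = []                    # accumulated (x, y) points of the sliding track
        self.last_move_time = None
        self.finished = False

    def _inside(self, x, y):
        rx, ry, rw, rh = self.rect
        return rx <= x <= rx + rw and ry <= y <= ry + rh

    def on_touch_move(self, x, y):
        """Called while the fingertip / stylus slides across the screen."""
        if self.finished:
            return
        if not self._inside(x, y):
            # Touch object left the drawing area: treat the existing track as the drawn image.
            self.finished = True
            return
        self.track.append((x, y))
        self.last_move_time = time.monotonic()

    def on_idle_check(self):
        """Called periodically; ends the stroke if no new sliding is seen for IDLE_TIMEOUT."""
        if (not self.finished and self.last_move_time is not None
                and time.monotonic() - self.last_move_time > IDLE_TIMEOUT):
            self.finished = True

    def drawn_track(self):
        """Returns the completed sliding track, or None while the stroke is still open."""
        return self.track if self.finished else None
```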
S103, acquiring a target game resource matched with the image based on the drawn image, taking the target game resource as a game resource extraction result of the user, and displaying the target game resource.
For example, the similarity between the drawn image and each game resource in the game resource library may be calculated, and the target game resource determined according to the similarity; for instance, a game resource whose similarity exceeds a preset threshold (the value can be set adaptively) may be determined as the target game resource. Any available image similarity calculation method may be used, and the embodiments of the present disclosure do not limit it; examples include similarity calculation based on image feature vectors, or based on image edge detection and comparison.
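As a hedged illustration of this matching step, the sketch below flattens the drawn image and each resource's reference image into feature vectors and compares them by cosine similarity. The function names, the assumption that both images are rasterised to the same resolution, and the default threshold are all illustrative; any other image similarity method mentioned above could be substituted.

```python
import math


def flatten(image):
    """Flatten a 2D grid of pixel intensities into a 1D feature vector."""
    return [p for row in image for p in row]


def cosine_similarity(vec_a, vec_b):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(vec_a, vec_b))
    norm_a = math.sqrt(sum(a * a for a in vec_a))
    norm_b = math.sqrt(sum(b * b for b in vec_b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)


def match_target_resource(drawn_image, resource_library, threshold=0.6):
    """Return the resource whose reference image is most similar to the drawn image,
    provided the similarity exceeds the (adaptively settable) preset threshold."""
    best_name, best_score = None, 0.0
    drawn_vec = flatten(drawn_image)
    for name, reference_image in resource_library.items():
        score = cosine_similarity(drawn_vec, flatten(reference_image))
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```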
Optionally, in the embodiments of the present disclosure, the image drawn by the user in the drawing area is an image associated with a virtual character or an image associated with a virtual prop. That is, the embodiments of the present disclosure can be applied both to extracting a virtual character in a game and to extracting a virtual prop in a game. The virtual character may be any character in the game, which differs with the game type, and the virtual prop may be any prop related to a virtual character in the game, such as a weapon, armor or ornament. The embodiments of the present disclosure allow the user to draw any image related to a virtual character or virtual prop in order to extract game resources.
Further, the image associated with the virtual character includes at least one of a character image, a face portrait, an expression portrait, an action portrait, and an accessory (e.g., ornament) portrait of the virtual character. The electronic device can determine the matching virtual character from the character image, face portrait, expression portrait, action portrait or accessory portrait drawn by the user. For example, if the user draws the expression portrait of a certain virtual character, the electronic device may first find a matching target expression portrait by similarity matching in the game resource library, and then, based on the correspondence between the target expression portrait and a target virtual character, determine that target virtual character as the object extracted by the user.
The target game resource includes a card associated with the virtual character or the virtual prop. For example, the card associated with the virtual character includes, but is not limited to, an image display card of the virtual character, and the card associated with the virtual prop includes, but is not limited to, a style display card of the virtual prop. In particular, the image display card can be implemented in the form of image display fragments, and the style display card in the form of style display fragments. The game can also be designed so that, after a number of image display fragments are collected, they can be exchanged for the complete virtual character, and, after a number of style display fragments are collected, they can be exchanged for the complete virtual prop.
In an alternative embodiment, based on the drawn image, obtaining a target game resource matching the image includes:
calculating, based on the drawn image, the similarity between the image and each game resource in the game resource library to be extracted; any available similarity calculation method may be adopted, and the embodiments of the present disclosure are not particularly limited in this respect;
determining game resources corresponding to the similarity exceeding a preset threshold (the value can be set adaptively) as candidate resources;
and determining the target game resource from at least one candidate resource based on a preset rule.
The preset rule defines how to determine the target game resource from multiple candidate resources. For example, the preset rule may be to determine the candidate resource with the greatest similarity as the target game resource, or to determine the candidate resource with the greatest initial drawing probability as the target game resource, or to determine the target game resource from both the similarity and the initial drawing probability of each candidate resource, for example by selecting a resource whose initial drawing probability is greater than a first threshold and whose similarity is greater than a second threshold. The initial drawing probability is the drawing hit rate preset for each game resource during game development; the larger the initial drawing probability, the more likely the corresponding game resource is to be drawn by the user. The values of the various thresholds can be set adaptively.
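The candidate-filtering and preset-rule logic described above can be sketched as follows; this is a minimal illustration assuming dictionary inputs and example threshold values, not a definitive implementation of the disclosed rules.

```python
def select_target_resource(similarities, initial_probabilities,
                           similarity_threshold=0.6, probability_threshold=0.01):
    """Pick the target game resource from the candidate resources.

    similarities:          {resource_id: similarity to the drawn image}
    initial_probabilities: {resource_id: preset drawing hit rate}
    The thresholds are illustrative; the disclosure only requires that they be set adaptively.
    """
    # Candidate resources: those whose similarity exceeds the preset threshold.
    candidates = [r for r, s in similarities.items() if s > similarity_threshold]
    if not candidates:
        return None

    # Preset rule variant 1: the candidate with the greatest similarity wins.
    by_similarity = max(candidates, key=lambda r: similarities[r])

    # Preset rule variant 2: consider similarity and initial drawing probability together
    # (initial probability above a first threshold, similarity already above the second).
    eligible = [r for r in candidates
                if initial_probabilities.get(r, 0.0) > probability_threshold]
    return max(eligible, key=lambda r: similarities[r]) if eligible else by_similarity
```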
Optionally, after determining, as the candidate resource, the game resource corresponding to the similarity exceeding the preset threshold, the method provided in the embodiment of the present disclosure further includes:
increasing the initial drawing probability of the candidate resources, so that the target game resource is determined, based on the preset rule, from the at least one candidate resource whose probability has been increased. That is, after the candidate resources are determined by similarity calculation, the initial probability of each candidate resource may be adjusted; for example, each candidate's probability is raised by a preset proportion relative to its initial drawing probability (e.g., by xx%), or a probability-raising ratio is determined from the similarity between each candidate resource and the user's drawn image, and the final probability of each candidate resource is then determined from that ratio; the greater the similarity of a candidate resource, the greater its probability-raising ratio. Further, the candidate resource with the largest raised drawing probability may be determined as the target game resource; of course, the target game resource may also be determined from both the similarity of each candidate resource and its raised drawing probability.
By raising the drawing probability of the candidate resources before determining the target game resource, the probability that the user draws a specific game resource is increased.
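A minimal sketch of this probability-raising step is given below, assuming a linear boost proportional to similarity and a final draw from the boosted distribution; the boost formula, the sampling choice and all names are illustrative assumptions rather than the disclosed rule itself.

```python
import random


def boost_and_draw(candidates, initial_probabilities, similarities, boost_scale=0.5):
    """Raise each candidate's drawing probability in proportion to its similarity,
    then determine the target resource from the boosted probabilities.

    boost_scale and the linear boost are assumptions; the disclosure only requires
    that a higher similarity yield a larger probability increase.
    """
    boosted = {}
    for r in candidates:
        boost_ratio = 1.0 + boost_scale * similarities[r]   # larger similarity -> larger boost
        boosted[r] = initial_probabilities[r] * boost_ratio

    # Option A: deterministically take the candidate with the highest boosted probability.
    # return max(boosted, key=boosted.get)

    # Option B: sample the target resource from the boosted (re-normalised) distribution.
    total = sum(boosted.values())
    weights = [p / total for p in boosted.values()]
    return random.choices(list(boosted), weights=weights, k=1)[0]
```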
In an alternative embodiment, based on the drawn image, obtaining a target game resource matching the image includes:
calculating, based on the drawn image, the similarity between the image and each game resource in the game resource library to be extracted;
and determining the game resource corresponding to the maximum similarity as the target game resource.
After the similarity between the image drawn by the user and each game resource in the game resource library has been determined, the game resource with the greatest similarity can be determined directly as the target game resource, which improves the efficiency with which the user extracts a specific game resource.
In the embodiments of the present disclosure, for a resource-extraction game, the image drawn by the user in the drawing area is acquired, and a target game resource is matched based on the image and used as the user's resource extraction result. This solves the problems in existing schemes that the probability of a user extracting a specific game resource is low and that the resource extraction mechanism is single: it improves the probability that the user extracts the specific game resource, enriches the extraction mechanisms of resource-extraction games, and improves the interest and interactivity of the game. For example, for high-level resources in some games, the probability of a user drawing such a resource is usually extremely low; with the technical solution provided by the embodiments of the present disclosure, the user can draw an image related to the high-level resource and have that resource matched based on the image, so the efficiency with which the user obtains high-level game resources is improved.
Fig. 3 is a flowchart of another interaction method provided in an embodiment of the present disclosure, which further optimizes and expands the above technical solution and can be combined with the optional embodiments above. In the embodiments of the present disclosure, in addition to the drawing area, the game resource extraction scene may further include template elements, which serve as basic building blocks for the image drawn by the user. The template elements may include character image materials, expression materials, face materials, action materials, accessory materials, virtual prop materials and the like associated with the game resources. Providing template elements improves the convenience and efficiency with which the user draws images.
As shown in fig. 3, an interaction method provided by the embodiment of the present disclosure may include:
s301, responding to a game resource extraction request of a user, and displaying a game resource extraction scene; the game resource extraction scene comprises a drawing area and template elements.
Fig. 4 is a schematic diagram of another game resource extraction scene provided in an embodiment of the present disclosure, which is used to illustrate the embodiments of the present disclosure by way of example and should not be construed as a specific limitation; the display positions of the drawing area and the template elements in the game resource extraction scene can be set flexibly according to the game interface layout. As an example, FIG. 4 shows the template elements in the right-hand area of the game resource extraction scene. Each template element shown in fig. 4 is only an example; the specific template elements may be preset as required.
S302, responding to the selection operation of the template elements by the user, and determining at least one target template element selected by the user.
For example, the user may click on a template element; the electronic device determines the target template element selected by the user from the click operation and displays it in the drawing area. The user may also drag a template element into the drawing area, in which case the dragged template element is likewise the target template element selected by the user.
S303, determining a drawn image based on the combination of at least one target template element and/or the drawing operation of the user in the drawing area.
In the drawing area, the user can combine the target elements, for example by moving and splicing them, to obtain a target stitched image, which can be used directly as the user's drawn image. The user's drawing operation can also be combined with the target stitched image to obtain the final drawn image; here the drawing operation may be the drawing of a new image independent of the target stitched image, or a redrawing (or editing) of the target stitched image, such as line editing or color editing. Of course, the drawn image may also be determined directly from the user's drawing operation alone.
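For illustration, the sketch below composes the drawn image from the selected target template elements and the user's own strokes on a simple pixel grid; the data structures (bitmaps, stroke point lists) are assumptions chosen only to make the combination step concrete, and the resulting grid has the same form as the raster images used in the earlier similarity sketch.

```python
def compose_drawn_image(canvas_size, placed_elements, user_strokes):
    """Rasterise the user's submission: first stamp the selected template elements
    at their dragged positions, then overlay the user's own drawing operations.

    placed_elements: list of (element_bitmap, (offset_x, offset_y)) tuples
    user_strokes:    list of (x, y) points produced by the drawing operation
    """
    width, height = canvas_size
    canvas = [[0] * width for _ in range(height)]

    # Stamp each target template element onto the canvas at its dragged position.
    for bitmap, (ox, oy) in placed_elements:
        for y, row in enumerate(bitmap):
            for x, pixel in enumerate(row):
                if pixel and 0 <= oy + y < height and 0 <= ox + x < width:
                    canvas[oy + y][ox + x] = pixel

    # Overlay the user's free-hand drawing (new lines or edits) on top.
    for x, y in user_strokes:
        if 0 <= y < height and 0 <= x < width:
            canvas[y][x] = 1

    return canvas
```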
S304, acquiring the target game resource matched with the image based on the drawn image, taking the target game resource as a game resource extraction result of the user, and displaying the target game resource.
In this embodiment of the present disclosure, optionally, receiving an image drawn by a user in the drawing area includes:
timing the drawing time; the timing starting time is the time when the game resource extraction scene starts to be displayed or the time when the user triggers the drawing area;
determining an image drawn by a user in a drawing area according to the drawing operation of the user within the preset time; wherein the preset drawing time may be adaptively determined according to a game design.
As shown in fig. 4, time information may also be displayed in the game resource extraction scene (its exact position can be determined by the interface layout) to remind the user of the drawing time currently available. This not only improves the user's game experience and allows the user's valid drawn image to be determined within a limited time, but also controls the time consumed by resource extraction, preventing the extraction from taking too long because the user draws for too long and thereby preventing the efficiency of resource extraction from degrading.
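A minimal sketch of the drawing-time control is shown below; the 30-second limit and the class interface are assumptions, since the disclosure only requires that timing start when the scene is displayed or when the user first triggers the drawing area, and that only drawing operations within the preset time be counted.

```python
import time


class DrawingTimer:
    """Limits the time available for drawing (illustrative only)."""

    def __init__(self, limit_seconds=30.0):
        self.limit = limit_seconds
        self.start = None

    def start_timing(self):
        """Call when the extraction scene is shown or the drawing area is first touched."""
        self.start = time.monotonic()

    def remaining(self):
        """Remaining drawing time, suitable for display as the on-screen time information."""
        if self.start is None:
            return self.limit
        return max(0.0, self.limit - (time.monotonic() - self.start))

    def accept_operation(self):
        """Only drawing operations arriving within the preset time count towards the image."""
        return self.remaining() > 0.0
```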
On the basis of the foregoing technical solution, optionally, before responding to a game resource extraction request of a user and displaying a game resource extraction scene, the method provided in the embodiment of the present disclosure further includes:
displaying a game resource extraction guide animation of a user virtual character corresponding to the user or of a specific virtual character; the user virtual character corresponding to the user may be the character the user plays in the game or a character the user controls, and the specific virtual character may be an interactive character designed for the resource extraction scenario in the game;
correspondingly, after receiving the image drawn in the drawing area by the user, the method further comprises the following steps:
based on the drawn image, an expression and/or an action of the user virtual character or the specific virtual character is determined, and the user virtual character or the specific virtual character is presented based on the determined expression and/or action.
For example, after receiving the image drawn by the user, the electronic device may determine, according to preset interaction-feedback logic, the expression and/or action to be exhibited by the user virtual character or the specific virtual character; or the electronic device may determine that expression and/or action according to how well the image matches the game resources in the game resource library. For example, if a target game resource matching the user's drawn image is found in the game resource library, the expression to be displayed by the user virtual character or the specific virtual character can be determined to be happy, and the action to be displayed an action associated with happiness, such as a dance; if no matching target game resource is found, the expression to be displayed can be determined to be disappointed or sad, and the action to be displayed an action associated with disappointment or sadness. In addition, after receiving the drawn image, the electronic device may also score the image according to a preset image evaluation rule and determine the expression and/or action from the score: a high score leads to an expression and/or action associated with happiness, and a low score to one associated with disappointment or sadness.
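The interaction-feedback logic above can be sketched as a simple mapping from the matching result (or an image score) to an expression and action; the labels, threshold and rule below are illustrative assumptions rather than the disclosed feedback logic itself.

```python
def character_feedback(target_resource, image_score=None, score_threshold=0.5):
    """Choose the expression/action the user virtual character (or the specific guide
    character) should display after the drawn image is received.

    A successful match, or a high image evaluation, yields a happy reaction;
    otherwise the character reacts with disappointment.
    """
    matched = target_resource is not None
    well_drawn = image_score is not None and image_score >= score_threshold
    if matched or well_drawn:
        return {"expression": "happy", "action": "dance"}
    return {"expression": "disappointed", "action": "slump"}
```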
Fig. 5 is a schematic view of a display effect of a user virtual character or a specific virtual character provided in an embodiment of the present disclosure; it shows the display effect when the expression to be displayed is happy and the action to be displayed is an action associated with happiness.
Fig. 6 is a schematic view of a display effect of another user virtual character or a specific virtual character provided in an embodiment of the present disclosure; it shows the display effect when the expression to be displayed is disappointed and the action to be displayed is an action associated with disappointment.
It should be understood that fig. 5 and 6 are intended as an example only and should not be construed as specifically limiting the embodiments of the present disclosure.
By displaying a game resource extraction guide animation of the user virtual character corresponding to the user or of a specific virtual character, and determining the expression and/or action of that character from the user's drawn image, the resource extraction becomes more entertaining, the interactivity of the game increases, and the user's game experience improves.
In the embodiments of the present disclosure, for a resource-extraction game, the image drawn by the user is first determined from a combination of at least one target template element selected by the user in the game resource extraction scene and/or the user's drawing operation in the drawing area, which enriches the ways in which the user can draw. A target game resource is then matched based on the user's drawn image and used as the user's resource extraction result. This solves the problems in existing schemes that the probability of a user extracting a specific game resource is low and that the resource extraction mechanism is single: it improves the probability that the user extracts the specific game resource, enriches the extraction mechanisms of resource-extraction games, improves the interest and interactivity of the game, and improves the user's game experience.
Fig. 7 is a schematic structural diagram of an interaction apparatus provided in an embodiment of the present disclosure. The apparatus may be implemented in software and/or hardware and may be integrated on any electronic device with computing capability, such as a smart phone, tablet computer, notebook computer or desktop computer.
As shown in fig. 7, the interaction apparatus 500 provided in the embodiment of the present disclosure may include a resource extraction scene presentation module 501, a drawing image receiving module 502, and a target game resource obtaining module 503, where:
a resource extraction scene display module 501, configured to respond to a game resource extraction request of a user and display a game resource extraction scene; the game resource extraction scene comprises a drawing area;
a drawing image receiving module 502, configured to receive an image drawn in a drawing area by a user;
and a target game resource obtaining module 503, configured to obtain, based on the drawn image, a target game resource matched with the image, as a game resource extraction result of the user, and display the target game resource.
Optionally, the game resource extraction scenario further includes a template element; the drawing image receiving module 502 includes:
a target template element determination unit, configured to determine at least one target template element selected by a user in response to a selection operation of the template element by the user;
a first drawing image determination unit for determining a drawn image based on a combination of at least one target template element and/or a drawing operation of a user in the drawing area.
Optionally, the drawing image receiving module 502 includes:
the sliding track acquiring unit is used for monitoring the sliding operation of a user in the drawing area and acquiring the sliding track of the sliding operation;
and the second drawn image determining unit is used for determining an image drawn in the drawing area by the user based on the sliding track in response to that the touch object used for generating the sliding operation leaves the drawing area or no new sliding operation is monitored within a preset time interval after the sliding operation is stopped.
Optionally, the drawing image receiving module 502 includes:
a timing unit for timing the drawing time; the timing starting time is the time when the game resource extraction scene starts to be displayed or the time when the user triggers the drawing area;
and the third drawing image determining unit is used for determining the image drawn in the drawing area by the user according to the drawing operation of the user in the preset time.
Optionally, the target game resource obtaining module 503 includes:
the similarity calculation unit is used for calculating the similarity between the image and each game resource in the game resource library to be extracted based on the drawn image;
the candidate resource determining unit is used for determining the game resources corresponding to the similarity exceeding the preset threshold as candidate resources;
the first target game resource determining unit is used for determining a target game resource from at least one candidate resource based on a preset rule.
Optionally, the target game resource obtaining module 503 further includes:
and the probability improving unit is used for improving the initial drawing probability of the candidate resources so as to determine the target game resources from at least one candidate resource after the probability improvement based on a preset rule.
Optionally, the target game resource obtaining module 503 includes:
the similarity calculation unit is used for calculating the similarity between the image and each game resource in the game resource library to be extracted based on the drawn image;
and the second target game resource determining unit is used for determining the game resource corresponding to the maximum similarity as the target game resource.
Optionally, the apparatus 500 provided in the embodiment of the present disclosure further includes:
the guide animation display module is used for displaying a game resource extraction guide animation of a user virtual character corresponding to the user or of a specific virtual character;
and the virtual character display module is used for determining the expression and/or action of the user virtual character or the specific virtual character based on the drawn image and displaying the user virtual character or the specific virtual character based on the determined expression and/or action.
Optionally, the drawn image includes an image associated with the virtual character, or an image associated with the virtual prop; the target game resource includes cards associated with the virtual character or virtual item.
Optionally, the image associated with the virtual character comprises at least one of a character image, a face portrait, an expression portrait, a movement portrait and an accessory portrait of the virtual character;
the cards associated with the virtual characters comprise image display cards of the virtual characters;
the cards associated with the virtual items include style display cards of the virtual items.
The interaction apparatus provided by the embodiments of the present disclosure can execute any interaction method provided by the embodiments of the present disclosure, and has the corresponding functional modules and beneficial effects of the executed method. For details not described in the apparatus embodiments of the present disclosure, reference may be made to the description of any method embodiment of the present disclosure.
Fig. 8 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure, which exemplarily illustrates an electronic device implementing an interaction method provided in the embodiments of the present disclosure. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, smart home devices, wearable electronic devices, servers, and the like. The electronic device shown in fig. 8 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 8, the electronic device 600 includes one or more processors 601 and memory 602.
The processor 601 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 600 to perform desired functions.
The memory 602 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory may include, for example, random access memory (RAM) and/or cache memory. Non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 601 to implement the interaction methods provided by the embodiments of the present disclosure, as well as other desired functions. Various contents such as an input signal, a signal component, a noise component, etc. may also be stored in the computer-readable storage medium.
The interaction method provided by the embodiment of the disclosure may include: responding to a game resource extraction request of a user, and displaying a game resource extraction scene; the game resource extraction scene comprises a drawing area; receiving an image drawn by a user in the drawing area; and acquiring a target game resource matched with the image based on the drawn image, taking the target game resource as a game resource extraction result of the user, and displaying the target game resource. It should be understood that electronic device 600 may also perform other alternative embodiments provided by the disclosed method embodiments.
In one example, the electronic device 600 may further include: an input device 603 and an output device 604, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 603 may also include, for example, a keyboard, a mouse, and the like.
The output device 604 may output various information including the determined distance information, direction information, and the like to the outside. The output devices 604 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, among others.
Of course, for simplicity, only some of the components of the electronic device 600 relevant to the present disclosure are shown in fig. 8, omitting components such as buses, input/output interfaces, and the like. In addition, electronic device 600 may include any other suitable components depending on the particular application.
In addition to the methods and apparatus described above, the disclosed embodiments also provide a computer program product comprising a computer program or computer program instructions that, when executed by a computing device, cause the computing device to implement any of the interaction methods provided by the disclosed embodiments.
Program code for carrying out the operations of the embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's electronic device, partly on the user's electronic device, as a stand-alone software package, partly on the user's electronic device and partly on a remote electronic device, or entirely on the remote electronic device.
Furthermore, the disclosed embodiments may also provide a computer-readable storage medium having stored thereon computer program instructions that, when executed by a computing device, cause the computing device to implement any of the interaction methods provided by the disclosed embodiments.
The interaction method provided by the embodiment of the disclosure may include: responding to a game resource extraction request of a user, and displaying a game resource extraction scene; the game resource extraction scene comprises a drawing area; receiving an image drawn by a user in the drawing area; and acquiring a target game resource matched with the image based on the drawn image, taking the target game resource as a game resource extraction result of the user, and displaying the target game resource. It should be understood that the computer program instructions, when executed by a computing device, may also cause the computing device to implement other alternative embodiments provided by the disclosed method embodiments.
A computer-readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present disclosure, which enable those skilled in the art to understand or practice the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (13)

1. An interaction method, comprising:
responding to a game resource extraction request of a user, and displaying a game resource extraction scene; the game resource extraction scene comprises a drawing area;
receiving an image drawn by the user in the drawing area;
and acquiring a target game resource matched with the image based on the drawn image, taking the target game resource as a game resource extraction result of the user, and displaying the target game resource.
2. The method of claim 1, wherein the game resource extraction scenario further comprises a template element;
the receiving an image drawn by the user in the drawing area comprises:
in response to the selection operation of the user on the template elements, determining at least one target template element selected by the user;
determining the drawn image based on a combination of the at least one target template element and/or a drawing operation of the user in the drawing area.
3. The method of claim 1, wherein receiving the image drawn by the user in the drawing area comprises:
monitoring the sliding operation of the user in the drawing area, and acquiring the sliding track of the sliding operation;
and determining an image drawn in the drawing area by the user based on the sliding track in response to that the touch object for generating the sliding operation leaves the drawing area or no new sliding operation is monitored within a preset time interval after the sliding operation is stopped.
4. The method of claim 1, wherein receiving the image drawn by the user in the drawing area comprises:
timing the drawing time; the starting time of timing is the time when the game resource extraction scene starts to be displayed or the time when the user triggers the drawing area;
and determining the image drawn in the drawing area by the user according to the drawing operation of the user in the preset time.
5. The method of claim 1, wherein obtaining the target game resource matching the image based on the drawn image comprises:
calculating the similarity between the image and each game resource in the game resource library to be extracted based on the drawn image;
determining game resources corresponding to the similarity exceeding a preset threshold as candidate resources;
and determining the target game resource from at least one candidate resource based on a preset rule.
6. The method according to claim 5, wherein after determining the game resource corresponding to the similarity exceeding the preset threshold as the candidate resource, the method further comprises:
and improving the initial drawing probability of the candidate resources to determine the target game resource from at least one candidate resource after the probability improvement based on the preset rule.
7. The method of claim 1, wherein obtaining the target game resource matching the image based on the drawn image comprises:
calculating the similarity between the image and each game resource in the game resource library to be extracted based on the drawn image;
and determining the game resource corresponding to the maximum similarity as the target game resource.
8. The method of claim 1, further comprising, before said presenting a game resource extraction scenario in response to a game resource extraction request from a user:
displaying a game resource extraction guide animation of a user virtual character or a specific virtual character corresponding to the user;
correspondingly, after the receiving the image drawn in the drawing area by the user, the method further comprises the following steps:
and determining the expression and/or action of the user virtual character or the specific virtual character based on the drawn image, and displaying the user virtual character or the specific virtual character based on the determined expression and/or action.
9. The method of any of claims 1-8, wherein the drawn image comprises an image associated with a virtual character, or an image associated with a virtual prop;
the target game resource includes a card associated with the virtual character or the virtual item.
10. The method of claim 9, wherein the image associated with the virtual character comprises at least one of a character image, a face representation, an expression representation, a motion representation, and an accessory representation of the virtual character;
the cards associated with the virtual character comprise an image display card of the virtual character;
the cards associated with the virtual items include style display cards of the virtual items.
11. An interactive apparatus, comprising:
the resource extraction scene display module is used for responding to a game resource extraction request of a user and displaying a game resource extraction scene; the game resource extraction scene comprises a drawing area;
the drawing image receiving module is used for receiving the image drawn in the drawing area by the user;
and the target game resource acquisition module is used for acquiring the target game resource matched with the image based on the drawn image, taking the target game resource as the game resource extraction result of the user, and displaying the target game resource.
12. An electronic device, comprising a memory and a processor, wherein the memory has stored therein a computer program that, when executed by the processor, causes the electronic device to implement the interaction method of any one of claims 1-10.
13. A computer-readable storage medium, in which a computer program is stored which, when executed by a computing device, causes the computing device to carry out the interaction method according to any one of claims 1 to 10.
CN202110152534.6A 2021-02-03 2021-02-03 Interaction method, interaction device, electronic equipment and storage medium Pending CN112827171A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110152534.6A CN112827171A (en) 2021-02-03 2021-02-03 Interaction method, interaction device, electronic equipment and storage medium
PCT/CN2022/071705 WO2022166551A1 (en) 2021-02-03 2022-01-13 Interaction method and apparatus, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110152534.6A CN112827171A (en) 2021-02-03 2021-02-03 Interaction method, interaction device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112827171A (en) 2021-05-25

Family

ID=75931851

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110152534.6A Pending CN112827171A (en) 2021-02-03 2021-02-03 Interaction method, interaction device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112827171A (en)
WO (1) WO2022166551A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113304475A (en) * 2021-06-25 2021-08-27 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium
WO2022166551A1 (en) * 2021-02-03 2022-08-11 北京字跳网络技术有限公司 Interaction method and apparatus, electronic device and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117273816A (en) * 2022-09-21 2023-12-22 支付宝(杭州)信息技术有限公司 Resource lottery processing method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109389660A (en) * 2018-09-28 2019-02-26 百度在线网络技术(北京)有限公司 Image generating method and device
CN110393917A (en) * 2019-08-26 2019-11-01 网易(杭州)网络有限公司 A kind of pumping card method and device in game
CN110502181A (en) * 2019-08-26 2019-11-26 网易(杭州)网络有限公司 Pumping card probability determination method, device, equipment and medium in game
CN111389017A (en) * 2020-04-14 2020-07-10 网易(杭州)网络有限公司 Interactive control method and device in game, electronic equipment and computer medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9463389B1 (en) * 2012-10-05 2016-10-11 Zynga Inc. Methods and systems relating to obtaining game asset value
CN109529325B (en) * 2018-11-27 2021-11-19 杭州勺子网络科技有限公司 Reward distribution method, device, game management server and readable storage medium
CN112827171A (en) * 2021-02-03 2021-05-25 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109389660A (en) * 2018-09-28 2019-02-26 百度在线网络技术(北京)有限公司 Image generating method and device
CN110393917A (en) * 2019-08-26 2019-11-01 网易(杭州)网络有限公司 A kind of pumping card method and device in game
CN110502181A (en) * 2019-08-26 2019-11-26 网易(杭州)网络有限公司 Pumping card probability determination method, device, equipment and medium in game
CN111389017A (en) * 2020-04-14 2020-07-10 网易(杭州)网络有限公司 Interactive control method and device in game, electronic equipment and computer medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
净水出榴莲: "巴基剧情+抽卡_bilibili", 《HTTPS://WWW.BILIBILI.COM/VIDEO/BV1AD4Y1U7DW/?SPM_ID_FROM=333.337.SEARCH-CARD.ALL.CLICK》 *
流星雨佑刘阳: "海贼王新手游单抽概率", 《HTTPS://WWW.BILIBILI.COM/VIDEO/BV1F54Y1R7I6/?SPM_ID_FROM=333.788.RECOMMEND_MORE_VIDEO.0》 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022166551A1 (en) * 2021-02-03 2022-08-11 北京字跳网络技术有限公司 Interaction method and apparatus, electronic device and storage medium
CN113304475A (en) * 2021-06-25 2021-08-27 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium
CN113304475B (en) * 2021-06-25 2023-09-22 北京字跳网络技术有限公司 Interaction method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2022166551A1 (en) 2022-08-11

Similar Documents

Publication Publication Date Title
US11893230B2 (en) Semantic zoom animations
US20190095040A1 (en) Electronic device operating according to pressure state of touch input and method thereof
WO2021232930A1 (en) Application screen splitting method and apparatus, storage medium and electric device
CN112827171A (en) Interaction method, interaction device, electronic equipment and storage medium
AU2011376310B2 (en) Programming interface for semantic zoom
US20140089824A1 (en) Systems And Methods For Dynamically Altering A User Interface Based On User Interface Actions
US20130067398A1 (en) Semantic Zoom
CA2847177A1 (en) Semantic zoom gestures
KR20130127349A (en) System and control method for character make-up
CN108984707B (en) Method, device, terminal equipment and storage medium for sharing personal information
US10416868B2 (en) Method and system for character insertion in a character string
CN112114734B (en) Online document display method, device, terminal and storage medium
US20170052701A1 (en) Dynamic virtual keyboard graphical user interface
CN109388309B (en) Menu display method, device, terminal and storage medium
CN105700727A (en) Interacting With Application layer Beneath Transparent Layer
EP2965181A1 (en) Enhanced canvas environments
US9710124B2 (en) Augmenting user interface elements based on timing information
CN112843723A (en) Interaction method, interaction device, electronic equipment and storage medium
CN107562324A (en) The method and terminal of data display control
CN111522610A (en) Information display method, device and equipment
Yang et al. Around-device finger input on commodity smartwatches with learning guidance through discoverability
KR20150093045A (en) Sketch Retrieval system, user equipment, service equipment and service method based on meteorological phenomena information and computer readable medium having computer program recorded therefor
CN104007886A (en) Information processing method and electronic device
CN109754450B (en) Method, device and equipment for generating track
CN115828844A (en) Data processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210525