CN111135579A - Game software interaction method and device, terminal equipment and storage medium - Google Patents

Game software interaction method and device, terminal equipment and storage medium

Info

Publication number: CN111135579A (application CN201911361191.3A)
Authority: CN (China)
Prior art keywords: target, character model, interaction event, interaction, target character
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 戴兴明
Current assignee: Mihoyo Technology Shanghai Co ltd
Original assignee: Mihoyo Technology Shanghai Co ltd
Priority / filing date: 2019-12-25
Publication date: 2020-05-12
Application filed by Mihoyo Technology Shanghai Co ltd
Priority to CN201911361191.3A
Publication of CN111135579A

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention disclose a game software interaction method and device, a terminal device and a storage medium. The method includes: creating a target character model on a preset game interface of the game software; obtaining a target interaction event corresponding to the target character model, and creating an interaction trigger object according to the target interaction event; and, when a trigger operation input by a user based on the interaction trigger object is received, controlling the target character model to execute a feedback operation corresponding to the target interaction event. By configuring a target interaction event for the target character model, the embodiments of the invention address the stiffness of conventional game character models: the user can interact with the character model in the game software, the flexibility of the character model is improved, and the game becomes more interesting.

Description

Game software interaction method and device, terminal equipment and storage medium
Technical Field
Embodiments of the invention relate to the field of software technology, and in particular to a game software interaction method and device, a terminal device and a storage medium.
Background
Human-computer interaction is the study of the interaction between a system and its users; the system may be any of various machines, or a computerized system and its software. At present, a common form of entertainment that realizes human-computer interaction is a user running game software on various terminal devices.
During the development of game software, a number of scene character models are involved in the game. The character images in these scene character models are varied: they may be character images of non-player character models, of player character models, or other character images. However, a scene character model usually does not have its own behavior pattern; it may be used to beautify and decorate a scene before the user enters a level, or serve as an example character model when the user selects a player character.
In the prior art, when a scene character model is constructed, it is usually controlled by artificial intelligence, but interaction behaviors such as battle or dialogue with the user cannot be realized. Although some scene character models can feed back a user's click trigger, the existing feedback operations are of a single type, the character models appear too stiff, and the interaction between character model and user is not flexible enough.
Disclosure of Invention
The embodiments of the invention provide a game software interaction method and device, a terminal device and a storage medium, so that a user can interact with a character model in the game software, the flexibility of the character model is improved, and the game becomes more interesting.
In a first aspect, an embodiment of the present invention provides an interaction method for game software, where the method includes:
creating a target character model on a preset game interface of game software;
acquiring a target interaction event corresponding to the target character model, and creating an interaction trigger object according to the target interaction event;
and when receiving a trigger operation input by a user based on the interaction trigger object, controlling the target character model to execute a feedback operation corresponding to the target interaction event.
In a second aspect, an embodiment of the present invention further provides an interaction device for game software, where the device includes:
the target character model creating module is used for creating a target character model on a preset game interface of game software;
the target interaction event determining module is used for acquiring a target interaction event corresponding to the target character model and creating an interaction triggering object according to the target interaction event;
and the feedback operation control module is used for controlling the target character model to execute the feedback operation corresponding to the target interaction event when receiving the trigger operation input by the user based on the interaction trigger object.
In a third aspect, an embodiment of the present invention further provides a terminal device, where the terminal device includes:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement any of the game software interaction methods referred to above.
In a fourth aspect, an embodiment of the present invention further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to execute the game software interaction method described in any of the above.
By configuring a target interaction event for the target character model, the embodiments of the invention address the stiffness of conventional game character models: the user can interact with the character model in the game software, the flexibility of the character model is improved, and the game becomes more interesting.
Drawings
Fig. 1 is a flowchart of an interaction method of game software according to an embodiment of the present invention.
Fig. 2a is a schematic view of a preset game interface before an interactive trigger object is triggered according to an embodiment of the present invention.
Fig. 2b is a schematic diagram of a preset game interface triggered by an interactive trigger object according to an embodiment of the present invention.
Fig. 2c is a schematic diagram of a preset game interface before triggering of another interactive trigger object according to an embodiment of the present invention.
Fig. 3 is a flowchart of an interaction method of game software according to a second embodiment of the present invention.
Fig. 4 is a flowchart of a specific example of an interaction method of game software according to a second embodiment of the present invention.
Fig. 5 is a schematic diagram of an interaction device of game software according to a third embodiment of the present invention.
Fig. 6 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of an interaction method of game software according to an embodiment of the present invention, where the method is applicable to a case where a character model in the game software performs human-computer interaction, and the method may be executed by an interaction device of the game software, where the device may be implemented in software and/or hardware, and the device may be configured in a terminal device. The method specifically comprises the following steps:
and S110, creating a target character model on a preset game interface of the game software.
Game software is generally a software product that combines various programs with animation effects and can be installed and used on a terminal device, for example a smartphone, an iPad, a computer, or various game devices.
The game interface refers to the user interface in the game software. A game interface usually includes game design elements that are in direct or indirect contact with the user, such as buttons, animations, text, sounds and windows, so as to realize the interactive function between the game software and the user. In one embodiment, the preset game interface optionally includes at least one of a scene environment interface, a scenario interface and a function interface. The scene environment interface is an interface that shows the user a specific environmental factor; for example, the specific environmental factor may be a historical, scientific or economic environmental factor. The scenario interface is an interface related to the design background of the game software; illustratively, the design background may be martial arts, combat or puzzle level-clearing, and the scenario interface can advance the development of the plot in the game software. The function interface is an interface on which the user can perform a setting operation; illustratively, it may be a game start interface or a level start interface, i.e. an interface on which the user can choose whether to start the game. Of course, the function interface may also be a character creation interface, i.e. an interface on which the user can create a character image, for example by selecting a system preset character model and creating a personalized character image, where the character image includes, but is not limited to, clothes, makeup, facial features and equipment. The above are merely examples of the function interface, which is not limited here.
The character model may be, for example, a system default character model or a character model generated by user creation. A system default character model is a character model created by the developers when designing the game software, such as a non-player character model; a character model generated by user creation may be, for example, a player character model.
In one embodiment, the target character model may optionally include a system default character model corresponding to a predetermined game interface, a user-created generated character model, and/or a user-selected character model.
In one embodiment, optionally, at least one system default character model and/or user-created character model corresponding to the preset game interface is obtained, and the target character model is determined from the obtained character models according to a preset selection rule. The preset selection rule may be, for example, random selection.
In one embodiment, optionally, a preset character model is created on the preset game interface of the game software; when a user character model selection instruction input by the user is received, the selected user character model is taken as the target character model and the target character model is created on the preset game interface of the game software. The user character model may be a system default character model and/or a character model generated by user creation.
Further, in an embodiment, optionally, the number of times each user character model has been selected by users is counted, and the preset character model is determined according to the counted numbers; the preset character model is then created on the preset game interface of the game software.
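As an illustrative, non-limiting sketch in Python of the counting-based selection described above (the function and variable names below are hypothetical and not taken from the disclosure), assuming a simple log of past user selections is available:
    from collections import Counter

    def choose_preset_model(selection_log, default="npc_default"):
        """Pick the character model users have selected most often as the preset model.

        selection_log: iterable of model identifiers, one entry per user selection.
        default: hypothetical fallback when no selections have been recorded yet.
        """
        counts = Counter(selection_log)
        if not counts:
            return default
        # most_common(1) returns [(model_id, count)] for the top entry
        return counts.most_common(1)[0][0]

    # "model_a" was selected three times, so it becomes the preset character model
    print(choose_preset_model(["model_a", "model_b", "model_a", "model_a"]))  # -> "model_a"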
And S120, acquiring a target interaction event corresponding to the target character model, and creating an interaction trigger object according to the target interaction event.
Here, an interaction event refers to an event related to the interactive behavior of a character model. Illustratively, the interaction event may be making a phone call or talking about the weather, or may prompt the user to perform a related operation.
In an embodiment, optionally, a preset interaction event set corresponding to the target character model is obtained, and the target interaction event is determined according to a preset selection condition and the preset interaction event set. Illustratively, the preset selection condition includes random selection. For example, assume character model A and character model B, and interaction events 1 to 4. The preset interaction event set A corresponding to character model A includes interaction event 1, interaction event 2 and interaction event 3, and the preset interaction event set B corresponding to character model B includes interaction event 4. When character model A is the target character model, interaction event 2 may be randomly selected from preset interaction event set A as the target interaction event.
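A minimal sketch of this random selection, mirroring the character model A / character model B example; the dictionary and function names are illustrative only and not part of the disclosed embodiment:
    import random

    # Hypothetical mapping from character models to their preset interaction event sets
    PRESET_INTERACTION_EVENTS = {
        "character_model_A": ["interaction_event_1", "interaction_event_2", "interaction_event_3"],
        "character_model_B": ["interaction_event_4"],
    }

    def pick_target_interaction_event(target_character_model, rng=random):
        """Return a target interaction event using random selection as the preset selection condition."""
        event_set = PRESET_INTERACTION_EVENTS.get(target_character_model, [])
        if not event_set:
            return None  # no preset interaction events configured for this model
        return rng.choice(event_set)

    print(pick_target_interaction_event("character_model_A"))  # e.g. "interaction_event_2"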
The interaction trigger object is an object on which the user can perform a trigger operation in the preset game interface; it may be, for example, a button, a preset object on the target character model, or a trigger area in the preset game interface. For example, the preset objects on the target character model may be the eyes, arms, clothes, and the like. The trigger area may be any area in the preset game scene; its position and size are not limited here.
In one embodiment, optionally, according to the type and/or content of the target interaction event, determining the attribute of the interaction trigger object, and creating the interaction trigger object according to the attribute, wherein the attribute of the interaction trigger object comprises at least one of shape, color and display position.
In one embodiment, optionally, the types of target interaction events include common interaction events and characteristic interaction events. The content of a common interaction event may be, for example, making a phone call or talking about the weather or the character's mood. The content of a characteristic interaction event may be a birthday blessing or a time-related interaction event, where the content of a time-related interaction event may be, for example, a morning greeting, a noon greeting or an evening greeting.
In one embodiment, optionally, the color of the interaction trigger object is determined according to the type of the target interaction event, and the shape and display position of the interaction trigger object are determined according to the content of the target interaction event. Illustratively, assume that the target interaction event is a birthday blessing, which belongs to the characteristic interaction events. Since the target interaction event is a characteristic interaction event, the color of the interaction trigger object may be red; since its content is a birthday blessing, the shape of the interaction trigger object may be a cake, and the display position may be the center of the preset game interface.
It should be noted that the above manner of creating the interaction trigger object according to the target interaction event is only exemplary and is not limiting.
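The following sketch illustrates one possible way to derive the trigger object attributes from the event type and content as described above; the concrete attribute values, class name and function name are hypothetical assumptions, not mandated by the embodiment:
    from dataclasses import dataclass

    @dataclass
    class TriggerObjectAttributes:
        shape: str
        color: str
        display_position: str

    def build_trigger_object(event_type: str, event_content: str) -> TriggerObjectAttributes:
        """Derive trigger-object attributes: color follows the event type,
        shape and display position follow the event content."""
        color = "red" if event_type == "characteristic" else "white"
        if event_content == "birthday_blessing":
            shape, position = "cake", "interface_center"
        else:
            shape, position = "bubble", "above_character_head"
        return TriggerObjectAttributes(shape=shape, color=color, display_position=position)

    print(build_trigger_object("characteristic", "birthday_blessing"))
    # TriggerObjectAttributes(shape='cake', color='red', display_position='interface_center')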
And S130, when a trigger operation input by the user based on the interaction trigger object is received, controlling the target character model to execute a feedback operation corresponding to the target interaction event.
In one embodiment, optionally, the target character model is controlled to perform a feedback operation corresponding to the target interaction event, including at least one of:
calling an expression module to display expression information of the target character model in the target interaction event;
calling a sound module to play sound information of the target character model in the target interaction event;
calling an action module to display action information of the target character model in the target interaction event;
calling a special effect module to show the special effect expression of the target character model in the target interaction event;
and calling a text module to display the text information of the target character model in the target interaction event.
For example, assuming that the target interaction event is a birthday blessing, the expression information of the target character model may be a happy expression; the sound information may be "Today is your birthday; wishing you a happy birthday.", or it may be a happy-birthday song; correspondingly, the text information may be "Today is your birthday; wishing you a happy birthday.", or the lyrics of a happy-birthday song; the action information may be a wish-making action; and the special effect expression may be dancing ribbons.
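As a non-limiting sketch of how such feedback could be dispatched across the five modules, assuming stand-in module functions (the disclosure names the expression, sound, action, special-effect and text modules only abstractly, so every identifier below is illustrative):
    def show_expression(info): print(f"[expression] {info}")
    def play_sound(info): print(f"[sound] {info}")
    def show_action(info): print(f"[action] {info}")
    def show_special_effect(info): print(f"[special effect] {info}")
    def show_text(info): print(f"[text] {info}")

    FEEDBACK_MODULES = {
        "expression": show_expression,
        "sound": play_sound,
        "action": show_action,
        "special_effect": show_special_effect,
        "text": show_text,
    }

    def perform_feedback(feedback: dict) -> None:
        """Call each configured module with the target character model's
        information for the target interaction event (any subset of the five)."""
        for channel, info in feedback.items():
            module = FEEDBACK_MODULES.get(channel)
            if module is not None:
                module(info)

    # Birthday-blessing example from the paragraph above
    perform_feedback({
        "expression": "happy",
        "sound": "Today is your birthday; wishing you a happy birthday.",
        "action": "make a wish",
        "special_effect": "dancing ribbons",
        "text": "Today is your birthday; wishing you a happy birthday.",
    })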
On the basis of the foregoing embodiment, optionally, after receiving the trigger operation input by the user based on the interaction trigger object, the method further includes: hiding the interaction trigger object. Specifically, hiding the interaction trigger object means no longer displaying it on the preset game interface. The advantage of this arrangement is that if the interaction trigger object remained displayed while the target character model performs the feedback operation corresponding to the target interaction event, it might occlude the feedback and spoil the appearance of the preset game interface; hiding the interaction trigger object therefore improves the user's interaction experience.
Fig. 2a is a schematic view of a preset game interface before an interaction trigger object is triggered according to an embodiment of the present invention. Fig. 2a shows a bridge scene environment interface in which a target character model has been created; the cloud-shaped bubble at the upper right of the target character model's head is the interaction trigger object. Fig. 2b is a schematic view of the preset game interface after the interaction trigger object has been triggered according to an embodiment of the present invention; it shows the expression information, action information and text information of the target character model. The feedback operation corresponding to the target interaction event shown in Fig. 2b does not include a special effect expression, and it will be appreciated that the sound information cannot be depicted in Fig. 2b. Fig. 2c is a schematic view of a preset game interface before another interaction trigger object is triggered according to an embodiment of the present invention. The target interaction event in Fig. 2c differs from that in Fig. 2a; this is reflected in the fact that the interaction trigger object in Fig. 2c has the same shape and creation position as the one in Fig. 2a but a different gray value.
According to the technical scheme, the problem that the existing game character model is stiff is solved by configuring the target interaction event for the target character model, so that the user can interact with the character model in the game software, the flexibility of the character model is improved, and the interest of the game is increased.
Example two
Fig. 3 is a flowchart of an interaction method of game software according to a second embodiment of the present invention, and the technical solution of the present embodiment is further detailed based on the above-mentioned embodiment. Optionally, the obtaining of the target interaction event corresponding to the target character model includes: and acquiring running information of the game software, and determining a target interaction event corresponding to the target character model according to the running information, wherein the running information comprises at least one of a running scene, game virtual time and user trigger time.
The specific implementation steps of this embodiment include:
and S210, creating a target character model on a preset game interface of the game software.
S220, obtaining running information of the game software, and determining a target interaction event corresponding to the target character model according to the running information, wherein the running information comprises at least one of a running scene, game virtual time and user triggering time.
In one embodiment, optionally, the running scene includes at least one of a scene object and scene information in the preset game interface. Illustratively, the scene object may be a building, a river, a prop or a tree, among others; the scene information may be a battle scene, a rural scene or a backroom scene, among others. The running scene is not limited here and may specifically be set according to the type of the game software.
The game virtual time is, for example, a time preset by the game software. The user trigger time includes, but is not limited to, the user's login time and the time at which the preset game interface is invoked. For example, if the user invokes the preset game interface by a trigger operation at 9:00 a.m., the virtual time in the preset game interface may nevertheless be 9:00 p.m.
In one embodiment, optionally, it is determined whether the user trigger time falls within a preset user time; if so, the preset interaction event corresponding to the preset user time is taken as the target interaction event. For example, the preset user time may be the user's birthday or an anniversary; specifically, the anniversary may be an anniversary of the game software or of the user's registration in the game software.
In one embodiment, optionally, the target interaction event corresponding to the target character model is determined according to the running information and a preset priority. For example, the preset priority from high to low may be: preset user time, running scene, game virtual time. Assume, for example, that the preset user time includes the user's birthday, the running scene is a battle scene, and the game virtual time is the night of the Lantern Festival in 260 BC. If the user trigger time falls within the preset user time, the target interaction event is the one corresponding to the birthday; if it does not, the target interaction event is the one corresponding to the running scene.
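A minimal sketch of the priority rule described above (preset user time, then running scene, then game virtual time); the field names and returned event identifiers are illustrative assumptions rather than part of the disclosure:
    import datetime

    def determine_target_interaction_event(running_info: dict, preset_user_dates: set) -> str:
        """Resolve the target interaction event from running information using the
        preset priority: preset user time > running scene > game virtual time."""
        trigger_date = running_info["user_trigger_time"].date()
        if (trigger_date.month, trigger_date.day) in preset_user_dates:
            return "birthday_blessing"                               # highest priority
        if running_info.get("running_scene"):
            return f"scene_event:{running_info['running_scene']}"    # e.g. battle scene
        return f"time_event:{running_info['game_virtual_time']}"     # lowest priority

    running_info = {
        "user_trigger_time": datetime.datetime(2020, 5, 12, 9, 0),
        "running_scene": "battle",
        "game_virtual_time": "lantern_festival_night",
    }
    # The user's birthday is May 12 in this example, so the birthday event wins.
    print(determine_target_interaction_event(running_info, {(5, 12)}))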
In one embodiment, optionally, after detecting that the target character model completes the preset operation, activating the interaction event module; and acquiring a target interaction event which is distributed by the interaction event module and corresponds to the target character model.
When the target character model is created, it may initially be displayed to the user on the preset game interface in any of a variety of behavior states; for example, the initial behavior state may be standing, squatting, turned sideways or turned away, among others. The preset operation refers to an operation performed by the target character model before the feedback operation is performed. In one embodiment, optionally, the target character model is controlled, according to a system default operation instruction and/or a user operation instruction input by the user, to complete the preset operation corresponding to that instruction. Illustratively, the preset operation may be standing up, bowing or turning around.
The interaction event module is used for distributing a target interaction event for the target character model, and the interaction event module comprises at least one character model and at least one preset interaction event corresponding to each character model. In one embodiment, optionally, the interaction event module determines the target interaction event corresponding to the target character model according to a preset distribution rule. For example, the preset allocation rule may be random allocation, or may be the technical solution described in this embodiment.
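An illustrative sketch of such an interaction event module, assuming random allocation as the preset distribution rule; the class and method names are hypothetical and not taken from the disclosure:
    import random
    from typing import Dict, List, Optional

    class InteractionEventModule:
        """Holds at least one preset interaction event per character model and
        distributes a target interaction event once it has been activated."""

        def __init__(self, preset_events: Dict[str, List[str]]):
            self.preset_events = preset_events
            self.active = False

        def activate(self) -> None:
            # Called once the target character model has completed its preset operation
            self.active = True

        def allocate(self, character_model: str) -> Optional[str]:
            """Distribute a target interaction event according to the preset
            distribution rule (random allocation in this sketch)."""
            if not self.active:
                return None
            events = self.preset_events.get(character_model, [])
            return random.choice(events) if events else None

    module = InteractionEventModule({"target_character_model": ["weather_talk", "phone_call"]})
    module.activate()  # e.g. after the model finishes turning around
    print(module.allocate("target_character_model"))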
And S230, creating an interaction trigger object according to the target interaction event.
And S240, when a trigger operation input by the user based on the interaction trigger object is received, controlling the target character model to execute a feedback operation corresponding to the target interaction event.
In one embodiment, optionally, after controlling the target character model to perform the feedback operation corresponding to the target interaction event, the method further includes: and acquiring another target interaction event corresponding to the target character model, and creating an interaction triggering object according to the current target interaction event.
Fig. 4 is a flowchart of a specific example of a game software interaction method according to the second embodiment of the present invention. A target character model is created on the bridge game interface, and the interaction event module is activated after the target character model is detected to have finished turning around. A target interaction event distributed by the interaction event module is obtained, and an interaction trigger object, which may be a bubble, is created according to the target interaction event. After the trigger operation of the user clicking the bubble is received, the target character model is controlled to execute the feedback operation corresponding to the target interaction event. After the feedback operation is executed, another target interaction event distributed by the interaction event module is obtained, and an interaction trigger object is created according to this new target interaction event.
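The overall flow of Fig. 4 can be sketched as the following loop; every identifier below is illustrative, and a real game would wait for an actual click event on the bubble rather than printing:
    import random

    def run_interaction_loop(character_model: str, rounds: int = 2) -> None:
        """Sketch of the Fig. 4 flow: preset operation -> event allocation ->
        trigger object -> user trigger -> feedback -> next event."""
        print(f"{character_model} finishes its preset operation (e.g. turning around)")
        for _ in range(rounds):
            # The interaction event module distributes a target interaction event
            event = random.choice(["weather_talk", "phone_call", "birthday_blessing"])
            print(f"create interaction trigger object (bubble) for {event}")
            # Placeholder for waiting on the player's click on the bubble
            print("user clicks the bubble -> hide the bubble")
            print(f"{character_model} performs the feedback operation for {event}")

    run_interaction_loop("bridge_scene_character")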
According to the technical scheme of the embodiment, the target interaction event is determined through the running information of the game software, the problem that the existing character model of the game is stiff is solved, and the diversity and the real-time performance of the feedback operation of the character model are improved, so that the flexibility of the character model is further improved, and the interestingness of the game is increased.
EXAMPLE III
Fig. 5 is a schematic diagram of an interaction device of game software according to a third embodiment of the present invention. The embodiment can be suitable for the situation that the character model in the game software carries out human-computer interaction, the device can be realized in a software and/or hardware mode, and the device can be configured on the terminal equipment. The interaction means of the game software includes a target character model creation module 310, a target interaction event determination module 320, and a feedback operation control module 330.
The target character model creating module 310 is configured to create a target character model on a preset game interface of the game software;
a target interaction event determining module 320, configured to obtain a target interaction event corresponding to the target character model, and create an interaction trigger object according to the target interaction event;
and the feedback operation control module 330 is configured to control the target character model to perform a feedback operation corresponding to the target interaction event when a trigger operation input by the user based on the interaction trigger object is received.
According to the technical scheme, the problem that the existing game character model is stiff is solved by configuring the target interaction event for the target character model, so that the user can interact with the character model in the game software, the flexibility of the character model is improved, and the interest of the game is increased.
On the basis of the foregoing technical solution, optionally, the target interaction event determining module 320 includes:
the interactive event activating unit is used for activating the interactive event module after detecting that the target character model completes the preset operation;
and the target interaction event acquisition first unit is used for acquiring a target interaction event which is distributed by the interaction event module and corresponds to the target character model.
Optionally, the target interaction event determining module 320 includes:
and the target interaction event acquisition second unit is used for acquiring the running information of the game software and determining the target interaction event corresponding to the target character model according to the running information, wherein the running information comprises at least one of a running scene, game virtual time and user trigger time.
Optionally, the target interaction event obtaining second unit is specifically configured to:
determining whether the user trigger time is in a preset user time; and if so, taking the preset interaction event corresponding to the preset user time as the target interaction event.
Optionally, the target interaction event determining module 320 is specifically configured to:
determining the attribute of the interaction trigger object according to the type and/or the content of the target interaction event, and creating the interaction trigger object according to the attribute, wherein the attribute of the interaction trigger object comprises at least one of shape, color and display position.
Optionally, the apparatus further comprises:
and the interactive trigger object hiding module is used for hiding the interactive trigger object.
Controlling the target character model to perform feedback operations corresponding to the target interaction event, including at least one of:
calling an expression module to display expression information of the target character model in the target interaction event;
calling a sound module to play sound information of the target character model in the target interaction event;
calling an action module to display action information of the target character model in the target interaction event;
calling a special effect module to show the special effect expression of the target character model in the target interaction event;
and calling a text module to display the text information of the target character model in the target interaction event.
The game software interaction device provided by the embodiment of the invention can be used for executing the game software interaction method provided by the embodiment of the invention, and has the corresponding functions and beneficial effects of the execution method.
It should be noted that, in the embodiment of the interaction device of the game software, the units and modules included in the embodiment are only divided according to the functional logic, but are not limited to the above division, as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
Example four
Fig. 6 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present invention, where the fourth embodiment of the present invention provides a service for implementing the game software interaction method according to the foregoing embodiment of the present invention, and may configure an interaction apparatus for game software in the foregoing embodiment. Fig. 6 illustrates a block diagram of an exemplary terminal device 12 suitable for use in implementing embodiments of the present invention. The terminal device 12 shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiment of the present invention.
As shown in fig. 6, terminal device 12 is in the form of a general purpose computing device. The components of terminal device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Terminal device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by terminal device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache memory 32. Terminal device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 6, and commonly referred to as a "hard drive"). Although not shown in FIG. 6, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Terminal device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with terminal device 12, and/or with any devices (e.g., network card, modem, etc.) that enable terminal device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, terminal device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via network adapter 20. As shown in fig. 6, the network adapter 20 communicates with the other modules of the terminal device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with terminal device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing, such as implementing an interactive method of game software provided by an embodiment of the present invention, by running a program stored in the system memory 28.
Through the terminal equipment, the problem that the existing game character model is stiff is solved, so that a user can interact with the character model in game software, the flexibility of the character model is improved, and the interest of a game is increased.
EXAMPLE five
An embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform an interaction method for game software, the method including:
creating a target character model on a preset game interface of game software;
acquiring a target interaction event corresponding to the target character model, and creating an interaction trigger object according to the target interaction event;
and when receiving a trigger operation input by a user based on the interaction trigger object, controlling the target character model to execute a feedback operation corresponding to the target interaction event.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
Of course, the storage medium provided by the embodiment of the present invention contains computer-executable instructions, and the computer-executable instructions are not limited to the above method operations, and may also perform related operations in the interaction method of the game software provided by any embodiment of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An interactive method of game software, comprising:
creating a target character model on a preset game interface of game software;
acquiring a target interaction event corresponding to the target character model, and creating an interaction trigger object according to the target interaction event;
and when receiving a trigger operation input by a user based on the interaction trigger object, controlling the target character model to execute a feedback operation corresponding to the target interaction event.
2. The method of claim 1, wherein obtaining the target interaction event corresponding to the target character model comprises:
when the target character model is detected to finish the preset operation, activating an interaction event module;
and acquiring a target interaction event which is distributed by the interaction event module and corresponds to the target character model.
3. The method of claim 1, wherein obtaining the target interaction event corresponding to the target character model comprises:
and acquiring running information of the game software, and determining a target interaction event corresponding to the target character model according to the running information, wherein the running information comprises at least one of a running scene, game virtual time and user trigger time.
4. The method of claim 3, wherein determining the target interaction event corresponding to the target character model based on the operational information comprises:
determining whether the user trigger time is in a preset user time;
and if so, taking a preset interaction event corresponding to the preset user time as the target interaction event.
5. The method of claim 1, wherein creating an interaction triggering object according to the target interaction event comprises:
determining the attribute of the interaction trigger object according to the type and/or content of the target interaction event, and creating the interaction trigger object according to the attribute, wherein the attribute of the interaction trigger object comprises at least one of shape, color and display position.
6. The method of claim 1, after receiving a trigger operation input by a user based on the interactive trigger object, further comprising:
hiding the interaction trigger object.
7. The method of claim 1, wherein the controlling the target character model to perform feedback operations corresponding to the target interaction event comprises at least one of:
calling an expression module to display expression information of the target character model in the target interaction event;
calling a sound module to play sound information of the target character model in the target interaction event;
calling an action module to display action information of the target character model in the target interaction event;
calling a special effect module to show the special effect expression of the target character model in the target interaction event;
and calling a text module to display the text information of the target character model in the target interaction event.
8. An interactive device for game software, comprising:
the target character model creating module is used for creating a target character model on a preset game interface of game software;
the target interaction event determining module is used for acquiring a target interaction event corresponding to the target character model and creating an interaction triggering object according to the target interaction event;
and the feedback operation control module is used for controlling the target character model to execute the feedback operation corresponding to the target interaction event when receiving the trigger operation input by the user based on the interaction trigger object.
9. A terminal device, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the game software interaction method according to any one of claims 1-7.
10. A storage medium containing computer-executable instructions for performing the interaction method of the game software according to any one of claims 1 to 7 when executed by a computer processor.
CN201911361191.3A 2019-12-25 2019-12-25 Game software interaction method and device, terminal equipment and storage medium Pending CN111135579A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911361191.3A CN111135579A (en) 2019-12-25 2019-12-25 Game software interaction method and device, terminal equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111135579A true CN111135579A (en) 2020-05-12

Family

ID=70520250

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911361191.3A Pending CN111135579A (en) 2019-12-25 2019-12-25 Game software interaction method and device, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111135579A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103116463A (en) * 2013-01-31 2013-05-22 广东欧珀移动通信有限公司 Interface control method of personal digital assistant applications and mobile terminal
US20180311582A1 (en) * 2017-04-28 2018-11-01 PlayFusion Limited User interface control cluster for enhancing a gaming experience
CN108984087A (en) * 2017-06-02 2018-12-11 腾讯科技(深圳)有限公司 Social interaction method and device based on three-dimensional avatars
CN107281752A (en) * 2017-06-16 2017-10-24 苏州蜗牛数字科技股份有限公司 It is a kind of that the method that intelligent virtual is conducted a sightseeing tour is set in VR game
CN108465238A (en) * 2018-02-12 2018-08-31 网易(杭州)网络有限公司 Information processing method, electronic equipment in game and storage medium
CN110102053A (en) * 2019-05-13 2019-08-09 腾讯科技(深圳)有限公司 Virtual image display methods, device, terminal and storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111930229A (en) * 2020-07-22 2020-11-13 北京字节跳动网络技术有限公司 Man-machine interaction method and device and electronic equipment
CN111930229B (en) * 2020-07-22 2021-09-03 北京字节跳动网络技术有限公司 Man-machine interaction method and device and electronic equipment
CN112221125A (en) * 2020-10-26 2021-01-15 网易(杭州)网络有限公司 Game interaction method and device, electronic equipment and storage medium
WO2022156367A1 (en) * 2021-01-21 2022-07-28 北京字跳网络技术有限公司 Data generation control method and apparatus, and electronic device and storage medium
CN113742807A (en) * 2021-09-07 2021-12-03 广联达科技股份有限公司 Interaction processing method and device and electronic equipment
CN113742807B (en) * 2021-09-07 2024-05-14 广联达科技股份有限公司 Interactive processing method and device and electronic equipment
CN114924666A (en) * 2022-05-12 2022-08-19 上海云绅智能科技有限公司 Interaction method and device for application scene, terminal equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111135579A (en) Game software interaction method and device, terminal equipment and storage medium
US10348795B2 (en) Interactive control management for a live interactive video game stream
CN110337025B (en) Interaction control method and device in live webcasting, storage medium and electronic equipment
CN109947388B (en) Page playing and reading control method and device, electronic equipment and storage medium
CN111330272B (en) Virtual object control method, device, terminal and storage medium
US10795554B2 (en) Method of operating terminal for instant messaging service
CN104380256A (en) Method, system and executable piece of code for virtualisation of hardware resource associated with computer system
US20230306694A1 (en) Ranking list information display method and apparatus, and electronic device and storage medium
CN108845840A (en) Management method, device, storage medium and the intelligent terminal of application program sound
KR20210156741A (en) Method and system for providing conversation with artificial intelligence character
CN112306321B (en) Information display method, device and equipment and computer readable storage medium
CN113952720A (en) Game scene rendering method and device, electronic equipment and storage medium
CN112631814B (en) Game scenario dialogue playing method and device, storage medium and electronic equipment
CN110237531A (en) Method, apparatus, terminal and the storage medium of game control
CN114130011A (en) Object selection method, device, storage medium and program product for virtual scene
CN113559520A (en) Interactive control method and device in game, electronic equipment and readable storage medium
CN111309210B (en) Method, device, terminal and storage medium for executing system functions
CN114885199B (en) Real-time interaction method, device, electronic equipment, storage medium and system
US20240091643A1 (en) Method and apparatus for controlling virtual objects in game, and electronic device and storage medium
CN113144606B (en) Skill triggering method of virtual object and related equipment
CN114712853A (en) Game map loading and displaying method, device, equipment and storage medium
CN112435317A (en) Anti-threading method and device in game, electronic equipment and storage medium
CN111652344A (en) Method and apparatus for presenting information
WO2024055865A1 (en) Task platform display method and apparatus, device, and computer program product
CN113599811B (en) Animation role display method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200512)