CN112044053A - Information processing method, device, equipment and storage medium in virtual scene - Google Patents

Information processing method, device, equipment and storage medium in virtual scene

Info

Publication number
CN112044053A
CN112044053A (application CN202010913313.1A; granted as CN112044053B)
Authority
CN
China
Prior art keywords
interactive
interaction
presenting
playing
interface
Prior art date
Legal status
Granted
Application number
CN202010913313.1A
Other languages
Chinese (zh)
Other versions
CN112044053B (en)
Inventor
李一琳 (Li Yilin)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010913313.1A priority Critical patent/CN112044053B/en
Publication of CN112044053A publication Critical patent/CN112044053A/en
Application granted granted Critical
Publication of CN112044053B publication Critical patent/CN112044053B/en
Current legal status: Active (granted)

Classifications

    • A63F 13/426: Video games; processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F 13/25: Video games; output arrangements for video game devices
    • A63F 2300/308: Features of games using an electronically generated display; details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides an information processing method, apparatus, and device in a virtual scene, and a computer-readable storage medium. The method includes: presenting an interactive interface of a virtual scene, and presenting at least one interactive object in the interactive interface; playing a media file while the at least one interactive object is presented; in response to an interactive operation on an interactive object, presenting an interaction animation for that object in the interactive interface; and, while the interaction animation is presented, dynamically adjusting the playing rhythm of the media file so that it corresponds to the interaction rhythm of the interactive operation. In this way, the playing rhythm of the media file can be adjusted dynamically to match the user's interaction rhythm, improving user engagement.

Description

Information processing method, device, equipment and storage medium in virtual scene
Technical Field
The present application relates to information processing technologies, and in particular, to an information processing method, apparatus, and device in a virtual scene, and a computer-readable storage medium.
Background
Display technologies based on graphics processing hardware have expanded the channels for perceiving the environment and acquiring information. In particular, virtual scene display technology enables intelligent interaction between people and virtual objects in a virtual scene according to actual application requirements. A virtual scene is displayed on a device screen, and visual perception effects similar to the real world are achieved by means of stereoscopic display technology; typically, virtual scenes are output using stereoscopic projection, virtual reality, and augmented reality technologies. Games are a typical application of virtual scene display technology: a user runs a game on a device, and in the virtual scene output by the device, a game object controlled by the user fights alongside or against other game objects.
In the related art, when a user interacts with an interactive object in a virtual scene through the scene's interactive interface, the interaction rhythm of the user's interactive operations is decoupled from the playing rhythm of the media file (such as background music) associated with the virtual scene. The user's engagement therefore depends entirely on how interesting the virtual scene itself is, which lowers engagement with the virtual scene.
Disclosure of Invention
The embodiments of the present application provide an information processing method, apparatus, and device in a virtual scene, and a computer-readable storage medium, which can dynamically adjust the playing rhythm of a media file so that it corresponds to the interaction rhythm of the user's interactive operations, thereby improving user engagement.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides an information processing method in a virtual scene, which comprises the following steps:
presenting an interactive interface of a virtual scene, and presenting at least one interactive object in the interactive interface;
playing a media file during the process of presenting the at least one interactive object;
presenting an interaction animation for the interaction object in the interaction interface in response to the interaction operation for the interaction object;
and in the process of presenting the interactive animation, dynamically adjusting the playing rhythm of the media file to ensure that the playing rhythm of the media file corresponds to the interactive rhythm corresponding to the interactive operation.
An embodiment of the present application provides an information processing apparatus in a virtual scene, including:
the first presentation module is used for presenting an interactive interface of a virtual scene and presenting at least one interactive object in the interactive interface;
the background playing module is used for playing the media file in the process of presenting the at least one interactive object;
the second presentation module is used for responding to the interactive operation aiming at the interactive object and presenting the interactive animation aiming at the interactive object in the interactive interface;
and the rhythm adjusting module is used for dynamically adjusting the playing rhythm of the media file in the process of presenting the interactive animation, so that the playing rhythm of the media file corresponds to the interactive rhythm corresponding to the interactive operation.
In the above scheme, the first presentation module is further configured to present, in the interactive interface, a selection interface including at least one object tag;
in response to the selection operation of the object label triggered based on the selection interface, presenting an interactive object matched with the selected object label;
correspondingly, the background playing module is further configured to play the media file matched with the selected object tag.
In the above scheme, the first presenting module is further configured to present, in the interactive interface, at least one object bearing position for bearing the interactive object, where the object bearing position and the interactive object are in a one-to-one correspondence relationship;
and in each object bearing position, presenting the process from the appearance of a partial object to the appearance of a whole object of the corresponding interactive object.
In the foregoing solution, the second presenting module is further configured to: allow the interactive operation to be triggered when the interactive object has fully appeared; and, in response to the interactive operation for the interactive object, present in the interactive interface an animation of the interactive object going from full appearance to disappearance.
In the above solution, the first presentation module is further configured to present, in the interactive interface, the presentation animation of the at least one interactive object in a loop at a first presentation rate;
when the interactive performance generated by the interactive operation reaches a first performance threshold value, circularly presenting the display animation of the at least one interactive object at a second presentation rate;
wherein the second presentation rate is greater than the first presentation rate.
In the above scheme, the first presentation module is further configured to present display animations of a first number of interactive objects simultaneously in the interactive interface;
when the interactive achievement generated by the interactive operation reaches a second achievement threshold value, displaying display animations of a second number of interactive objects in the interactive interface at the same time;
wherein the second number is greater than the first number.
In the above scheme, the apparatus further includes an audio playing module, where the audio playing module is configured to obtain a target audio file when the interactive operation successfully acts on the interactive object;
wherein the target audio file corresponds to the interactive operation which is successfully acted or corresponds to the interactive object which is successfully acted by the interactive operation;
and playing the target audio file in the process of presenting the interactive animation.
In the foregoing solution, the audio playing module is further configured to
And adjusting the playing tone of the target audio file along with the increase of the interaction times of the interaction operation successfully acting on the interaction object, so that the playing tone is matched with the interaction times.
In the foregoing solution, the rhythm adjusting module is further configured to
When the interaction times of the interaction operation aiming at the interaction object exceed a time threshold, acquiring an adjusting coefficient matched with the interaction times;
and adjusting the playing rhythm of the media file based on the adjusting coefficient, and playing the media file according to the adjusted playing rhythm.
In the foregoing solution, the rhythm adjusting module is further configured to
And correspondingly adjusting the playing rhythm of the media file along with the change of the interaction frequency of the interaction operation aiming at the interaction object, so that the playing rhythm of the media file is matched with the interaction frequency.
In the foregoing solution, the rhythm adjusting module is further configured to
When the media file is a background audio file, dynamically adjusting the audio playing rhythm of the background audio file;
and when the media file is a background animation file, dynamically adjusting the animation display rhythm of the background animation file.
In the above scheme, the apparatus further includes a third presenting module, where the third presenting module is configured to:
When the interaction times of the interaction operation which does not successfully act on the interaction object reach a time threshold value, presenting time prompt information in the interaction interface;
the time prompt information is used for prompting the interaction ending time aiming at the virtual scene;
and when the interaction ending time is up, presenting an interaction result interface of the virtual scene, and presenting an interaction result corresponding to the virtual scene in the interaction result interface.
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for realizing the information processing method in the virtual scene provided by the embodiment of the application when the executable instructions stored in the memory are executed.
The embodiment of the present application provides a computer-readable storage medium, which stores executable instructions for causing a processor to execute the method for processing information in a virtual scene provided in the embodiment of the present application.
The embodiment of the application has the following beneficial effects:
in the process of interacting with the interactive object, the playing rhythm of the media file associated with the virtual scene is dynamically adjusted according to the interaction rhythm of the interactive operations, so that the two rhythms correspond. The user can then perceive changes in their own interaction rhythm through changes in the playing rhythm of the media file, which further motivates the user to keep interacting with the interactive object and thus improves user engagement.
Drawings
Fig. 1A is a schematic diagram of an optional application mode of an information processing method in a virtual scene according to an embodiment of the present application;
fig. 1B is a schematic diagram of an optional application mode of an information processing method in a virtual scene according to an embodiment of the present application;
fig. 2 is an alternative structural schematic diagram of an electronic device provided in an embodiment of the present application;
fig. 3 is an optional flowchart of an information processing method in a virtual scene according to an embodiment of the present disclosure;
FIGS. 4A-4C are schematic diagrams of display interfaces provided by embodiments of the present application;
FIGS. 5A-5B are schematic diagrams of display interfaces provided by embodiments of the present application;
fig. 6 is an optional flowchart of an information processing method in a virtual scene according to an embodiment of the present disclosure;
FIG. 7 is a schematic flowchart of audio operation of a web page according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an information processing apparatus in a virtual scene according to an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first", "second", and the like are used only to distinguish similar objects and do not denote a particular order. Where permissible, the specified order may be interchanged so that the embodiments of the present application described herein can be practiced in orders other than those illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, the terms and expressions used in the embodiments are explained as follows.
1) Client: an application running on a terminal to provide various services, such as a video playback client, an instant messaging client, or a live streaming client.
2) "In response to": indicates the condition or state on which a performed operation depends. When the condition or state is satisfied, one or more of the operations may be performed in real time or with a set delay; unless otherwise specified, no restriction is placed on the order in which the operations are performed.
3) Virtual scene: a scene, different from the real world, that is output by a device. Visual perception of the virtual scene can be formed with the naked eye or with the assistance of a device, for example through two-dimensional images output by a display screen, or three-dimensional images output by stereoscopic display technologies such as stereoscopic projection, virtual reality, and augmented reality. In addition, various real-world-like perceptions such as auditory, tactile, olfactory, and motion perception can be formed through various possible hardware.
In an implementation scenario, referring to fig. 1A, fig. 1A is a schematic diagram of an optional application mode of the information processing method in a virtual scene provided in the embodiment of the present application. This mode is applicable to applications in which the calculation of virtual-scene data can be completed entirely by the computing capability of the terminal 200, for example a game in single-player/offline mode, with output of the virtual scene completed by a terminal 200 such as a smartphone, a tablet computer, or a virtual reality/augmented reality device.
When forming the visual perception of the virtual scene, the terminal 200 calculates the required display data through graphics computing hardware, completes the loading, parsing, and rendering of the display data, and outputs on graphics output hardware a picture or video capable of forming visual perception of the virtual scene; for example, a two-dimensional picture or video is displayed on the screen of a smartphone, or a picture or video realizing a three-dimensional display effect is projected onto the lenses of augmented reality/virtual reality glasses. Furthermore, to enrich the perception effect, the device may also form one or more of auditory, tactile, and motion perception by means of different hardware.
As an example, the terminal 200 runs a game application, presents an interactive interface corresponding to a game when the game application runs, and presents at least one interactive object in the interactive interface; playing a media file as background information of a virtual scene during presentation of at least one interactive object; presenting an interactive animation aiming at the interactive object in the interactive interface in response to the interactive operation aiming at the interactive object; and in the process of presenting the interactive animation, dynamically adjusting the playing rhythm of the media file to ensure that the playing rhythm of the media file corresponds to the interactive rhythm corresponding to the interactive operation.
As another example, a virtual drum application is installed on the terminal device. When the virtual drum application runs, the terminal presents an interactive interface corresponding to the virtual drums and presents at least one interactive object in the interactive interface (i.e., a drum that can freely pop up and retract); plays a media file serving as background information of a virtual stage while the at least one interactive object is presented; presents an interaction animation for an interactive object in the interactive interface in response to an interactive operation on that object; and, while the interaction animation is presented, dynamically adjusts the playing rhythm of the media file so that it corresponds to the interaction rhythm of the interactive operation.
In another implementation scenario, referring to fig. 1B, fig. 1B is a schematic diagram of an optional application mode of the information processing method in a virtual scene, applied to the terminal 200 and the server 300. This mode is generally applicable to applications that rely on the computing power of the server 300 to complete the virtual-scene calculation and output the virtual scene at the terminal 200.
Taking the formation of visual perception of a virtual scene as an example, the server 300 calculates display data related to the virtual scene and sends it to the terminal 200; the terminal 200 relies on graphics computing hardware to complete the loading and parsing of the calculated display data, and relies on graphics output hardware to output the virtual scene and form visual perception; for example, a two-dimensional picture or video can be presented on the screen of a smartphone, or a picture or video realizing a three-dimensional display effect can be projected onto the lenses of augmented reality/virtual reality glasses. For other forms of perception of the virtual scene, it is understood that corresponding hardware outputs of the terminal device can be used, e.g. a speaker output to form auditory perception and a vibrator output to form tactile perception.
As an example, the terminal 200 runs a game application, performs calculation of display data related to a virtual scene through the server 300 connected to the network and transmits the calculation result to the terminal 200, presents an interactive interface corresponding to the game when the game application runs, and presents at least one interactive object in the interactive interface; playing a media file as background information of a virtual scene during presentation of at least one interactive object; presenting an interactive animation aiming at the interactive object in the interactive interface in response to the interactive operation aiming at the interactive object; and in the process of presenting the interactive animation, dynamically adjusting the playing rhythm of the media file to ensure that the playing rhythm of the media file corresponds to the interactive rhythm corresponding to the interactive operation.
Referring to fig. 2, fig. 2 is an optional schematic structural diagram of an electronic device 500 provided in the embodiment of the present application. In practical applications, the electronic device 500 may be the terminal 200 in fig. 1A, or the terminal 200 or the server 300 in fig. 1B; taking the electronic device being the terminal 200 shown in fig. 1A as an example, the electronic device implementing the information processing method in the virtual scene of the embodiment of the present application is described below. The electronic device 500 shown in fig. 2 includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in the electronic device 500 are coupled together by a bus system 540. It is understood that the bus system 540 is used to enable connection and communication among these components. In addition to a data bus, the bus system 540 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as bus system 540 in fig. 2.
The processor 510 may be an integrated circuit chip having signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 530 includes one or more output devices 531 enabling presentation of media content, including one or more speakers and/or one or more visual display screens. The user interface 530 also includes one or more input devices 532, including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 550 optionally includes one or more storage devices physically located remote from processor 510.
The memory 550 may comprise volatile memory or nonvolatile memory, and may also comprise both volatile and nonvolatile memory. The nonvolatile memory may be a Read Only Memory (ROM), and the volatile memory may be a Random Access Memory (RAM). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 can store data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a network communication module 552 for communicating to other computing devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 including: bluetooth, wireless compatibility authentication (WiFi), and Universal Serial Bus (USB), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
an input processing module 554 to detect one or more user inputs or interactions from one of the one or more input devices 532 and to translate the detected inputs or interactions.
In some embodiments, the information processing apparatus in the virtual scene provided in the embodiments of the present application may be implemented in software. Fig. 2 illustrates an information processing apparatus 555 in the virtual scene stored in the memory 550, which may be software in the form of programs and plug-ins, and includes the following software modules: a first presentation module 5551, a background playing module 5552, a second presentation module 5553, and a rhythm adjustment module 5554. These modules are logical, and thus may be combined arbitrarily or split further depending on the functions implemented.
The functions of the respective modules will be explained below.
In other embodiments, the information processing apparatus in the virtual scene provided in this embodiment may be implemented in hardware. As an example, it may be a processor in the form of a hardware decoding processor programmed to execute the information processing method in the virtual scene provided in this embodiment; for example, the processor in the form of a hardware decoding processor may employ one or more Application-Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field-Programmable Gate Arrays (FPGAs), or other electronic components.
Next, a description is given of an information processing method in a virtual scene provided in the embodiment of the present application, and in actual implementation, the information processing method in the virtual scene provided in the embodiment of the present application may be implemented by a server or a terminal alone, or may be implemented by a server and a terminal in cooperation.
Referring to fig. 3, fig. 3 is an optional flowchart of an information processing method in a virtual scene according to an embodiment of the present application, and the steps shown in fig. 3 will be described.
Step 101: the terminal presents an interactive interface of the virtual scene and presents at least one interactive object in the interactive interface.
In some embodiments, when the terminal runs the application program, a selection interface including at least one object tag is presented; and in response to the selection operation of the object tag triggered based on the selection interface, presenting the interactive object matched with the selected object tag in the interactive interface of the virtual scene, and playing the media file matched with the selected object tag.
The media file is used as background information of a virtual scene, such as background music or background animation, the object tag is used for indicating the type of an interactive object, and the object tag and the media file are in one-to-one correspondence.
Here, in an actual application, when the terminal runs the application program, it first presents a selection interface for selecting an object tag, with at least one object tag presented in that interface. When the user selects one of the object tags, the terminal, in response to the selection operation triggered through the selection interface, displays the selected object tag in a target display style (such as highlighted or bold). When the user then triggers the confirmation function item for the object tag, the terminal, in response to that trigger operation, acquires the interactive object and the media file matching the selected object tag, presents the acquired interactive object in the interactive interface of the virtual scene, and plays the media file while the interactive object is presented.
Referring to fig. 4A-4B, fig. 4A-4B are schematic diagrams of display interfaces provided by an embodiment of the present application. In fig. 4A, when the terminal runs the application program "Defeat the Bad Things", the terminal presents a selection interface A0 and presents a plurality of object tags A1 in the selection interface A0. When the user selects the "low immunity" object tag A1, its display style differs from that of the other, unselected object tags, e.g. the selected object tag A1 is highlighted. When the user triggers the confirmation function item A2, an interactive object B1 matching the "low immunity" object tag A1 is presented in the interactive interface B0 of the virtual scene shown in fig. 4B, i.e. a representation of the interactive object B1 (a "bad thing") carrying the "low immunity" object tag, and the media file matching the "low immunity" object tag A1 is played while the interactive object B1 is presented.
In some embodiments, the terminal may present at least one interactive object in the interactive interface by:
presenting at least one object bearing position for bearing an interactive object in an interactive interface, wherein the object bearing positions and the interactive object are in one-to-one correspondence; and in each object bearing position, presenting the process from the appearance of a partial object to the appearance of a whole object of the corresponding interactive object.
Here, the interactive object can freely pop up and retract from its object bearing position: the interactive object gradually pops up from the bearing position until it is entirely above the position, then gradually falls back until it is entirely inside the position. Correspondingly, the interactive interface shows the interactive object going from partial appearance to full appearance, and then from full appearance to disappearance.
For example, in fig. 4B, the object bearing positions B2 and the interactive objects B1 are in one-to-one correspondence, and each object bearing position B2 allows only one interactive object B1 to appear at a time. When two or more interactive objects are presented, their presentation times at the corresponding bearing positions are staggered, i.e. not all the same.
In some embodiments, the terminal may present at least one interactive object in the interactive interface by:
in the interactive interface, circularly presenting the presentation animation of at least one interactive object at a first presentation rate; when the interactive performance generated by the interactive operation reaches a first performance threshold value, circularly presenting the display animation of at least one interactive object at a second presentation rate; and the second presentation rate is greater than the first presentation rate, and the presentation animation is used for presenting the process from appearance to disappearance of the corresponding interactive object in the interactive interface.
Here, when the terminal has just started running the application program, the presentation rate of interactive objects in the interactive interface is low, i.e. the objects pop up slowly from their bearing positions. Then, as the number of successful interactions increases and the interactive score reaches the score threshold, the pop-up rate increases. The interactive score includes at least one of: the number of successful interactions, the interaction score, the virtual resource value obtained through interaction, and so on.
For example, in fig. 4B, each "bad thing" interactive object B1 initially pops up and retracts from its bearing position at a presentation frequency of 3 times per second. Once the player has hit the "bad thing" avatar 10 times, or the hit score exceeds 10 points, or 20 coins of virtual resources have been obtained, the animation of the "bad thing" popping up and retracting is presented at a frequency of 5 times per second.
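As a minimal TypeScript sketch (the names and the threshold semantics are assumptions for illustration, not taken from the patent), the rate switch in this example can be expressed as:

    // Pop-up rate (presentations per second) chosen from the interactive score,
    // mirroring the 3/s -> 5/s example above. The assumption here is that any
    // one of hit count, points, or coins reaching its limit triggers the switch.
    interface InteractiveScore { hits: number; points: number; coins: number; }

    function presentationRate(score: InteractiveScore): number {
      const reached = score.hits >= 10 || score.points > 10 || score.coins >= 20;
      return reached ? 5 : 3; // second presentation rate vs. first presentation rate
    }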
In some embodiments, the terminal may present at least one interactive object in the interactive interface by:
presenting display animations of a first number of interactive objects simultaneously in the interactive interface; when the interactive achievement generated by the interactive operation reaches a second achievement threshold value, displaying the display animation of a second number of interactive objects in the interactive interface; wherein the second number is greater than the first number.
Here, when the terminal has just started running the application program, only a small number of interactive objects are presented simultaneously in the interactive interface. Then, as the number of successful interactions increases and the interactive score reaches the score threshold, the number of simultaneously presented interactive objects increases.
For example, in fig. 4B, only 1 animation of a "bad thing" popping up and retracting from its bearing position is presented at first. Once the player has hit the "bad thing" avatar 10 times, or the hit score exceeds 10 points, or 20 coins of virtual resources have been obtained, the interactive objects shown in fig. 4C are presented in the interactive interface. Fig. 4C is a schematic diagram of a display interface provided by the embodiment of the present application, showing 2 "bad things" popping up and retracting from their bearing positions.
In this way, as the number of successful interactions grows, interactive objects are presented faster, or more of them are presented simultaneously, making the interactive operations more challenging and increasing the interest and challenge of the virtual scene.
Step 102: during the rendering of the at least one interactive object, the media file is played.
In fig. 4A, the media file corresponding to the "head of wall" object tag is "background music 1", the media file corresponding to the "low immunity" object tag is "background music 2", the media file corresponding to the "dining hall is dragged and changed" object tag is "background animation 1", and so on.
When the user selects an object tag, the media file corresponding to the selected tag is played while the at least one interactive object corresponding to that tag is presented in the interactive interface; when the user does not select an object tag, the default media file of the virtual scene is played while the at least one default interactive object of the virtual scene is presented in the interactive interface.
Step 103: and presenting the interactive animation aiming at the interactive object in the interactive interface in response to the interactive operation aiming at the interactive object.
In practical applications, the interactive objects can freely pop up and retract from their object bearing positions, and each bearing position shows the corresponding interactive object going from partial appearance to full appearance. When an interactive object pops up from its bearing position, the user can trigger an interactive operation on it through a touch screen, a voice-controlled switch, a keyboard, a mouse, and the like.
In some embodiments, the terminal may present an interaction animation for the interaction object in the interaction interface in response to the interaction operation for the interaction object by:
the interactive operation is triggered when the whole object appears in the interactive object, and in response to the interactive operation aiming at the interactive object, in the interactive interface, the animation that the interactive object disappears in the interactive interface from the appearance of the whole object to the disappearance of the object is presented.
Here, when an interactive object is presented in the interactive interface, the user may trigger an interactive operation for the interactive object, and the terminal presents an interactive animation of whether the interactive object is successfully acted on by the interactive operation in response to the interactive operation. When the interactive operation successfully acts on the interactive object, playing an animation that the interactive object is successfully hit in the interactive interface, for example, presenting states such as 'head blond star', 'face black' or 'air leakage' and the like which are shown when the interactive object is hit. When the interactive operation does not successfully act on the interactive object, playing an animation that the interactive object is not successfully hit in the interactive interface, such as presenting a state such as "proud", "smiling face", or "happy" that the interactive object is not hit.
The successful interaction on the interaction object means that the interaction directly acts on the interaction object, for example, the interaction acts on any part of the interaction object, more specifically, for the application of 'defeat bad thing', the interaction object is a 'cartoon character', and when a user clicks the 'cartoon character' through a touch screen, the user touches any part of the head or arm of the 'cartoon character', and the interaction is considered to successfully act on the interaction object.
Referring to fig. 5A-5B, fig. 5A-5B are schematic diagrams of display interfaces provided in an embodiment of the present application, in fig. 5A, when an interactive object is hit, a state of "face black" shown by the interactive object is presented in the interactive interface; in fig. 5B, when the interactive subject is not hit by the pair, the state of the "smiling face" presented by the interactive subject is presented in the interactive interface.
In some embodiments, when the interactive operation successfully acts on the interactive object, the terminal further acquires the target audio file, and plays the target audio file in the process of presenting the interactive animation.
And the target audio file corresponds to the interactive operation which is successfully acted or corresponds to the interactive object which is successfully acted by the interactive operation.
Here, when the interactive operation successfully acts on the interactive object, a target audio file corresponding to the interactive object is also acquired; the target audio file may be a cheer corresponding to the successful interactive operation, or a sound corresponding to the interactive object. For example, for the "Defeat the Bad Things" application with a "cartoon dog" as the interactive object, when the user hits the "cartoon dog" through the touch screen, a cheering sound celebrating the hit, such as "yay…" or "bingo", is played, or a playful bark imitating the "cartoon dog", such as "whine…" or "woof…", is played, and so on.
In some embodiments, as the number of interactions in which the interactive operation successfully acts on the interactive object increases, the play tone of the target audio file is adjusted so that the tone matches the interaction count.
For example, for the "cartoon dog", when it is first hit, the target audio file is played at a first tone, and the tone rises as the number of hits increases. The playing of the target audio file is thus closely tied to the user's operations, giving the user strong feedback and highlighting the sense of atmosphere and fun.
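A minimal Web Audio sketch of this tone adjustment, assuming the hit sound is available as an AudioBuffer; detune is the standard Web Audio pitch parameter (100 cents per semitone), and the one-octave cap is an illustrative assumption:

    // Play the hit sound one semitone higher per successful hit,
    // capped at an octave above the original pitch.
    function playHitSound(ctx: AudioContext, hitSound: AudioBuffer, hitCount: number): void {
      const source = ctx.createBufferSource();
      source.buffer = hitSound;
      source.detune.value = Math.min(hitCount * 100, 1200); // in cents
      source.connect(ctx.destination);
      source.start();
    }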
Step 104: and in the process of presenting the interactive animation, dynamically adjusting the playing rhythm of the media file to ensure that the playing rhythm of the media file corresponds to the interactive rhythm corresponding to the interactive operation.
In some embodiments, the terminal may dynamically adjust the playing rhythm of the media file by:
when the interaction times of the interaction operation aiming at the interaction object exceed a time threshold, acquiring an adjusting coefficient matched with the interaction times; and adjusting the playing rhythm of the media file based on the adjusting coefficient, and playing the media file according to the adjusted playing rhythm.
In practical implementation, the adjustment coefficient may be determined from the number of interactions. For example: when the number of interactions is less than 5, the adjustment coefficient is 1, or alternatively (number of interactions + 5) / 10; when the number is greater than 5 and less than 10, the coefficient is 1.2, or (number of interactions + 5) / 10; and when the number is greater than 10, the coefficient is 2, or (number of interactions + 10) / 10.
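For illustration only, the stepped variant of this rule could look like the following sketch (the linear (count + 5) / 10 form mentioned above is an equally valid alternative; the function name is an assumption):

    // Stepped adjustment coefficient using the thresholds and values from the text.
    // Boundary handling at exactly 5 and 10 interactions is an assumption.
    function adjustmentCoefficient(interactionCount: number): number {
      if (interactionCount < 5) return 1.0;   // keep the original playing rhythm
      if (interactionCount <= 10) return 1.2; // slightly faster
      return 2.0;                             // double-time past 10 interactions
    }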
In some embodiments, the terminal may dynamically adjust the playing rhythm of the media file by:
and correspondingly adjusting the playing rhythm of the media file along with the change of the interaction frequency of the interaction operation aiming at the interaction object, so that the playing rhythm of the media file is matched with the interaction frequency.
Here, when the interaction frequency changes, the playing rhythm changes correspondingly; for example, the higher the interaction frequency, the faster the playing rhythm of the media file.
In some embodiments, the terminal may dynamically adjust the playing rhythm of the media file by:
when the media file is a background audio file, dynamically adjusting the audio playing rhythm of the background audio file; and when the media file is the background animation file, dynamically adjusting the animation display rhythm of the background animation file.
Here, adjusting the audio playing rhythm is to adjust at least one of the audio playing rate or the playing volume of the background audio file, and if the interaction frequency is higher, the corresponding audio playing rate is faster, and the playing volume is also higher; when the media file is a background animation file, the greater the interaction frequency, the faster the corresponding animation display rhythm.
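As an illustrative sketch (not the patent's implementation), both branches can be driven by one tempo coefficient: the audio branch through the playbackRate parameter of an AudioBufferSourceNode, and the animation branch through the Web Animations API. The handle names are assumptions:

    // Apply a single tempo coefficient to either media type.
    type BackgroundMedia =
      | { kind: 'audio'; node: AudioBufferSourceNode }
      | { kind: 'animation'; anim: Animation };

    function applyTempo(media: BackgroundMedia, coefficient: number): void {
      if (media.kind === 'audio') {
        media.node.playbackRate.value = coefficient; // faster/slower audio playback
      } else {
        media.anim.playbackRate = coefficient;       // faster/slower background animation
      }
    }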
In the above manner, during interaction with the interactive object, the playing rhythm of the media file associated with the virtual scene is dynamically adjusted according to the interaction rhythm of the interactive operations, so that the two rhythms correspond. The user can then perceive changes in their interaction rhythm through changes in the playing rhythm of the media file, which further motivates the user to keep interacting with the interactive object and improves user engagement.
In some embodiments, when the number of interactions that an interactive operation does not successfully act on an interactive object reaches a threshold number, presenting time prompt information in an interactive interface; the time prompt information is used for prompting the interaction ending time aiming at the virtual scene; and when the interaction ending time is up, presenting an interaction result interface of the virtual scene, and presenting an interaction result corresponding to the virtual scene in the interaction result interface.
Here, during the user's interaction with the interactive object, if multiple consecutive interactive operations fail to act on the object, the interaction end time is presented, for example as a countdown on the interactive interface; when the countdown ends, the user is reminded that the interaction is over, and the interaction result is presented.
For example, in fig. 4C, when the player misses 4 rounds or fails to hit a "bad thing" in 4 consecutive interactions, countdown information a is presented in the interactive interface; when the countdown reaches 0, the game end interface is presented together with the game result.
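A hedged sketch of this miss countdown (the 1-second tick, the countdown length, and all names are assumptions for illustration):

    const MISS_THRESHOLD = 4;     // consecutive misses that trigger the countdown
    const COUNTDOWN_SECONDS = 10; // assumed countdown length

    let consecutiveMisses = 0;

    function onInteraction(hit: boolean, show: (s: number) => void, endRound: () => void): void {
      consecutiveMisses = hit ? 0 : consecutiveMisses + 1;
      if (consecutiveMisses >= MISS_THRESHOLD) startCountdown(show, endRound);
    }

    function startCountdown(show: (s: number) => void, done: () => void): void {
      let remaining = COUNTDOWN_SECONDS;
      const timer = setInterval(() => {
        show(remaining);  // present the time prompt information
        if (remaining-- <= 0) {
          clearInterval(timer);
          done();         // present the interaction result interface
        }
      }, 1000);
    }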
Next, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
In the related art, when a user interacts with an interactive object through a game's interactive interface, the interaction rhythm of the interactive operations is decoupled from the playing rhythm of the game's background audio file; that is, a fixed background audio file is played throughout the same game scene. The user's engagement therefore depends entirely on how interesting the game itself is, which lowers engagement.
In view of this, the present application provides an information processing method in a virtual scene that can dynamically adjust the playing rhythm of the background audio file according to the interaction rhythm of the user's interactive operations, so that the two rhythms correspond. This avoids the background audio file being rigidly grafted onto the interactive interface of the game scene: the user perceives the background audio not merely as atmosphere rendering, but can intuitively feel, through their interactive operations, that its playing rhythm is tied to their own interaction rhythm, thereby improving user engagement.
Referring to fig. 6, fig. 6 is an optional flowchart of an information processing method in a virtual scene according to an embodiment of the present application, and the steps shown in fig. 6 will be described.
Step 201: and the terminal presents a selection interface which corresponds to the virtual scene and comprises at least one object label.
In actual implementation, consider a common game-scene interaction such as striking "bad luck": when the terminal runs the "Defeat the Bad Things" application shown in fig. 4A, it first presents the selection interface for selecting object tags shown in fig. 4A, with at least one object tag presented in the selection interface.
Step 202: and responding to the selection operation of the object label triggered based on the selection interface, and acquiring the interactive object and the background audio file corresponding to the selected object label.
Here, the object tags and the interactive objects, and the object tags and the media files, are in one-to-one correspondence. When the user selects one of the at least one object tag, the "bad thing" avatar (i.e. the interactive object) and the background audio file (i.e. the media file) corresponding to the selected tag are acquired.
For example, in fig. 4A, when the user selects the "low immunity" object tag A1, the interactive object B1 (a "bad thing") is acquired as the representation shown in fig. 4B carrying the "low immunity" object tag, and the background music matching the "low immunity" object tag A1 is acquired.
Step 203: and the terminal presents an interactive interface of the virtual scene, presents the selected interactive object in the interactive interface and plays the background audio file.
When the user triggers the confirmation function item "enter K.O." shown in fig. 4A, the terminal, in response to the trigger operation, presents the interactive interface of the virtual scene shown in fig. 4B, presents the interactive object corresponding to the selected object tag in that interface, and plays the background audio file corresponding to the selected tag.
In practical applications, when the terminal has just started running the application program, the presentation rate of interactive objects in the interactive interface is low, i.e. the objects pop up slowly; then, as the number of successful interactive operations increases and the interactive score reaches the score threshold, the pop-up rate increases. Similarly, at the start only a small number of interactive objects are presented simultaneously in the interactive interface, and as the number of successful interactive operations increases and the interactive score reaches the score threshold, the number of simultaneously presented interactive objects increases.
Step 204: and presenting the interactive animation aiming at the interactive object in the interactive interface in response to the interactive operation aiming at the interactive object.
Here, in the interactive interface shown in fig. 4B, at least one object bearing position for bearing an interactive object is presented, where the bearing positions and the interactive objects are in one-to-one correspondence. The interactive objects can freely pop up and retract from their bearing positions, and each bearing position shows the corresponding object going from partial appearance to full appearance. When an interactive object pops up from its bearing position, the user can trigger an interactive operation on it through a touch screen, a voice-controlled switch, a keyboard, a mouse, and the like; when the interactive operation is triggered while the object has fully appeared, then, in response to the interactive operation for the interactive object, an animation of the object going from full appearance to disappearance is presented in the interactive interface.
In some embodiments, when an interactive object is presented in the interactive interface, the user can trigger an interactive operation on it, and the terminal, in response, presents an interaction animation showing whether the operation successfully acted on the object. When the interactive operation successfully acts on the interactive object, an animation of the object being hit is played in the interactive interface, for example presenting states such as "seeing stars", "darkened face", or "deflated" that the object shows when hit. When the interactive operation does not successfully act on the interactive object, an animation of the object not being hit is played, for example presenting states such as "proud", "smiling face", or "happy".
Step 205: in the process of presenting the interactive animation, dynamically adjust the playing rhythm of the media file so that the playing rhythm of the media file corresponds to the interaction rhythm of the interactive operation.
Here, the interaction frequency of the user's interactive operations on the interactive objects is acquired in real time, and the playing rhythm of the background audio file is dynamically changed accordingly, for example by changing the playing volume, tone, or playing speed of the background audio file.
In actual implementation, the Web Audio API provides a powerful mechanism for controlling the background audio file of a web page, allowing the user to perform audio operations within an audio context (AudioContext), such as adding effects to audio or creating audio visualizations, and it is characterized by modular routing. Operations are performed using audio nodes (AudioNode), and multiple audio nodes can be connected together to form an audio routing graph. Because all audio is handled within an audio context, different types of sources can be supported in a single environment with configurable destination routing; a single audio context therefore supports multiple sources, even though these audio sources may have many different channel layouts.
Referring to fig. 7, fig. 7 is a schematic flowchart of a web page audio operation provided in the embodiment of the present application. First, an audio context (AudioContext) is created; the audio context is equivalent to a sound container and is the precondition environment for processing audio. Sources are then created in the audio context, e.g. an <audio> element, an oscillator, or a stream. Effect nodes are created, such as reverberation, biquad filters, panning, or compression. A destination for the audio is chosen, i.e. an audio rendering device such as a loudspeaker. Finally the source is connected to the effect node and the effect output is connected to the destination, i.e. source-effect-destination, or input-process-output.
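For illustration only, the source-effect-destination chain of fig. 7 might be set up as follows; the <audio> element lookup and all variable names are assumptions of the sketch, not part of the application.

// 1. Create the audio context, the container in which all audio is processed.
const audioCtx = new AudioContext();
// 2. Create a source from an existing <audio> element on the page.
const audioElement = document.querySelector('audio');
const source = audioCtx.createMediaElementSource(audioElement);
// 3. Create an effect node; a gain node is used here as the simplest effect.
const gainNode = audioCtx.createGain();
// 4. Connect source -> effect -> destination (the default rendering device).
source.connect(gainNode);
gainNode.connect(audioCtx.destination);
// Start playback (subject to the browser's autoplay policy).
audioElement.play();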
When the playing rhythm of the background audio file is adjusted, the playing volume, tone, and speed may specifically be adjusted. Volume adjustment can be realized through a gain node (GainNode) among the audio nodes. The default gain is 1, i.e. the current volume is kept unchanged; the minimum settable gain is about -3.4×10^38 and the maximum about 3.4×10^38 (the range of a single-precision float). The GainNode interface represents a change in volume: it is an AudioNode audio-processing module that applies a given gain to the input before producing the output, the gain being exposed through the GainNode.gain parameter.
Here, in practical applications, the number of interactions of the user's interactive operations on the interactive object is counted in real time, and when the number of interactions exceeds a number threshold, a gain matched with the number of interactions is obtained. For example, when the number of interactions is less than 5, the gain is 1, meaning the volume is unchanged; when the number of interactions is greater than 5 and less than 10, the gain is 1.2, i.e. the volume of the background audio file is adjusted to 1.2 times the original volume; and when the number of interactions is greater than 10, the gain is 1.5, i.e. the volume of the background audio file is adjusted to 1.5 times the original volume.
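A minimal sketch of this volume mapping, reusing gainNode from the routing sketch above; the counter and function names are illustrative assumptions, while the thresholds follow the example just given.

let interactionCount = 0;        // counted in real time elsewhere in the game

function gainForInteractions(count) {
  if (count > 10) return 1.5;    // more than 10 interactions: 1.5x volume
  if (count > 5) return 1.2;     // more than 5 interactions: 1.2x volume
  return 1;                      // otherwise the volume is unchanged
}

function onSuccessfulInteraction() {
  interactionCount += 1;
  gainNode.gain.value = gainForInteractions(interactionCount);
}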
When the tone of the background audio file is adjusted, the interaction frequency of the user's interactive operations on the interactive object is counted in real time, and the tone of the background audio file is adjusted synchronously as the interaction frequency changes. See the pseudocode below; in actual implementation this is realized using a non-linear distorter (the WaveShaperNode interface). A WaveShaperNode is an audio node (AudioNode) that uses a curve to apply waveform distortion to an audio signal, and it inherits the properties of its parent, AudioNode.
(Pseudocode figures of the original publication, not reproduced in this text extraction.)
Here, the WaveShaperNode exposes a curve attribute: curve is a Float32Array describing the distortion to apply. The middle element of the array is applied to signal value 0, the first element to signal value -1, and the last element to signal value 1; signal values smaller than -1 or greater than 1 are treated as -1 and 1, respectively. Where necessary, the distortion curve is computed using linear interpolation.
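Since the pseudocode figures are not reproduced here, the following is a hedged reconstruction of the kind of curve-based distortion they may have illustrated, reusing audioCtx and source from the routing sketch above; the shaping formula is a commonly used example curve, and the mapping from interaction frequency to distortion amount is an assumption.

// Build a Float32Array distortion curve; a larger 'amount' distorts more.
function makeDistortionCurve(amount) {
  const samples = 256;
  const curve = new Float32Array(samples);
  const deg = Math.PI / 180;
  for (let i = 0; i < samples; i++) {
    const x = (i * 2) / (samples - 1) - 1;  // spread the indices over [-1, 1]
    curve[i] = ((3 + amount) * x * 20 * deg) / (Math.PI + amount * Math.abs(x));
  }
  return curve;
}

const shaper = audioCtx.createWaveShaper();
shaper.oversample = '4x';                    // smooths the distorted signal
source.connect(shaper);
shaper.connect(audioCtx.destination);

// Assumed coupling: a higher measured interaction frequency (hits per second)
// yields a stronger distortion of the background audio.
function onInteractionFrequencyChanged(hitsPerSecond) {
  shaper.curve = makeDistortionCurve(hitsPerSecond * 10);
}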
When the playing speed of the background audio file is adjusted, the number of interactions of the user's interactive operations on the interactive object is counted in real time; when the number of interactions exceeds a number threshold, an adjustment coefficient matched with the number of interactions is obtained, the playing speed of the background audio file is adjusted based on the adjustment coefficient, and the background audio file is played at the adjusted speed. In practical implementation, the playing speed is increased or decreased by setting the playbackRate property of an AudioBufferSourceNode, which changes the frequency of the sound and therefore its pitch. For example, when the number of interactions is greater than 5, bufferSource.playbackRate.value is set to 1.2, i.e. 1.2 times the playing speed, and the frequency becomes 1.2 times the previous frequency; when the number of interactions is greater than 10, it is set to 2, i.e. 2 times the playing speed, and the frequency becomes 2 times the previous frequency; when the number of interactions is greater than 20, it is set to 3, i.e. 3 times the playing speed, and the frequency becomes 3 times the previous frequency.
(Pseudocode figures of the original publication, not reproduced in this text extraction.)
In some embodiments, when the interactive operation successfully acts on the interactive object, the terminal further acquires a target audio file and plays the target audio file in the process of presenting the interactive animation. The target audio file corresponds either to the interactive operation that acted successfully or to the interactive object on which the interactive operation successfully acted.
Here, when the interactive operation successfully acts on the interactive object, a target audio file corresponding to the interactive object is also acquired; the target audio file may be a cheer corresponding to the successful interactive operation, or a cry corresponding to the interactive object. For example, for a "whack the bad thing" application in which the interactive object is a "cartoon dog", when the user strikes the "cartoon dog" through the touch screen and hits it, a cheering sound marking the successful interactive operation, such as "yeah..." or "bingo", is played, or a simulated yelp of the "cartoon dog", such as a "whining..." or "woof..." sound, is played.
In some embodiments, as the number of interactions in which the interactive object is successfully acted on increases, the playing tone of the target audio file is adjusted so that the playing tone matches the number of interactions.
For example, for the "cartoon dog", when it is hit the target audio file is played at a first tone, and the tone of the played target audio file rises as the number of hits on the "cartoon dog" increases. The playing of the target audio file is thus closely tied to the user's operations, the user receives strong feedback, and the sense of atmosphere and the fun of the game are highlighted.
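One possible realization of such a rising tone, sketched under the assumption that the target audio file is played through an AudioBufferSourceNode and detuned by a fixed step per hit; the 100-cent (one semitone) step is an illustrative choice, not taken from the application.

// cheerBuffer is an assumed, pre-decoded AudioBuffer of the cheer sample.
function playCheer(cheerBuffer, hitCount) {
  const cheer = audioCtx.createBufferSource();
  cheer.buffer = cheerBuffer;
  cheer.detune.value = hitCount * 100;  // the pitch rises with every hit
  cheer.connect(audioCtx.destination);
  cheer.start();
}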
In this way, the dimension of audio is added on top of the mini-game play itself, and the audio changes further stimulate the user's participation: the more interactions there are and the faster the interaction frequency becomes, the louder the audio volume, the faster the audio rhythm, and the more obvious the change in audio tone.
Step 206: when the number of interactions in which the interactive operation has not successfully acted on the interactive object reaches a number threshold, present time prompt information in the interactive interface, and when the interaction end time is reached, present the interaction result corresponding to the virtual scene.
The time prompt information is used for prompting the interaction end time for the virtual scene. Here, during the interaction between the user and the interactive objects, if several consecutive interactive operations fail to act successfully on an interactive object, the interaction end time is presented, for example in a countdown manner on the interactive interface; when the countdown ends, the user is reminded that the interaction is over, and the interaction result is presented.
For example, in fig. 4C, when the player misses 4 rounds, i.e. fails to hit a "bad thing" in 4 consecutive interactions, countdown information A is presented in the interactive interface; when the countdown reaches 0, the game end interface is presented together with the game result.
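A minimal sketch of this miss-counting trigger; the threshold of 4 follows the example, while the countdown length and timer handling are assumptions of the sketch.

let consecutiveMisses = 0;

function onMiss() {
  consecutiveMisses += 1;
  if (consecutiveMisses === 4) startCountdown(10);  // assumed 10 s countdown
}

function onHit() {
  consecutiveMisses = 0;         // a successful hit resets the miss streak
}

function startCountdown(seconds) {
  const timer = setInterval(() => {
    seconds -= 1;
    // update countdown information A in the interactive interface here
    if (seconds <= 0) {
      clearInterval(timer);
      // present the game-end interface and the interaction result
    }
  }, 1000);
}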
Continuing with the exemplary structure of the information processing apparatus 555 in the virtual scene implemented as software modules provided in the embodiment of the present application: in some embodiments, as shown in fig. 8, which is a schematic structural diagram of the information processing apparatus in the virtual scene provided in the embodiment of the present application, the software modules of the information processing apparatus 555 stored in the memory 550 may include:
a first presentation module 5551, configured to present an interactive interface of a virtual scene, and present at least one interactive object in the interactive interface;
a background playing module 5552, configured to play the media file during the process of presenting the at least one interactive object;
a second presentation module 5553, configured to present, in response to an interactive operation for the interactive object, an interaction animation for the interactive object in the interactive interface;
a rhythm adjusting module 5554, configured to dynamically adjust the playing rhythm of the media file in the process of presenting the interactive animation, so that the playing rhythm of the media file corresponds to the interaction rhythm of the interactive operation.
In some embodiments, the first presentation module is further configured to present, in the interactive interface, a selection interface including at least one object tag;
in response to a selection operation of an object tag triggered based on the selection interface, present an interactive object matched with the selected object tag;
correspondingly, the background playing module is further configured to play the media file matched with the selected object tag.
In some embodiments, the first presentation module is further configured to present, in the interactive interface, at least one object bearing position for bearing the interactive object, where the object bearing positions and the interactive objects are in one-to-one correspondence;
and, in each object bearing position, present the process of the corresponding interactive object from the appearance of a partial object to the appearance of the whole object.
In some embodiments, the interactive operation is triggered when the interactive object appears as a whole object, and the second presentation module is further configured to present, in response to the interactive operation for the interactive object, an animation of the interactive object from the appearance of the whole object to its disappearance from the interactive interface.
In some embodiments, the first presentation module is further configured to present, in the interactive interface, a presentation animation of the at least one interactive object in a loop at a first presentation rate;
when the interaction score generated by the interactive operation reaches a first score threshold, cyclically present the presentation animation of the at least one interactive object at a second presentation rate;
wherein the second presentation rate is greater than the first presentation rate.
In some embodiments, the first presentation module is further configured to present presentation animations of the first number of interactive objects simultaneously in the interactive interface;
when the interaction score generated by the interactive operation reaches a second score threshold, simultaneously present presentation animations of a second number of interactive objects in the interactive interface;
wherein the second number is greater than the first number.
In some embodiments, the apparatus further comprises an audio playing module, configured to:
acquire a target audio file when the interactive operation successfully acts on the interactive object;
wherein the target audio file corresponds to the interactive operation that acted successfully, or to the interactive object on which the interactive operation successfully acted;
and play the target audio file in the process of presenting the interactive animation.
In some embodiments, the audio playing module is further configured to
adjust the playing tone of the target audio file as the number of interactions in which the interactive operation successfully acts on the interactive object increases, so that the playing tone matches the number of interactions.
In some embodiments, the rhythm adjusting module is further configured to
acquire, when the number of interactions of the interactive operation for the interactive object exceeds a number threshold, an adjustment coefficient matched with the number of interactions;
and adjust the playing rhythm of the media file based on the adjustment coefficient, and play the media file according to the adjusted playing rhythm.
In some embodiments, the rhythm adjusting module is further configured to
correspondingly adjust the playing rhythm of the media file as the interaction frequency of the interactive operation for the interactive object changes, so that the playing rhythm of the media file matches the interaction frequency.
In some embodiments, the rhythm adjusting module is further configured to
dynamically adjust, when the media file is a background audio file, the audio playing rhythm of the background audio file;
and dynamically adjust, when the media file is a background animation file, the animation display rhythm of the background animation file.
In some embodiments, the apparatus further includes a third presentation module, configured to present, in the interactive interface, time prompt information when the number of interactions in which the interactive operation has not successfully acted on the interactive object reaches a number threshold;
the time prompt information is used for prompting the interaction end time for the virtual scene;
and, when the interaction end time is reached, present an interaction result interface of the virtual scene, and present the interaction result corresponding to the virtual scene in the interaction result interface.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the information processing method in the virtual scene in the embodiment of the present application.
The embodiment of the application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, cause the processor to execute the information processing method in the virtual scene provided by the embodiment of the present application.
In some embodiments, the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a HyperText Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (15)

1. An information processing method in a virtual scene, the method comprising:
presenting an interactive interface of a virtual scene, and presenting at least one interactive object in the interactive interface;
playing a media file during the process of presenting the at least one interactive object;
presenting, in response to an interactive operation for the interactive object, an interaction animation for the interactive object in the interactive interface;
and in the process of presenting the interactive animation, dynamically adjusting the playing rhythm of the media file so that the playing rhythm of the media file corresponds to the interaction rhythm of the interactive operation.
2. The method of claim 1, wherein said presenting at least one interactive object in said interactive interface comprises:
presenting, in the interactive interface, a selection interface including at least one object tag;
in response to a selection operation of an object tag triggered based on the selection interface, presenting an interactive object matched with the selected object tag;
correspondingly, the playing of the media file comprises:
playing the media file matched with the selected object tag.
3. The method of claim 1, wherein said presenting at least one interactive object in said interactive interface comprises:
presenting, in the interactive interface, at least one object bearing position for bearing the interactive object, wherein the object bearing positions and the interactive objects are in one-to-one correspondence;
and presenting, in each object bearing position, the process of the corresponding interactive object from the appearance of a partial object to the appearance of the whole object.
4. The method of claim 3, wherein presenting an interaction animation for the interaction object in the interaction interface in response to the interaction operation for the interaction object comprises:
wherein the interactive operation is triggered when the interactive object appears as a whole object; and presenting, in the interactive interface in response to the interactive operation for the interactive object, an animation of the interactive object from the appearance of the whole object to its disappearance from the interactive interface.
5. The method of claim 1, wherein said presenting at least one interactive object in said interactive interface comprises:
in the interactive interface, circularly presenting the presentation animation of the at least one interactive object at a first presentation rate;
when the interaction score generated by the interactive operation reaches a first score threshold, circularly presenting the presentation animation of the at least one interactive object at a second presentation rate;
wherein the second presentation rate is greater than the first presentation rate.
6. The method of claim 1, wherein said presenting at least one interactive object in said interactive interface comprises:
presenting presentation animations of a first number of interactive objects simultaneously in the interactive interface;
when the interaction score generated by the interactive operation reaches a second score threshold, simultaneously presenting presentation animations of a second number of interactive objects in the interactive interface;
wherein the second number is greater than the first number.
7. The method of claim 1, wherein the method further comprises:
when the interactive operation successfully acts on the interactive object, acquiring a target audio file;
wherein the target audio file corresponds to the interactive operation that acted successfully, or to the interactive object on which the interactive operation successfully acted;
and playing the target audio file in the process of presenting the interactive animation.
8. The method of claim 7, wherein the method further comprises:
and adjusting the playing tone of the target audio file along with the increase of the interaction times of the interaction operation successfully acting on the interaction object, so that the playing tone is matched with the interaction times.
9. The method of claim 1, wherein the dynamically adjusting the playing rhythm of the media file comprises:
when the number of interactions of the interactive operation for the interactive object exceeds a number threshold, acquiring an adjustment coefficient matched with the number of interactions;
and adjusting the playing rhythm of the media file based on the adjustment coefficient, and playing the media file according to the adjusted playing rhythm.
10. The method of claim 1, wherein the dynamically adjusting the playing rhythm of the media file comprises:
and correspondingly adjusting the playing rhythm of the media file along with the change of the interaction frequency of the interaction operation aiming at the interaction object, so that the playing rhythm of the media file is matched with the interaction frequency.
11. The method of claim 1, wherein the dynamically adjusting the playing rhythm of the media file comprises:
when the media file is a background audio file, dynamically adjusting the audio playing rhythm of the background audio file;
and when the media file is a background animation file, dynamically adjusting the animation display rhythm of the background animation file.
12. The method of claim 1, wherein the method further comprises:
when the number of interactions in which the interactive operation has not successfully acted on the interactive object reaches a number threshold, presenting time prompt information in the interactive interface;
the time prompt information is used for prompting the interaction end time for the virtual scene;
and when the interaction end time is reached, presenting an interaction result interface of the virtual scene, and presenting an interaction result corresponding to the virtual scene in the interaction result interface.
13. An information processing apparatus in a virtual scene, the apparatus comprising:
the first presentation module is used for presenting an interactive interface of a virtual scene and presenting at least one interactive object in the interactive interface;
the background playing module is used for playing the media file in the process of presenting the at least one interactive object;
the second presentation module is used for presenting, in response to an interactive operation for the interactive object, an interaction animation for the interactive object in the interactive interface;
and the rhythm adjusting module is used for dynamically adjusting the playing rhythm of the media file in the process of presenting the interactive animation, so that the playing rhythm of the media file corresponds to the interaction rhythm of the interactive operation.
14. An electronic device, comprising:
a memory for storing executable instructions;
a processor, configured to implement the information processing method in the virtual scene according to any one of claims 1 to 12 when executing the executable instructions stored in the memory.
15. A computer-readable storage medium storing executable instructions for implementing the information processing method in the virtual scene according to any one of claims 1 to 12 when executed by a processor.
CN202010913313.1A 2020-09-03 2020-09-03 Information processing method, device, equipment and storage medium in virtual scene Active CN112044053B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010913313.1A CN112044053B (en) 2020-09-03 2020-09-03 Information processing method, device, equipment and storage medium in virtual scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010913313.1A CN112044053B (en) 2020-09-03 2020-09-03 Information processing method, device, equipment and storage medium in virtual scene

Publications (2)

Publication Number Publication Date
CN112044053A true CN112044053A (en) 2020-12-08
CN112044053B CN112044053B (en) 2022-05-17

Family

ID=73608428

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010913313.1A Active CN112044053B (en) 2020-09-03 2020-09-03 Information processing method, device, equipment and storage medium in virtual scene

Country Status (1)

Country Link
CN (1) CN112044053B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006314717A (en) * 2005-05-16 2006-11-24 Nintendo Co Ltd Game program, and game apparatus
CN101916576A (en) * 2010-08-19 2010-12-15 惠州Tcl移动通信有限公司 Method for automatically playing background music
CN108525302A (en) * 2018-04-09 2018-09-14 网易(杭州)网络有限公司 Control method, device, processor and the terminal of game background music
CN108769769A (en) * 2018-05-30 2018-11-06 北京小米移动软件有限公司 Playback method, device and the computer readable storage medium of video
CN108888957A (en) * 2018-07-24 2018-11-27 合肥爱玩动漫有限公司 A method of by the customized game music of player and audio
CN109343770A (en) * 2018-09-27 2019-02-15 腾讯科技(深圳)有限公司 Interaction feedback method, equipment and recording medium
CN110337052A (en) * 2019-06-19 2019-10-15 Oppo广东移动通信有限公司 Adjust method, apparatus, electronic equipment and the computer storage medium of intensity of sound
CN111001162A (en) * 2019-12-09 2020-04-14 网易(杭州)网络有限公司 Game skin changing method and device, storage medium and processor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
夜幕下的避风港: "Crazy Whac-A-Mole Minigame" (疯狂打地鼠小游戏, in Chinese), 《HTTPS://HAOKAN.BAIDU.COM/V?PD=WISENATURAL&VID=7727564158855912021》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112800252A (en) * 2020-12-31 2021-05-14 腾讯科技(深圳)有限公司 Method, device and equipment for playing media files in virtual scene and storage medium
CN115220625A (en) * 2022-07-19 2022-10-21 广州酷狗计算机科技有限公司 Audio playing method and device, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN112044053B (en) 2022-05-17

Similar Documents

Publication Publication Date Title
CN110465097B (en) Character vertical drawing display method and device in game, electronic equipment and storage medium
JP4095227B2 (en) Video game apparatus, background sound output setting method in video game, and computer-readable recording medium recorded with background sound output setting program
CN110737435B (en) Method, device, terminal equipment and storage medium for editing multimedia in game
KR102022604B1 (en) Server and method for providing game service based on an interaface for visually expressing ambient audio
CN111913624B (en) Interaction method and device for objects in virtual scene
CN109254650B (en) Man-machine interaction method and device
CN112044053B (en) Information processing method, device, equipment and storage medium in virtual scene
JP7292846B2 (en) Different perspectives from a common virtual environment
WO2006006274A1 (en) Game apparatus and game program
WO2022242260A1 (en) Interaction method, apparatus and device in game, and storage medium
CN112306321B (en) Information display method, device and equipment and computer readable storage medium
CN113487709A (en) Special effect display method and device, computer equipment and storage medium
CN115461707B (en) Video acquisition method, electronic device and storage medium
WO2024114518A1 (en) Display control method, display control apparatus, and electronic device
CN113952710A (en) Control method and device for screen-casting game, electronic equipment and storage medium
CN113763568A (en) Augmented reality display processing method, device, equipment and storage medium
JP7064155B2 (en) Computer programs and computer equipment
TWI294299B (en) Game device, image processing method for the same and information memory media
JP6198375B2 (en) Game program and game system
JP3558288B1 (en) System and method for video control by tagging objects in a game environment
JP2019187517A (en) Game program and game device
WO2023168990A1 (en) Performance recording method and apparatus in virtual scene, device, storage medium, and program product
JP7487158B2 (en) GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
CN115396685B (en) Live interaction method and device, readable storage medium and electronic equipment
EP4306192A1 (en) Information processing device, information processing terminal, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40035396

Country of ref document: HK

GR01 Patent grant