CN109165052B - Interactive processing method and device of application scene, terminal, system and storage medium - Google Patents


Publication number
CN109165052B
CN109165052B CN201810897448.6A
Authority
CN
China
Prior art keywords
scene
scene data
data
application
interactive application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810897448.6A
Other languages
Chinese (zh)
Other versions
CN109165052A (en)
Inventor
王谱
张正政
罗俊
龙振海
谢建平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810897448.6A priority Critical patent/CN109165052B/en
Publication of CN109165052A publication Critical patent/CN109165052A/en
Application granted granted Critical
Publication of CN109165052B publication Critical patent/CN109165052B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44505 Configuring for program initiating, e.g. using registry, configuration files
    • G06F9/44521 Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Stored Programmes (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

An embodiment of the invention discloses an interactive processing method for an application scene, comprising the following steps: in the process of running a target application scene of an interactive application, determining a scene data identifier for display in the target application scene from a configuration file set for the interactive application; acquiring the target scene data indicated by the scene data identifier from the scene data stored for the interactive application; determining a data type of the target scene data; and loading the target scene data into the target application scene according to the determined data type. Embodiments of the invention enable application scenes within an interactive application to be updated and improve the update efficiency of the interactive application; the user does not need to update and reinstall the interactive application itself, which saves update and installation time.

Description

Interactive processing method and device of application scene, terminal, system and storage medium
Technical Field
The present invention relates to the field of computer application technologies, and in particular, to an interactive processing method, an interactive processing device, a terminal, an interactive processing system, and a storage medium for an application scenario.
Background
With the continuous development of electronic computing, networking, and computer technology, people can access the internet anytime and anywhere through a user terminal; for example, they can read online news, socialize, and even work or handle household matters wherever they are.
Across the many intelligent applications (APPs) now available, the most important research direction at present is how to better realize human-computer interaction within the APP. Besides developing novel human-computer interfaces to attract users, how to process the data required for interaction is also a direction that needs continuous updating and improvement.
Disclosure of Invention
The embodiment of the invention provides an interactive processing method and device of an application scene, a terminal, a system and a storage medium, which can better process scene data required by interactive application.
In one aspect, an embodiment of the present invention provides an interactive processing method for an application scenario, including: in the process of running a target application scene of an interactive application, determining a scene data identifier for displaying in the target application scene from a configuration file set for the interactive application; acquiring target scene data indicated by the scene data identification from scene data stored for the interactive application; determining a data type of the target scene data; and loading the target scene data into the target application scene according to the determined data type.
On the other hand, an embodiment of the present invention further provides an interactive processing apparatus for an application scenario, including:
the processing module is used for determining a scene data identifier for displaying in a target application scene from a configuration file set for the interactive application in the process of running the target application scene of the interactive application; acquiring target scene data indicated by the scene data identification from scene data stored for the interactive application; determining a data type of the target scene data; and the display module is used for loading the target scene data into the target application scene according to the determined data type.
Correspondingly, the embodiment of the invention also provides an intelligent terminal, which comprises: a processor and a storage device; the storage device is used for storing program instructions; the processor calls the program instructions stored in the storage device and is used for executing the interactive processing method of the application scene.
In another aspect, an embodiment of the present invention further provides an interactive processing system, including: a server and a user terminal; the server is used for storing a resource package of the interactive application, and the resource package comprises: the interactive application comprises a configuration file and a directory file which are set for the interactive application and scene data configured for one or more application scenes in the interactive application; the user terminal is provided with the interactive application and is used for downloading the resource package of the interactive application from the server, analyzing the resource package, and storing the scene data of the corresponding scene data identification into the corresponding storage address according to the mapping relation between the scene data identification and the storage address recorded in the directory file.
Correspondingly, the embodiment of the invention also provides a computer storage medium, wherein the computer storage medium stores program instructions, and the program instructions realize the interactive processing method of the application scene when being executed.
The embodiment of the invention defines a new loading mode for loading the scene data in the application scene of the interactive application, can dynamically supplement various newly developed scene data to the interactive application based on the new loading mode, and developers only need to define the new scene data and define the configuration file, thereby realizing the updating of the application scene in the interactive application, improving the updating efficiency of the interactive application, and saving the updating and installing time for users without updating and installing the interactive application.
Drawings
FIG. 1 is a schematic diagram of a frame structure according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating an interactive processing method for an application scenario according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a process of processing scene data in an application scene according to an embodiment of the present invention;
FIG. 4a is a flowchart illustrating a scene processing method according to an embodiment of the present invention;
FIG. 4b is a schematic flow chart of another scene processing method according to the embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an interaction processing apparatus for an application scenario according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an interactive processing system according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an intelligent terminal according to an embodiment of the present invention.
Detailed Description
In the embodiment of the invention, the interactive application concerned comprises one or more application scenes, and these application scenes mainly realize interaction between a person and an intelligent terminal; examples include game applications and interest-based scene applications. In such interactive applications, different scene data can be presented depending on the scene phase or on user operations, so as to achieve the purpose of interaction. Some interactive applications include multiple application scenes in which, for example: a correct or incorrect animation is played when a certain area of the currently displayed application scene is clicked; the progress of an animation is controlled by copy-and-paste-like operations, by sliding a button, by scratching off prizes, or by swiping the screen; subtitles are displayed while an animation plays; an animation response is made by judging, against the correct answers of various multiple-choice questions, whether the answer given by the input voice is correct; and user prompts are given through animation, background music is played, or animation sound effects are played. More interesting interaction effects can be achieved by flexibly combining these interactions.
An implementation architecture according to an embodiment of the present invention is shown in fig. 1, and mainly includes a development device 101 at the resource development end of an interactive application, an application server 102 for the interactive application, and a user terminal 103 belonging to a user of the interactive application. The user at the resource development end may be a user who completes the development and upgrading of various interactive applications and the development and updating of their resource packages, or a user who only provides resource packages for the interactive applications. In the embodiment of the present invention, using the development device 101 and various authoring tools, a development user creates the scene data, such as animation, music, video, pictures, text, and voice, that can be displayed to the user in each corresponding scene of an interactive application, and also sets a configuration file and a directory file for the interactive application. The configuration file is used to determine the identifiers of the scene data to be displayed, and the directory file records the mapping between each scene data identifier and the local storage address of the scene data it indicates, so that, based on the configuration file and the directory file, the corresponding scene data can be found and provided to the target application scene of the interactive application.
The development device 101 packages the scene data, configuration file, and directory file required by the interactive application into a predetermined format to obtain a resource package, which is provided to user terminals on which the interactive application is installed. In one embodiment, the scene data, configuration file, and directory file may be compressed into a ZIP (a compressed archive format) package. The interactive applications can be developed with existing development tools. For example, an interactive application may contain a scratch-off application scene, in which the user can manually scratch off certain images by repeatedly sliding on the touch screen, thereby achieving human-computer interaction; as another example, the interactive application may include an application scene in which the user answers questions by voice: if the user answers correctly, one animation is shown, and if the user answers incorrectly, another animation is shown.
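The packaging step above can be sketched as follows. This is a minimal illustration only: the patent specifies that the scene data, configuration file, and directory file are compressed into a ZIP package, but the internal layout and the file names `config.json`, `directory.json`, and `scene_data/` used here are assumptions, not part of the patent.

```python
import io
import json
import zipfile

def build_resource_package(scene_files: dict, config: dict, directory: dict) -> bytes:
    """Pack scene data, the configuration file, and the directory file
    into a single ZIP archive, mirroring the resource package described above.
    (Entry names inside the archive are illustrative.)"""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("config.json", json.dumps(config))
        zf.writestr("directory.json", json.dumps(directory))
        for name, data in scene_files.items():
            zf.writestr("scene_data/" + name, data)
    return buf.getvalue()

package = build_resource_package(
    {"aaaa.mp3": b"...audio bytes...", "bbbb.mov": b"...video bytes..."},
    {"sceneId": "chapter1_section1"},
    {"aaaa.mp3": "SD card/interactive application/scene data/aaaa.mp3"},
)
```

The user terminal can later open the same archive, read the directory file, and store each decompressed piece of scene data at the address the directory records for it.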
The application server 102 is mainly used for providing an application service for the user terminal 103 installed with the interactive application, and can perform security detection, verification and other related processing on application data such as the resource package and other related processing of the interactive application. In this embodiment of the present invention, the application server 102 may store the resource package provided by the development device 101, and store the resource package received this time according to the version number. The version number is obtained according to a preset version naming rule, and can be specified by a development user or generated by a server. In a simple embodiment, the version number of the resource package submitted this time may be determined based on the time when the development device 101 submits the resource package, so as to ensure the uniqueness of the resource package version submitted this time. After receiving and storing the resource package, the application server 102 may actively push an update message to a plurality of user terminals 103 installed with the interactive application, so that the user terminals 103 may download and store the resource package. In one embodiment, the application server 102 may send the latest resource package or one or more resource packages requested to be downloaded by the user terminal 103 in the download request to the user terminal 103 upon receiving the download request of the user terminal 103. In the embodiment of the present invention, the resource package does not change the main program of the interactive application.
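The simple versioning scheme mentioned above, in which the version number of a submitted resource package is derived from its submission time to guarantee uniqueness, could look like the following sketch (the `v` prefix and timestamp format are assumptions for illustration):

```python
from datetime import datetime, timezone

def version_from_submission(ts: datetime) -> str:
    # Derive a version string from the submission time; because two
    # submissions cannot share the same timestamp at this granularity,
    # the version number is unique per submission.
    return "v" + ts.strftime("%Y%m%d%H%M%S")

v = version_from_submission(datetime(2018, 8, 8, 12, 30, 5, tzinfo=timezone.utc))
```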
The user terminal 103 on which the interactive application is installed may be an intelligent device such as a smartphone, a tablet computer, or a personal computer. When the user terminal 103 runs the interactive application, the relevant scene data is obtained based on a framework structure defined for the interactive application, and the scene data is presented to the user in each corresponding scene. After the user terminal 103 acquires the above-mentioned resource package, the package may be parsed by a scene processor (StoryProcessor, which may be implemented by a processor as one of the processing functional units defined in the embodiments of the present invention) at the core of the framework structure, and corresponding processing is performed; according to the data type of the scene data included in the resource package, the corresponding scene data is displayed. In one embodiment, display processing of content such as playback of background music, video, or animation and various game prompts may be performed. In the embodiment of the invention, the StoryProcessor, as a core class of the framework, handles animation playback, scene switching, the scheduling of different operations in various scenes, playback of background music, playback of sound effects, and so on. It allocates these capabilities to the various control components for processing, thereby decoupling the data. An embodiment of the interactive processing method for an application scene according to the embodiment of the present invention is described below with reference to fig. 2, from the perspective of the user terminal 103 and on the basis of fig. 1.
After the interactive application is installed, the user terminal 103 may run it as needed. As for the above-mentioned resource package, the user terminal 103 may actively pull it from the server for subsequent use when running the interactive application. Of course, in other embodiments, the user may first download the resource package through another device and then forward it to the user terminal 103. In S201, in the process of running a target application scene of the interactive application, a scene data identifier for display in the target application scene is determined from a configuration file set for the interactive application. The target application scene is one of the scenes of the interactive application, for example the startup scene shown when the interactive application is launched, or the scene of a certain chapter reached during play (for example, the scratch-off scene in the first section of the first chapter of a game). While the interactive application runs, the scene processor checks the configuration file in real time to determine the scene data identifier of the scene data that currently needs to be displayed in the target application scene. In one embodiment, when the application program starts running, receives a user interaction (for example, a sliding operation in a certain area of a scratch-off scene), or reaches a special scene node (for example, an aircraft flies to a certain position), the scene processor determines, from the configuration file, the scene data identifier of the scene data to be presented to the user, so as to obtain that scene data.
After the scene data identifier is determined, in S202 the target scene data indicated by the identifier is obtained from the scene data stored for the interactive application. In one embodiment, the scene data identifier in the configuration file may itself be a storage address, and the corresponding target scene data can be found directly from that address. In another embodiment, the resource package may further include a directory file recording the scene data identifiers and their corresponding storage addresses, in which case S202 may specifically include: searching the directory file set for the application scene for the storage address corresponding to the scene data identifier; and acquiring, from the scene data stored for the interactive application, the target scene data stored at the found address. As shown in table 1 below, the directory file may be preset when the resource package is issued; after the user terminal 103 acquires the resource package, it stores each piece of scene data decompressed from the package at the storage address indicated for that scene data identifier in the directory file.
Table 1:

Scene data identifier | Storage address
aaaa.mp3              | SD card/interactive application/scene data/aaaa.mp3
bbbb.mov              | SD card/interactive application/scene data/bbbb.mov
……                    | ……
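Step S202 with a directory file, as just described, can be sketched as a plain lookup. The directory file and storage are modeled here as in-memory dictionaries purely for illustration; on a real terminal the addresses would be file-system paths.

```python
def load_target_scene_data(scene_data_id: str, directory: dict, storage: dict) -> bytes:
    """S202 as a sketch: resolve the scene data identifier to a storage
    address via the directory file, then fetch the scene data stored there."""
    address = directory.get(scene_data_id)
    if address is None:
        raise KeyError(f"no storage address recorded for {scene_data_id!r}")
    return storage[address]

# Directory file contents mirroring table 1 above.
directory_file = {
    "aaaa.mp3": "SD card/interactive application/scene data/aaaa.mp3",
    "bbbb.mov": "SD card/interactive application/scene data/bbbb.mov",
}
storage = {"SD card/interactive application/scene data/aaaa.mp3": b"<mp3 bytes>"}
data = load_target_scene_data("aaaa.mp3", directory_file, storage)
```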
After the target scene data is acquired, the data type of the target scene data is determined in S203. The data types are mainly used for distinguishing execution modes of the target scene data, and the scene processor checks the configuration file to acquire the target scene data through the steps involved above, and also determines the data types of the target scene data based on the configuration file so as to present the target scene data for a user by adopting different execution modes based on different data types.
The data type of the scene data may be classified based on the component used to present it, or the scene data may be classified directly based on the application scenes included in the interactive application and their corresponding presentation components. For example, if the interactive application includes a scratch-off scene and a scratch-off-specific playback component for it, the type of the scene data used in the scratch-off scene may be classified as a scratch-off type (for example, the erase type described below). If the interactive application further includes a scene answered by the user's voice input and a corresponding subtitle display component (or a playback component for the question animation, etc.), then the type of the scene data for that scene is configured as an answer type. A different data type can be set for each piece of scene data to ensure that scene data is not invoked by mistake.
The data type of each piece of scene data may be set in the configuration file at development time. In one embodiment, the configuration file includes the data types of the scene data, and determining the data type of the target scene data includes: determining the data type of the target scene data from the configuration file.
In one embodiment, the data types in the configuration file include a playback type, which covers the playback of audio data and/or video data; the audio data includes foreground music scene data and the like, and the video data includes foreground animation scene data and the like. The playback type may be any one or more of: a click-play type, a scroll-play type, a button-drag play type, and an erase type, each applicable to audio data and/or video data. In one embodiment, the data types may further include a background playback type and a foreground sound-effect playback type, where the background playback type includes a background music playback type and/or a background animation playback type.
In one embodiment, the playback types of foreground animation scene data and/or foreground music scene data configured in the configuration file may be as follows. In the scene structure described below, only the parts required by the present application are described; in actual use, the configuration file may serve further purposes, such as indicating whether the foreground animation loops or whether the background music loops. A certain scene is represented in the configuration file as follows:
[The scene structure of the configuration file is shown in the original patent as images (Figures BDA0001758548810000061 and BDA0001758548810000071); its fields are described line by line below.]
Line 0 above indicates the scene ID of the application scene associated with this scene structure, for example the scene ID of the scratch-off scene in the interactive application; the background animation, foreground animation, and so on all take the application scene corresponding to this scene ID as the target application scene, and the scene data mentioned in this part of the configuration file is the target scene data of that target application scene.
Line 1 above describes the foreground animation file name, which identifies a foreground animation stored on the user terminal 103, i.e. it is a scene data identifier as described above. Line 2 describes the background animation file name, which is the scene data identifier of the background animation scene data. Line 7 describes the BGM (Background Music) name, which serves as the scene data identifier of a piece of background music scene data.
In the embodiment of the present invention, the configuration file mainly defines the playback type of foreground animation scene data and/or foreground music scene data; it does not define a playback type for the background animation or background music, because background content is simply played directly and no playback type needs to be specified. The data type of the foreground animation scene data is therefore given in line 8: a value of 0 means the foreground animation corresponding to the foreground animation file name responds to no event and is played directly; a value of 1 means it is of the click-play type; a value of 2 means the scroll-play type; a value of 3 means the button-drag play type; and a value of 4 means the erase type. The background animation corresponding to the background animation file name and the background music corresponding to the background music name are played directly as required.
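The scene structure and the value-to-type mapping just described can be sketched as below. The JSON-like rendering is a hypothetical reconstruction: the field names sceneId, lottieFileNameForg, lottieFileNameBg, bgmName, and pancelType and the meanings of values 0 through 4 come from this description, but the patent's actual configuration syntax is not reproduced in this text.

```python
# Hypothetical rendering of the struct Scene fields described above.
scene_config = {
    "sceneId": "scratch_chapter1",
    "lottieFileNameForg": "scratch_foreground.json",  # line 1: foreground animation
    "lottieFileNameBg": "scratch_background.json",    # line 2: background animation
    "bgmName": "scratch_bgm.mp3",                     # line 7: background music (BGM)
    "pancelType": 4,                                  # line 8: foreground play type
}

# Line-8 value -> foreground playback type, per the description above.
PANCEL_TYPES = {
    0: "direct play (no event)",
    1: "click play",
    2: "scroll play",
    3: "button-drag play",
    4: "erase (scratch-off)",
}

def foreground_play_mode(config: dict) -> str:
    return PANCEL_TYPES[config["pancelType"]]

mode = foreground_play_mode(scene_config)
```

No playback type is stored for lottieFileNameBg or bgmName; as the text notes, background content is always played directly.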
Take an interactive application that includes a scratch-off application scene as an example; the configuration file corresponds to the scratch-off scene through the representation above. After the user enters the scratch-off scene in the interactive application, the structure content for the corresponding sceneId, namely the struct Scene mentioned above, can be found in the configuration file based on the scratch-off scene identifier sceneId. At this point, the struct Scene corresponding to the scratch-off scene identifier sceneId in the configuration file is executed. The execution process comprises the following steps:
First, the background animation scene data identifier lottieFileNameBg and the background music scene data identifier bgmName are determined. Based on these two identifiers, the corresponding storage addresses can be found in the directory file and the corresponding data retrieved; the background animation player is then called directly to play the background animation scene data found, and the background music player is called to play the background music scene data found. The background animation playback type may be determined from the specific name of the background animation scene data identifier lottieFileNameBg or from its position (line number) in the configuration file; similarly, the background music playback type may be determined from the specific name of the background music scene data identifier bgmName or from its position (line number) in the configuration file. The corresponding playback type therefore does not need to be specified in the configuration file.
Second, based on the foreground animation scene data identifier lottieFileNameForg in the configuration file, the corresponding foreground animation scene data is found via the directory file, and at the same time the playback type of the foreground animation scene data is determined from the pancelType field, which defines the data type of the foreground animation in the configuration file. If its value is 4, the scratch-off playback component can play the foreground animation scene data found using the processing mode corresponding to the erase type: for example, as the user performs an erasing operation on the display screen, the scratch-off playback component plays part of the animation to the user.
In other embodiments, the configuration file may further include other scene processing settings, such as the switching type of page scene data, used to control the display of page data in a page scene. The switching type includes any one or more of: a gradual page-display type, a swipe-left/right page-turning type, and a swipe-up/down page-turning type. The part of the configuration file pertaining to a page in an embodiment of the present invention is described below.
[The page portion of the configuration file is shown in the original patent as images (Figures BDA0001758548810000081 and BDA0001758548810000091); its fields are described line by line below.]
Line 1 is the name of the page's background music; a playback path is obtained from the directory file (ResManager) by this name and handed to the BGMPlayer (background music player) for playback. Line 2 is the switching type of the page scene data: a value of 0 means the default switching type is used, the default being a gradual-display type such as a fade-in/fade-out effect; a value of 1 means the page is turned by swiping left or right; a value of 2 means the page is turned by swiping up or down. Line 3 is the resource name of the page background picture, from which a file path can be obtained via the directory file (ResManager). Line 4 is the page background animation, and line 5 is the scene list of the current page.
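The line-2 switching-type values just listed map onto page transitions as in this sketch. The key name `switchType` is an assumption for illustration; only the value meanings (0 default fade, 1 horizontal swipe, 2 vertical swipe) come from the text above.

```python
# Line-2 value -> page switching behavior, per the description above.
PAGE_SWITCH_TYPES = {
    0: "default (fade-in/fade-out)",
    1: "swipe left/right to turn page",
    2: "swipe up/down to turn page",
}

def page_switch_mode(page_config: dict) -> str:
    # A missing or zero value falls back to the default gradual-display
    # (fade) transition, as described for value 0 above.
    return PAGE_SWITCH_TYPES[page_config.get("switchType", 0)]

mode = page_switch_mode({"bgmName": "page_bgm.mp3", "switchType": 1})
```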
In other embodiments, information about the data type may instead be embedded in the scene data itself, for example in the naming rule of the scene data. In one embodiment, in a scene named "scratch", if a certain piece of scene data is of the erase type among the playback types of foreground animation scene data, indicating that the scene data corresponds to the scratch-off animation, the animation may be named "scratch scene animation". When the target application scene is the scratch-off scene, the animation data (the target scene data) is found at the corresponding storage address based on the scene data name (i.e. the scene data identifier); it can then be determined from the name of the animation data, namely "scratch scene animation", that it is of the erase type, and the scratch-off playback component is called to play the animation, so that the corresponding animation is played in the current scratch-off scene and presented to the user.
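The naming-rule variant just described can be sketched as follows. The matching rule here (substring tests on the name) is an illustrative convention, not the patent's exact rule; only the idea that the type is inferred from the scene data name comes from the text.

```python
def infer_play_type_from_name(scene_data_name: str) -> str:
    """Infer the foreground playback type from a naming convention embedded
    in the scene data name (illustrative convention only)."""
    if "scratch" in scene_data_name:
        return "erase"       # scratch-off animation -> erase type
    if "click" in scene_data_name:
        return "click play"
    return "direct play"

t = infer_play_type_from_name("scratch scene animation")
```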
After the data type of the target scene data is determined, in S204 the target scene data is loaded into the target application scene according to the determined data type, so as to be presented to the user in that scene. Different data types correspond to different control components. In S204, the control component in the interactive application that corresponds to the data type is triggered to load the target scene data, so as to present it to the user in the application scene through presentation modes such as animation playback, sound-effect playback, and the display of subtitles or prompts.
Next, the data types mentioned above are described in more detail. As shown in fig. 3, when the data type of the target scene data is the background music playing type, the found target scene data may be played by the background music player (BGMPlayer, the background music playing component involved in the embodiment of the present invention). In addition, after it is determined that the background music is shared with a certain page in the target scene, the target scene data may be distributed to the BGMPlayer for playing, thereby completing the playing of the background music.
If it is determined, based on the configuration file, that a foreground sound effect needs to be played in the current music scene, the data type of the corresponding target scene data is the foreground sound effect playing type, and the found target scene data is played directly through the foreground music playing component rather than in the form of background music. Naturally, while foreground music is being played, background music is not played.
If the data type of the target scene data is the erase type among the playing types of foreground animation scene data, the current target application scene can be considered an application scene such as a scratch card, and the animation playing of the found target scene data can be controlled through a customized scratch control so as to display the target scene data. As shown in fig. 3, a scene controller ScratchCtrl may be provided in the scratch scene to control the state of the scratch scene, the playback of the foreground animation scene data, and the like.
If the data type of the target scene data is the click playing type among the playing types of foreground animation scene data, the current target application scene can be considered one in which the user clicks a designated area, the clicked area is judged correct or incorrect, and different animations are played accordingly. If the area clicked by the user is an incorrect area, the foreground animation playing component is called to play the animation in the found target scene data that corresponds to the incorrect click.
If the data type of the target scene data is the scroll playing type among the playing types of foreground animation scene data, the found target scene data can be played through an animation player with a progress control: for example, the playing progress of the animation data in the found target scene data is controlled by sliding a progress bar on the animation player, or by sliding directly on the touch screen.
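The scroll playing type above amounts to mapping a slider (or touch-drag) position to a frame of the animation. A minimal sketch of that mapping, with illustrative names and clamping behavior assumed:

```python
def frame_for_slider(total_frames, slider_pos, slider_length):
    """Map a slider position in [0, slider_length] to an animation frame index.

    Positions outside the slider range are clamped, so dragging past either
    end holds the first or last frame.
    """
    if slider_length <= 0:
        return 0
    ratio = max(0.0, min(1.0, slider_pos / slider_length))
    return min(total_frames - 1, int(ratio * total_frames))
```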
If the data type of the target scene data is the button drag playing type among the playing types of foreground animation scene data, the playing progress of the found animation data of the target scene data can be controlled through an animation player with an auxiliary button.
If the data type of the target scene data is the subtitle type, the subtitle is displayed directly, or displayed along with the playing of the related target scene data about music, so that the subtitle appears in the target music scene. In addition, when checking the configuration file determines that target scene data related to a prompt exists in the current scene, the corresponding target scene data is determined as prompt-type scene data, and the prompt information corresponding to the found target scene data is displayed in the current scene through the prompt component.
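The data-type handling described across this section is essentially a dispatch from data type to control component. The sketch below stubs each component as a function that reports what it would present; every component and type name here is an illustrative assumption, not the patent's literal API.

```python
# Stubbed control components: each returns which component handles the data.
def play_bgm(d): return ("BGMPlayer", d)
def play_foreground_sound(d): return ("ForegroundMusicPlayer", d)
def play_scratch(d): return ("ScratchCtrl", d)
def play_click(d): return ("ClickAnimationPlayer", d)
def play_scroll(d): return ("ScrollAnimationPlayer", d)
def play_drag(d): return ("ButtonDragAnimationPlayer", d)
def show_subtitle(d): return ("SubtitleComponent", d)
def show_prompt(d): return ("PromptComponent", d)

COMPONENTS = {
    "background_music": play_bgm,
    "foreground_sound": play_foreground_sound,
    "erase": play_scratch,
    "click_play": play_click,
    "scroll_play": play_scroll,
    "button_drag_play": play_drag,
    "subtitle": show_subtitle,
    "prompt": show_prompt,
}

def load_into_scene(data_type, target_scene_data):
    """Trigger the control component that matches the determined data type."""
    return COMPONENTS[data_type](target_scene_data)
```

With this table, loading target scene data of the erase type routes it to the scratch control, while background music data goes to the background music player.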
In the embodiment of the present invention, other target application scenes may also exist, for example a voice answer application scene that judges whether a voice input is correct: the user's voice input is received, whether the user's answer is correct is judged, and scene data is displayed according to the result of the user's voice answer in the voice answer application scene, i.e., the target application scene. Fig. 4a shows the complete process of scene processing for judging errors in voice input, which may be performed by a scene processor in the user terminal. For this voice-input-based target application scene, the resource package stores error scene data corresponding to an incorrect answer, and this error scene data is stored in the memory of the user terminal. When the running interactive application enters the target application scene, the user terminal first receives the user's voice input in S4011; in S4012 the user terminal recognizes the voice, judges whether the voice input is correct, and determines the recognition result. If the result is "error", the user terminal further judges in S4013 whether scene data corresponding to the error result exists; if not, it may stay in the current target application scene and continue to wait for the user's voice data.
If scene data corresponding to the error exists, the user terminal triggers entry into the error scene corresponding to the error recognition result. In S4014, the user terminal detects, through the configuration file, the scene data identifier corresponding to the error scene, and searches the memory for the target scene data of the error scene based on the directory file. The target scene data may be, for example, foreground animation scene data whose type is configured as 0 in the configuration file, i.e., a direct-play type that does not respond to any event; the found target scene data is then played directly, which constitutes entering the error scene, and after the playing is completed the application returns to the voice-input-based target application scene. If the judgment result in S4012 is that the user's answer is correct, the user terminal controls entry into the next scene in S4015. In addition, background music and/or background animation can be added to the voice answer application scene: the background animation player is called directly to play the found background animation scene data, and the background music player is called to play the found background music scene data.
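The branching of fig. 4a (S4012–S4015) can be condensed into a small decision function. Recognition, data lookup, and playback are stubbed here; the return values are illustrative labels, not the patent's terminology.

```python
def handle_voice_answer(recognized_correct, error_scene_data, played):
    """Decide the next action in the voice-answer scene (fig. 4a sketch).

    recognized_correct: result of speech recognition (S4012).
    error_scene_data:   the error scene's target scene data, or None if absent (S4013).
    played:             list collecting data handed to the direct-play component (S4014).
    """
    if recognized_correct:
        return "enter_next_scene"       # S4015: answer correct, move on
    if error_scene_data is None:
        return "stay_and_wait"          # no error scene data: keep waiting for voice
    played.append(error_scene_data)     # play the error animation directly (type 0)
    return "return_to_voice_scene"      # after playback, return to the voice scene
```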
In the embodiment of the present invention, another target application scene may also be used, for example a clip doll (claw machine) application scene. In this scene, different scene processing may be performed based on whether the user's click position is correct. Fig. 4b shows the complete process of clip doll scene processing based on the click position, which may be executed by a scene processor in the user terminal. For the clip doll scene, error scene data corresponding to an incorrect position click is stored in the resource package, and this error scene data is stored in the memory of the user terminal. When the running interactive application enters the clip doll application scene, a scene interface of the clip doll is displayed first; in S4021 the user's click operation is received and the click position in the scene interface is determined. In S4022 it is judged whether the clicked position is the correct position. If so, the next application scene may be entered in S4023. If not, the scene data identifier recorded for the clip doll application scene (for example, the identifier recorded in the struct Scene described above) is detected in the configuration file in S4024, the target scene data corresponding to that identifier is searched for in memory based on the directory file, and the found target scene data is played in S4025 to enter the error scene. The target scene data may be, for example, foreground animation scene data whose type is configured in the configuration file as a direct-play type (for example, the value 0, i.e., direct play without responding to any event), in which case the found target scene data is played directly; the type may instead be configured as 1, a click-to-play type that is played only after the user clicks. After the playing is finished, the application returns to the current clip doll application scene interface and continues to receive the user's click operations. In addition, background music and/or background animation can be added to the clip doll application scene: the background animation player is called directly to play the found background animation scene data, and the background music player is called to play the found background music scene data.
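The click-position branching of fig. 4b (S4022–S4025) can likewise be sketched. The rectangular "correct region" test and the stubbed playback are simplifying assumptions for illustration.

```python
def handle_claw_click(click_pos, correct_region, error_scene_data, played):
    """Decide the next action in the clip doll scene (fig. 4b sketch).

    click_pos:      (x, y) of the user's click (S4021).
    correct_region: (x0, y0, x1, y1) rectangle counted as the correct position.
    played:         list collecting data handed to the playback component (S4025).
    """
    x, y = click_pos
    x0, y0, x1, y1 = correct_region
    if x0 <= x <= x1 and y0 <= y <= y1:
        return "enter_next_scene"      # S4023: correct position, move on
    played.append(error_scene_data)    # S4024/S4025: play the error scene data
    return "return_to_claw_scene"      # after playback, return to the interface
```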
The user may store the resource package corresponding to the interactive application into the user terminal 103 himself, or download it from the application server 102. The resource package includes the configuration file and directory file set for the interactive application, and the scene data configured for one or more application scenes of the interactive application, as mentioned above. After downloading the resource package, the user terminal 103 parses it and stores the scene data of each scene data identifier at the corresponding storage address according to the mapping between scene data identifiers and storage addresses recorded in the directory file. Generally, both the installation of the interactive application and the storage of its data may specify an installation location; for example, as shown in table 1, all data of a certain interactive application may be stored in a folder corresponding to that application, i.e., "interactive application/", and all scene data may be stored in a new subfolder of that folder. The storage address of each piece of scene data can therefore be specified in the directory file in advance, and after receiving and decompressing the resource package, the user terminal 103 can store the file indicated by each scene data identifier (foreground animation file name, background animation file name) at the corresponding storage address according to the addresses specified in the directory file, which facilitates subsequently searching for and acquiring the scene data.
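The unpacking step above — placing each piece of scene data at the address the directory file assigns to its identifier — can be sketched with file I/O replaced by dictionaries. The path layout is the assumed "interactive_app/" folder convention from table 1, used purely for illustration.

```python
def store_scene_data(resource_entries, directory_mapping):
    """Place each scene data item at the address assigned by the directory file.

    resource_entries:  {scene_data_id: data} decompressed from the resource package.
    directory_mapping: {scene_data_id: storage_address} from the directory file.
    Returns the resulting {storage_address: data} store.
    """
    storage = {}
    for data_id, data in resource_entries.items():
        address = directory_mapping[data_id]  # e.g. "interactive_app/scenes/<id>"
        storage[address] = data
    return storage
```

Later lookups then go the other way: the configuration file yields a scene data identifier, the directory file yields its storage address, and the data is read from that address.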
In one embodiment, when the interactive application is started by the user and begins to run, if the user terminal 103 receives a running trigger event of the interactive application, it acquires the local version identifier corresponding to the scene data stored for the interactive application, acquires the updated version identifier of the resource package of the interactive application from the server, and, if the local version identifier is inconsistent with the updated version identifier, triggers the step of downloading the resource package of the interactive application from the server. That is, each time the user opens the interactive application it may be detected whether the resource package stored in the user terminal 103 is the latest one; if so, the subsequent interactive processing of the application scene may proceed directly, and if not, the latest resource package needs to be downloaded from the application server 102 so that the interactive application runs properly. In another embodiment, during the running of the interactive application, when it is detected that target scene data needs to be acquired for the first time, the local version identifier corresponding to the scene data stored for the interactive application is acquired, and the updated version identifier of the resource package of the interactive application is acquired from the server; if the local version identifier is consistent with the updated version identifier, the target scene data indicated by the scene data identifier is acquired from the scene data stored for the interactive application.
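The startup version check above can be sketched as follows, with the server query and the download stubbed as callables; the version identifiers are treated as opaque strings, an assumption consistent with the description.

```python
def sync_resource_package(local_version_id, fetch_server_version, download):
    """Download the resource package only when the version identifiers differ.

    fetch_server_version: callable returning the server's updated version identifier.
    download:             callable performing the resource package download.
    Returns the local version identifier after synchronization.
    """
    server_version_id = fetch_server_version()
    if local_version_id != server_version_id:
        download()                 # stale local package: fetch the latest one
        return server_version_id   # local version now matches the server
    return local_version_id        # already up to date; proceed directly
```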
The embodiment of the invention defines a complete set of interactive processing modes, from the development of new scene data by developers to the loading of that scene data in an application scene of the interactive application. Based on these processing modes, developers can dynamically supplement the interactive application with newly developed scene data of various kinds, thereby updating the application scenes in the interactive application and improving its updating efficiency, while users no longer need to update and reinstall the interactive application, saving updating and installation time.
Referring to fig. 5, a schematic structural diagram of an interaction processing apparatus for an application scenario according to an embodiment of the present invention is mainly used in an intelligent terminal such as a smart phone, a tablet computer, a smart wearable device, and a personal computer, and the apparatus mainly includes the following modules.
A processing module 501, configured to determine, from a configuration file set for an interactive application in a process of running a target application scene of the interactive application, a scene data identifier for displaying in the target application scene; acquiring target scene data indicated by the scene data identification from scene data stored for the interactive application; determining a data type of the target scene data; a display module 502, configured to load the target scene data into the target application scene according to the determined data type.
In one embodiment, a directory file is further configured for the interactive application, on which scene data identifiers and corresponding storage addresses are recorded. When acquiring the target scene data indicated by the scene data identifier from the scene data stored for the interactive application, the processing module 501 is configured to search for the storage address corresponding to the scene data identifier in the directory file set for the application scene, and to acquire, from the scene data stored for the interactive application, the target scene data stored at the found storage address.
In one embodiment, the configuration file includes: the scene data identifier and the data type corresponding to the scene data indicated by the scene data identifier, and the processing module 501, when configured to determine the data type of the target scene data, is configured to determine the data type of the target scene data from the configuration file.
In one embodiment, the data types in the configuration file include: the playing type of the foreground animation scene data and/or the foreground music scene data and the switching type of the page scene data; the play types include: any one or more of a click play type, a scroll play type, a button drag play type, and an erase type. In addition, the configuration file may further specify a switching type of the page scene data, the switching type including: the page gradually-changing display type, the left-right sliding page-turning type and the up-down sliding page-turning type are any one or more of.
In an embodiment, when loading the target scene data into the target application scene according to the determined data type, the display module 502 is configured to trigger the control component corresponding to the data type in the interactive application to execute the target scene data, so as to present the target scene data in the application scene.
In one embodiment, the processing module 501 is further configured to download a resource package of the interactive application from a server, where the resource package includes: the interactive application comprises a configuration file and a directory file which are set for the interactive application and scene data configured for one or more application scenes in the interactive application; and analyzing the resource package, and storing the scene data of the corresponding scene data identifier into the corresponding storage address according to the mapping relation between the scene data identifier and the storage address recorded in the directory file.
In an embodiment, the processing module 501 is further configured to, if an operation triggering event of the interactive application is received, obtain a local version identifier corresponding to scene data stored for the interactive application; acquiring an updated version identifier of the resource package of the interactive application from a server; and if the local version identification is inconsistent with the updated version identification, triggering the slave server to download the resource package of the interactive application.
The embodiment of the invention can enable developers to dynamically supplement various newly developed scene data to the interactive application, can realize the updating of the application scene in the interactive application, improves the updating efficiency of the interactive application, does not need to update and install the interactive application for users, and saves the updating and installing time.
Referring to fig. 6, a schematic structural diagram of an interactive processing system according to an embodiment of the present invention is shown, where the system may correspond to the system shown in fig. 1. In the embodiment of the invention, the system comprises a server and a user terminal. The server may be, for example, an application server of an interactive application, such as an application server of an interactive application including a scratch scene and a voice question and answer scene. The user terminal may be, for example, the above-mentioned smart phone, tablet computer, smart wearable device, personal computer, and the like.
The server is used for storing a resource package of the interactive application, and the resource package comprises: the interactive application comprises a configuration file and a directory file which are set for the interactive application and scene data configured for one or more application scenes in the interactive application. The resource package may be developed by the development user through the development device mentioned in the above embodiment. The detailed description of the resource package can refer to the description of the foregoing embodiments.
The user terminal is provided with the interactive application and is used for downloading the resource package of the interactive application from the server, analyzing the resource package, and storing the scene data of the corresponding scene data identification into the corresponding storage address according to the mapping relation between the scene data identification and the storage address recorded in the directory file. After receiving and storing the corresponding configuration file, directory file, and scene data, the user terminal may perform subsequent interactive processing of the application scene, and the specific execution process may refer to the description about the relevant content of the user terminal in the foregoing embodiment.
Referring to fig. 7, a schematic structural diagram of an intelligent terminal according to an embodiment of the present invention is shown, where the intelligent terminal according to an embodiment of the present invention may refer to the above-mentioned smart phone, tablet computer, smart wearable device, personal computer, and the like. The intelligent terminal corresponds to the user terminal. In the embodiment of the present invention, the intelligent terminal includes a processor 701 and a storage device 702, and as shown in fig. 6, the intelligent terminal may further include a power module for providing power, a user interface 703, a network interface 704, and the like.
The user interface 703 may include a touch screen, a physical key, a microphone for voice input, and the like as needed, and the intelligent terminal may receive input operations such as touch input, key input, voice input, and the like of a user on the one hand, and may present each application scene of the interactive application and corresponding scene data to the user on the other hand, based on the user interface 703.
The network interface 704 is mainly used for establishing a connection with a network device such as a server, downloading application data, including the resource package of an installed interactive application, from the server 601, and acquiring the application services provided by the server 601. The network interface 704 includes a wired network interface and/or a wireless network interface.
The storage 702 may include a volatile memory (volatile memory), such as a random-access memory (RAM); the storage device 702 may also include a non-volatile memory (non-volatile memory), such as a flash memory (flash memory), a solid-state drive (SSD), or the like; the storage 702 may also comprise a combination of memories of the kind described above. The storage device 702 stores data such as an operating system of the user terminal, a network interface module for driving a network interface, and a user interface module for driving a user interface.
The processor 701 may be a Central Processing Unit (CPU). The processor 701 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or the like. The aforementioned PLD may refer to a field-programmable gate array (FPGA), a General Array Logic (GAL), and the like.
Optionally, the storage 702 is also used to store program instructions. The processor 701 may call the program instructions to implement various processes as mentioned in the above embodiments of the present application.
In one embodiment, the processor 701 invokes a program instruction stored in the storage 702, so as to determine, from a configuration file set for an interactive application, a scene data identifier for displaying in a target application scene of the interactive application in a process of running the target application scene; acquiring target scene data indicated by the scene data identification from scene data stored for the interactive application; determining a data type of the target scene data; and loading the target scene data into the target application scene according to the determined data type.
In one embodiment, a directory file is further configured for the interactive application, on which scene data identifiers and corresponding storage addresses are recorded. When acquiring the target scene data indicated by the scene data identifier from the scene data stored for the interactive application, the processor 701 is configured to search for the storage address corresponding to the scene data identifier in the directory file set for the application scene, and to acquire, from the scene data stored for the interactive application, the target scene data stored at the found storage address.
In one embodiment, the configuration file includes: the scene data identifier and a data type corresponding to the scene data indicated by the scene data identifier, and the processor 701 is configured to determine the data type of the target scene data from the configuration file when determining the data type of the target scene data.
In one embodiment, the data types in the configuration file include: the playing type of the foreground animation scene data and/or the foreground music scene data; the playing types include any one or more of a click playing type, a scroll playing type, a button drag playing type, and an erase type. In addition, a switching type of the page scene data can be further specified in the configuration file, the switching type including any one or more of a page gradual-display type, a left-right sliding page-turning type, and an up-down sliding page-turning type.
In an embodiment, the processor 701, in the step of loading the target scene data to the target application scene according to the determined data type, is configured to trigger a control component corresponding to the data type in the interactive application to display the target scene data, so as to display the target scene data in the application scene.
In one embodiment, the processor 701 is further configured to download a resource package of the interactive application from a server, where the resource package includes: the interactive application comprises a configuration file and a directory file which are set for the interactive application and scene data configured for one or more application scenes in the interactive application; and analyzing the resource package, and storing the scene data of the corresponding scene data identifier into the corresponding storage address according to the mapping relation between the scene data identifier and the storage address recorded in the directory file.
In an embodiment, the processor 701 is further configured to, if an operation trigger event of the interactive application is received, obtain a local version identifier corresponding to scene data stored for the interactive application; acquiring an updated version identifier of the resource package of the interactive application from a server; and if the local version identification is not consistent with the updated version identification, executing the resource package for downloading the interactive application from the server.
The specific implementation of the processor 701 of the intelligent terminal according to the embodiment of the present invention may refer to the description of the related content in the foregoing embodiments.
The embodiment of the invention defines a set of complete interactive processing modes from the development of new scene data by developers to the loading of the scene data in the application scene of the interactive application, and the developers can dynamically supplement various newly developed scene data to the interactive application based on the new processing modes, thereby realizing the updating of the application scene in the interactive application, improving the updating efficiency of the interactive application, avoiding the need of updating and installing the interactive application for users, and saving the time of updating and installing.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
While the invention has been described with reference to a number of embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An interactive processing method for an application scene is characterized by comprising the following steps:
in the process of running a target application scene of an interactive application, determining a scene data identifier for displaying in the target application scene from a configuration file set for the interactive application through a scene processor;
acquiring target scene data indicated by the scene data identification from scene data stored for the interactive application through a scene processor, and determining the data type of the target scene data according to the scene data identification, wherein the scene data identification is a scene data name, and the naming rule of the scene data name comprises adding information about the data type into the name;
calling a control component corresponding to the determined data type through a scene processor to execute the target scene data so as to load the target scene data into the target application scene;
the data type of the scene data is determined by classifying the scene data based on the application scene included in the interactive application and the corresponding control component for presenting the scene data.
2. The method of claim 1, wherein a directory file is further configured for the interactive application, and the directory file records scene data identifiers and corresponding storage addresses;
the obtaining target scene data indicated by the scene data identification from the scene data stored for the interactive application includes:
searching a storage address corresponding to the data identifier in a directory file set for the application scene;
and acquiring target scene data stored under the searched storage address from the scene data stored for the interactive application.
3. The method of claim 1, wherein the configuration file comprises: the scene data identifier and a data type corresponding to the scene data indicated by the scene data identifier; and the determining the data type of the target scene data comprises: determining the data type of the target scene data from the configuration file.
4. The method of claim 3, wherein the data types in the configuration file comprise: the playing type of the foreground animation scene data and/or the foreground music scene data; the play types include: any one or more of a click play type, a scroll play type, a button drag play type, and an erase type.
5. The method of claim 1, further comprising:
downloading a resource package of the interactive application from a server, wherein the resource package comprises: the interactive application comprises a configuration file and a directory file which are set for the interactive application and scene data configured for one or more application scenes in the interactive application;
and analyzing the resource package, and storing the scene data of the corresponding scene data identifier into the corresponding storage address according to the mapping relation between the scene data identifier and the storage address recorded in the directory file.
6. The method of claim 5, further comprising:
if receiving an operation triggering event of the interactive application, acquiring a local version identifier corresponding to scene data stored for the interactive application;
acquiring an updated version identifier of the resource package of the interactive application from a server;
and if the local version identification is inconsistent with the updated version identification, triggering and executing the resource package for downloading the interactive application from the server.
7. An interaction processing apparatus for an application scenario, comprising:
the system comprises a processing module, a scene processor and a display module, wherein the processing module is used for determining a scene data identifier for displaying in a target application scene from a configuration file set for an interactive application through the scene processor in the process of running the target application scene of the interactive application; acquiring target scene data indicated by the scene data identification from scene data stored for the interactive application through a scene processor, and determining the data type of the target scene data according to the scene data identification, wherein the scene data identification is a scene data name, and the naming rule of the scene data name comprises adding information about the data type into the name;
and the display module is used for calling the control component corresponding to the determined data type through the scene processor to execute the target scene data so as to load the target scene data into the target application scene.
8. An intelligent terminal, comprising: a processor and a storage device;
the storage device is used for storing program instructions;
the processor, invoking the program instructions stored in the storage device, is configured to perform the method of any one of claims 1-6.
9. An interactive processing system, comprising: a server and a user terminal;
the server is configured to store a resource package of the interactive application, wherein the resource package comprises: a configuration file and a directory file set for the interactive application, and scene data configured for one or more application scenes in the interactive application;
the user terminal, on which the interactive application is installed, is configured to download the resource package of the interactive application from the server, parse the resource package, and store the scene data corresponding to each scene data identifier at the corresponding storage address according to the mapping relationship between scene data identifiers and storage addresses recorded in the directory file;
wherein the user terminal is further configured to perform the method of any one of claims 1-6.
10. A computer storage medium having stored thereon program instructions that, when executed, implement the method of any one of claims 1-6.
CN201810897448.6A 2018-08-08 2018-08-08 Interactive processing method and device of application scene, terminal, system and storage medium Active CN109165052B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810897448.6A CN109165052B (en) 2018-08-08 2018-08-08 Interactive processing method and device of application scene, terminal, system and storage medium


Publications (2)

Publication Number Publication Date
CN109165052A CN109165052A (en) 2019-01-08
CN109165052B true CN109165052B (en) 2021-10-26

Family

ID=64895096

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810897448.6A Active CN109165052B (en) 2018-08-08 2018-08-08 Interactive processing method and device of application scene, terminal, system and storage medium

Country Status (1)

Country Link
CN (1) CN109165052B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110740262A (en) * 2019-10-31 2020-01-31 维沃移动通信有限公司 Background music adding method and device and electronic equipment
CN112954423B (en) * 2020-04-08 2023-05-26 深圳市明源云客电子商务有限公司 Animation playing method, device and equipment
CN112153455A (en) * 2020-09-15 2020-12-29 北京达佳互联信息技术有限公司 Voting application method and device, electronic equipment and storage medium
CN113694519B (en) * 2021-08-27 2023-10-20 上海米哈游璃月科技有限公司 Applique effect processing method and device, storage medium and electronic equipment
CN117278710B (en) * 2023-10-20 2024-06-25 联通沃音乐文化有限公司 Call interaction function determining method, device, equipment and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103236965A (en) * 2013-03-26 2013-08-07 北京小米科技有限责任公司 Method, terminal and system for displaying scenes in instant chat interface
CN105760199A (en) * 2016-02-23 2016-07-13 腾讯科技(深圳)有限公司 Method and equipment for loading application resource
CN106201161A (en) * 2014-09-23 2016-12-07 北京三星通信技术研究有限公司 Display method and system of electronic equipment
CN106445597A (en) * 2016-09-28 2017-02-22 依偎科技(南昌)有限公司 Application download method, terminal, server and system
CN106843828A (en) * 2016-12-07 2017-06-13 腾讯科技(深圳)有限公司 interface display, loading method and device
CN107038044A (en) * 2017-03-27 2017-08-11 长沙趣动文化科技有限公司 The discrete loading method of game resource and system based on Unity3D

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9280684B1 (en) * 2009-06-03 2016-03-08 James F. Kragh Identity validation and verification system and associated methods
CN105389840A (en) * 2015-10-23 2016-03-09 网易(杭州)网络有限公司 Animation implementation method and system for control in 2D game
CN105491440B (en) * 2015-11-26 2019-05-07 广州华多网络科技有限公司 A kind of application method of play control, terminal and server
CN106648746B (en) * 2016-11-07 2020-10-20 三星电子(中国)研发中心 Application program execution method and device
CN107943894A (en) * 2017-11-16 2018-04-20 百度在线网络技术(北京)有限公司 Method and apparatus for pushing content of multimedia
CN107993495B (en) * 2017-11-30 2020-11-27 北京小米移动软件有限公司 Story teller and control method and device thereof, storage medium and story teller playing system



Similar Documents

Publication Publication Date Title
CN109165052B (en) Interactive processing method and device of application scene, terminal, system and storage medium
CN107329750B (en) Identification method and skip method of advertisement page in application program and mobile terminal
CN111198730B (en) Method, device, terminal and computer storage medium for starting sub-application program
CN106971009B (en) Voice database generation method and device, storage medium and electronic equipment
US20100122167A1 (en) System and method for controlling mobile terminal application using gesture
CN110085222B (en) Interactive apparatus and method for supporting voice conversation service
TW201635134A (en) Method and apparatus for voice control
CN112486451B (en) Voice broadcasting method, computing device and computer storage medium
CN111654749B (en) Video data production method and device, electronic equipment and computer readable medium
CN109992248A (en) Implementation method, device, equipment and the computer readable storage medium of voice application
CN108475260A (en) Method, system and the medium of the language identification of items of media content based on comment
CN109684573B (en) Target picture display method and device, storage medium and electronic equipment
CN112214271A (en) Page guiding method and device and electronic equipment
CN107515870B (en) Searching method and device and searching device
CN112988304B (en) Recording method and device of operation mode, electronic equipment and storage medium
CN114690992B (en) Prompting method, prompting device and computer storage medium
CN111580766B (en) Information display method and device and information display system
CN112214153B (en) Multimedia information recording method, server, terminal, system and storage medium
CN113707179A (en) Audio identification method, device, equipment and medium
CN113868445A (en) Continuous playing position determining method and continuous playing system
CN111625508A (en) Information processing method and device
CN113301436A (en) Play control method, device and computer readable storage medium
WO2023040692A1 (en) Speech control method, apparatus and device, and medium
CN116431233B (en) Resource loading method, device, equipment and storage medium
WO2023246467A1 (en) Method and apparatus for video recommendation, and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant