CN114816622B - Scene picture display method and device, electronic equipment and storage medium - Google Patents

Scene picture display method and device, electronic equipment and storage medium

Info

Publication number
CN114816622B
CN114816622B (application number CN202210344596.1A)
Authority
CN
China
Prior art keywords
data
scene
control engine
transition
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210344596.1A
Other languages
Chinese (zh)
Other versions
CN114816622A (en)
Inventor
李伟鹏
蔡晓华
杨小刚
胡方正
鞠达豪
孙弘法
杨凯丽
朱彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202210344596.1A
Publication of CN114816622A
Application granted
Publication of CN114816622B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/61Scene description

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure relates to a scene picture display method and apparatus, an electronic device, and a storage medium, and belongs to the field of computer technology. The method includes: acquiring configuration information, where the configuration information includes a target data type and instruction information, and the instruction information indicates an instruction for creating a control engine when data belonging to the target data type is acquired; acquiring a data model, where the data model includes at least one piece of data, the at least one piece of data includes scene data, and the scene data is used to render a scene picture; creating the control engine based on the instruction information when it is determined that the at least one piece of data belongs to the target data type; and displaying, by the control engine, a scene picture corresponding to the scene data based on the data model. In the embodiments of the disclosure, a developer only needs to configure a data model whose data belongs to the target data type to display the scene picture corresponding to that data model; no rendering logic code needs to be developed separately for the data model, which improves development efficiency.

Description

Scene picture display method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to the field of computer technology, and in particular to a scene picture display method and apparatus, an electronic device, and a storage medium.
Background
With the development of Internet technology, resources are delivered in more and more scenes. Typically, a developer must develop both the resource data and the corresponding rendering logic code, so that when a resource is released, a device that acquires the resource can render the resource data into the resource according to the rendering logic code. This approach makes the development process complex and the development efficiency low.
Disclosure of Invention
The disclosure provides a scene picture display method and apparatus, an electronic device, and a storage medium, which improve development efficiency.
According to an aspect of the embodiments of the present disclosure, there is provided a scene picture display method, the method including:
acquiring configuration information, where the configuration information includes a target data type and instruction information, and the instruction information indicates an instruction for creating a control engine when data belonging to the target data type is acquired;
acquiring a data model, where the data model includes at least one piece of data, the at least one piece of data includes scene data, and the scene data is used to render a scene picture;
creating the control engine based on the instruction information when it is determined that the at least one piece of data belongs to the target data type;
and displaying, by the control engine, a scene picture corresponding to the scene data based on the data model.
In the solution provided by the embodiments of the disclosure, the acquired configuration information defines the target data type, together with instruction information for creating a control engine when data of that type is acquired. Therefore, once a data model of the target data type is acquired, the control engine is created based on the instruction information, and the control engine is run to display the scene picture corresponding to the scene data in the data model. In other words, based on the configuration information, any data model of the target data type can be run. A developer thus only needs to configure a data model of the target data type to display the corresponding scene picture, without developing rendering logic code separately for the data model, which improves development efficiency.
In some embodiments, the displaying, by the control engine, a scene picture corresponding to the scene data based on the data model includes:
determining, by the control engine, first scene data from a plurality of pieces of scene data included in the data model;
and displaying, by the control engine, a first scene picture based on the first scene data.
In the embodiments of the disclosure, each piece of scene data included in the data model is used to display one scene picture, and the control engine has the function of running the data model. After the control engine is created, the first scene data to be displayed preferentially is determined in the data model, and the first scene picture is displayed, so that the control engine controls the display of the scene pictures corresponding to the scene data in the data model.
In some embodiments, the first scene data includes resource data, and the displaying, by the control engine, a first scene picture based on the first scene data includes:
and displaying, by the control engine, a first scene picture including a resource corresponding to the resource data based on the first scene data.
In the embodiments of the disclosure, the scene data includes resource data, and the resource data in the scene data can be rendered into a resource for display. This enriches the content included in the scene picture and thus ensures the display effect of the scene picture.
In some embodiments, the displaying, by the control engine, based on the first scene data, a first scene picture including a resource corresponding to the resource data includes:
determining, by the control engine, a data format to which the resource data belongs;
and rendering, by the control engine, the resource data into a control in the first scene picture according to a control template matched with the data format, where the control includes the resource corresponding to the resource data.
In the embodiments of the disclosure, considering that resource data in different data formats needs to be presented in different styles, each piece of resource data is rendered using the control template matched with its data format, so that the display style of a resource in the first scene picture matches the data format to which the resource belongs. This ensures the display effect of resources in the first scene picture.
In some embodiments, the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and after the control engine displays a scene picture corresponding to the scene data based on the data model, the method further includes:
and executing a control instruction corresponding to the event triggering condition by the control engine under the condition that the event triggering condition is currently met.
In the embodiments of the disclosure, the data model includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition. The control engine can control the display of the scene picture by combining the event trigger condition and the control instruction configured in the data model; that is, the data model is run without developing additional processing logic code, which improves development efficiency.
In some embodiments, the scene picture is a first scene picture corresponding to first scene data in the data model, the event trigger condition associated with the first scene data includes a transition trigger condition, and the control instruction corresponding to the transition trigger condition includes a transition control instruction indicating to switch to displaying a second scene picture corresponding to second scene data in the data model; and executing, by the control engine, the control instruction corresponding to the event trigger condition when the event trigger condition is currently satisfied includes:
switching, by the control engine, the display from the first scene picture to the second scene picture based on the transition control instruction when the transition trigger condition is currently satisfied.
In the embodiments of the disclosure, once the control engine determines that the transition trigger condition is currently satisfied, it can render the second scene picture based on the second scene data indicated by the transition control instruction corresponding to that condition, and switch the display from the first scene picture to the second scene picture. That is, by combining the event trigger condition and the control instruction, the control engine can control the switching of scene pictures without rendering logic code being developed separately for the data model, which improves development efficiency.
In some embodiments, the transition control instruction further indicates a transition mode between the first scene picture and the second scene picture, and the switching, by the control engine, the display from the first scene picture to the second scene picture based on the transition control instruction when the transition trigger condition is currently satisfied includes:
switching, by the control engine, the display from the first scene picture to the second scene picture according to the transition mode, based on the transition control instruction, when the transition trigger condition is currently satisfied.
In the embodiments of the disclosure, the transition control instruction includes a transition mode between scene pictures, and the scene pictures are switched according to the set transition mode, which enriches the switching modes of scene pictures and ensures the display effect.
In some embodiments, the switching, by the control engine, the display from the first scene picture to the second scene picture according to the transition mode, based on the transition control instruction, when the transition trigger condition is currently satisfied includes any one of the following:
when the transition mode is a first-class transition mode, displaying, by the control engine, the second scene picture while canceling the display of the first scene picture, based on the transition control instruction, when the transition trigger condition is currently satisfied;
when the transition mode is a second-class transition mode, playing, by the control engine, a transition animation matched with the second-class transition mode when the transition trigger condition is currently satisfied, and displaying the second scene picture after the transition animation finishes playing.
In the embodiments of the disclosure, multiple classes of transition modes are provided, and scene pictures are switched according to the configured transition mode, which enriches the diversity of scene picture switching and ensures the display effect.
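The two classes of transition modes described above can be sketched as follows. This is an illustrative sketch only; the function names, mode labels, and callback structure are assumptions and are not taken from the disclosure.

```python
def switch_scene(transition_mode, play_animation, show_second, hide_first):
    """Switch the display from the first scene picture to the second one.

    first_class: cancel the display of the first scene picture while
                 displaying the second scene picture.
    second_class: play a transition animation matched with the mode, then
                  display the second scene picture once playback finishes.
    """
    if transition_mode == "first_class":
        hide_first()       # cancel display of the first scene picture
        show_second()      # display the second scene picture at the same time
    elif transition_mode == "second_class":
        play_animation()   # play the matching transition animation first
        show_second()      # display the second scene picture afterwards
    else:
        raise ValueError(f"unknown transition mode: {transition_mode}")
```

A caller would supply the actual rendering callbacks; here they are stand-ins for whatever the control engine uses to show or hide a picture.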
In some embodiments, before creating the control engine based on the instruction information when it is determined that the at least one piece of data belongs to the target data type, the method further includes:
for each piece of data in the data model, determining that the data belongs to the target data type when the data includes a target character string;
where the target character string is used to represent the target data type.
In the solution provided by the embodiments of the disclosure, a manner of identifying the target data type is provided: whether the data in the acquired data model belongs to the target data type is determined according to a target character string used to represent the target data type, which ensures the accuracy of the identification.
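The target-string check described above reduces to a simple containment test. A minimal sketch, assuming a piece of data is serialized to text before checking (the function name is illustrative):

```python
def belongs_to_target_type(piece, target_string):
    """A piece of data is determined to belong to the target data type
    when it contains the target character string."""
    return target_string in str(piece)
```

In practice the check might inspect a dedicated type field rather than the whole serialized piece; the disclosure only requires that the data contain the target character string.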
In some embodiments, the control engine runs in a target application, and after creating the control engine based on the instruction information when it is determined that the at least one piece of data belongs to the target data type, the method further includes:
determining, by the control engine, a display area set by the target application, where the display area is used to display a scene picture;
and the displaying, by the control engine, a scene picture corresponding to the scene data based on the data model includes:
displaying, by the control engine, the scene picture in the display area based on the data model.
In the solution provided by the embodiments of the disclosure, the scene picture is displayed, through the control engine running in the target application, in the display area set by the target application. This realizes a manner of displaying the scene picture within an application and ensures the display accuracy of the scene picture.
According to another aspect of the embodiments of the present disclosure, there is provided a scene picture display device, the device including:
an acquisition unit configured to acquire configuration information, the configuration information including a target data type and instruction information, the instruction information indicating an instruction for creating a control engine when data belonging to the target data type is acquired;
the acquisition unit being further configured to acquire a data model, the data model including at least one piece of data, the at least one piece of data including scene data, the scene data being used to render a scene picture;
a creation unit configured to create the control engine based on the instruction information when it is determined that the at least one piece of data belongs to the target data type;
and a display unit configured to display, by the control engine, a scene picture corresponding to the scene data based on the data model.
In some embodiments, the display unit includes:
a determining subunit configured to determine, by the control engine, first scene data from a plurality of pieces of scene data included in the data model;
and a display subunit configured to display, by the control engine, a first scene picture based on the first scene data.
In some embodiments, the first scene data includes resource data, and the display subunit is configured to display, by the control engine, a first scene picture including a resource corresponding to the resource data based on the first scene data.
In some embodiments, the display subunit is configured to determine, by the control engine, a data format to which the resource data belongs, and to render, by the control engine, the resource data into a control in the first scene picture according to a control template matched with the data format, where the control includes the resource corresponding to the resource data.
In some embodiments, the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and the apparatus further includes:
an execution unit configured to execute, by the control engine, the control instruction corresponding to the event trigger condition when the event trigger condition is currently satisfied.
In some embodiments, the scene picture is a first scene picture corresponding to first scene data in the data model, the event trigger condition associated with the first scene data includes a transition trigger condition, and the control instruction corresponding to the transition trigger condition includes a transition control instruction indicating to switch to displaying a second scene picture corresponding to second scene data in the data model; and the execution unit is configured to switch, by the control engine, the display from the first scene picture to the second scene picture based on the transition control instruction when the transition trigger condition is currently satisfied.
In some embodiments, the transition control instruction further indicates a transition mode between the first scene picture and the second scene picture, and the execution unit is configured to switch, by the control engine, the display from the first scene picture to the second scene picture in the transition mode, based on the transition control instruction, when the transition trigger condition is currently satisfied.
In some embodiments, the execution unit is configured to perform any one of:
when the transition mode is a first-class transition mode, displaying, by the control engine, the second scene picture while canceling the display of the first scene picture, based on the transition control instruction, when the transition trigger condition is currently satisfied;
when the transition mode is a second-class transition mode, playing, by the control engine, a transition animation matched with the second-class transition mode when the transition trigger condition is currently satisfied, and displaying the second scene picture after the transition animation finishes playing.
In some embodiments, the apparatus further comprises:
a determining unit configured to determine, for each piece of data in the data model, that the data belongs to the target data type when the data includes a target character string;
where the target character string is used to represent the target data type.
In some embodiments, the control engine operates in a target application, the apparatus further comprising:
a determining unit configured to determine, by the control engine, a display area set by the target application, the display area being used to display a scene picture;
and the display unit is configured to display, by the control engine, the scene picture in the display area based on the data model.
According to still another aspect of the embodiments of the present disclosure, there is provided an electronic device including:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the scene picture display method of the above aspect.
According to still another aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium storing instructions that, when executed by a processor of an electronic device, enable the electronic device to perform the scene picture display method of the above aspect.
According to yet another aspect of the embodiments of the present disclosure, there is provided a computer program product that, when executed by a processor of an electronic device, causes the electronic device to perform the scene picture display method of the above aspect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic diagram illustrating an implementation environment according to an example embodiment.
Fig. 2 is a flowchart illustrating a scene picture display method according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating another scene picture display method according to an exemplary embodiment.
Fig. 4 is a flowchart illustrating another scene picture display method according to an exemplary embodiment.
Fig. 5 is a flowchart illustrating yet another scene picture display method according to an exemplary embodiment.
Fig. 6 is a schematic diagram illustrating scene picture switching according to an exemplary embodiment.
Fig. 7 is a block diagram of a scene picture display device according to an exemplary embodiment.
Fig. 8 is a block diagram of a scene picture display device according to an exemplary embodiment.
Fig. 9 is a block diagram of a terminal according to an exemplary embodiment.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present disclosure, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description, claims, and drawings of the present disclosure are used to distinguish similar objects and are not necessarily used to describe a particular sequence or chronological order. It should be understood that data so termed are interchangeable where appropriate, so that the embodiments of the disclosure described herein can be implemented in orders other than those illustrated or described herein. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the disclosure as detailed in the appended claims.
As used herein, "at least one" includes one, two, or more; "a plurality" includes two or more; "each" refers to every one of a corresponding plurality; and "any" refers to any one of the plurality. For example, if the plurality of pieces of scene data includes 3 pieces of scene data, "each" refers to every one of the 3 pieces of scene data, and "any" refers to any one of the 3 pieces, that is, the first, the second, or the third piece of scene data.
It should be noted that the information (including but not limited to configuration information) and data (including but not limited to data for analysis, stored data, and displayed data) involved in the present disclosure are all authorized by the user or fully authorized by all parties.
The scene picture display method provided by the embodiments of the disclosure is performed by an electronic device. In some embodiments, the electronic device is provided as a terminal. FIG. 1 is a schematic diagram of an implementation environment according to an exemplary embodiment. The implementation environment includes a terminal 101 and a server 102 connected through a wireless or wired network, which is not limited by the disclosure.
The terminal 101 is a mobile phone, a tablet computer, a computer, or another type of terminal, but is not limited thereto. The server 102 is a single server, a server cluster including a plurality of servers, or a cloud computing service center, but is not limited thereto.
The server 102 is configured to provide a data model for the terminal 101, where the terminal 101 is configured to display, through the data model provided by the server 102, a scene picture corresponding to scene data in the data model.
In some embodiments, the terminal 101 installs a target application served by the server 102, through which the terminal 101 can implement functions such as resource display. For example, the target application is an application in the operating system of the terminal 101 or an application provided by a third party. For another example, the target application is a resource sharing application having a resource sharing function; of course, the resource sharing application may also have other functions, such as a comment function, a shopping function, a navigation function, and the like.
The terminal 101 logs in to the target application through an account and interacts with the server 102 through the target application. The server 102 pushes a data model to the terminal 101 on which the target application is installed; or the terminal 101 sends a data request to the server 102 through the target application, and the server 102 sends the data model to the terminal 101 based on the data request. Based on the data model sent by the server 102, the terminal 101 displays, in the target application, the scene picture corresponding to the scene data in the data model.
Fig. 2 is a flowchart illustrating a scene picture display method according to an exemplary embodiment. Referring to Fig. 2, the method is performed by a terminal and includes the following steps:
In step 201, configuration information is acquired, the configuration information including a target data type and instruction information indicating an instruction for creating a control engine in the case where data belonging to the target data type is acquired.
Here, the target data type is an arbitrary data type and can be represented in an arbitrary form; for example, the target data type is represented in the form of a character string, i.e., the configuration information includes a character string that represents the target data type. In the embodiments of the disclosure, the configuration information defines the target data type and an instruction for creating a control engine that runs data belonging to the target data type. The instruction information thus specifies that the control engine is created only when data of the target data type is acquired, which ensures that acquired data belonging to the target data type can be run normally.
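The configuration information of step 201 can be pictured as a small structured record. This is a hypothetical sketch; the field names ("target_data_type", "instruction_info", "on_type_match") are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical configuration information: a target data type represented
# as a character string, plus instruction information saying a control
# engine is to be created when data of that type is acquired.
CONFIG = {
    "target_data_type": "scene_model/v1",
    "instruction_info": {"on_type_match": "create_control_engine"},
}
```

A terminal that has acquired such configuration information would consult it each time a data model arrives, as the later steps describe.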
In step 202, a data model is acquired, the data model comprising at least one piece of data, the at least one piece of data comprising scene data, the scene data being used to render a scene picture.
Here, the data model includes at least one piece of data; that is, the data model is a data set. The scene data in the at least one piece of data contains the data required to render a scene picture, including data describing the content to be displayed; for example, the scene data includes resource data describing a resource to be displayed in the scene picture corresponding to the scene data. The scene picture is a picture to be displayed and can be presented in any form, for example, in a pop-up window or in other forms, which is not limited by the disclosure.
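A data model of this kind can be pictured as a list of data pieces, each carrying a type marker and scene data with resource data inside. All keys and values below are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical data model: one piece of scene data whose type marker
# matches the target data type, with resource data for the picture.
DATA_MODEL = [
    {
        "type": "scene_model/v1",   # should match the target data type
        "scene_id": "scene_1",
        "resources": [
            {"format": "image", "url": "https://example.com/banner.png"},
            {"format": "text", "content": "New arrivals"},
        ],
    },
]
```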
In step 203, in case it is determined that the at least one piece of data belongs to the target data type, a control engine is created based on the instruction information.
In the embodiments of the disclosure, since the instruction indicated by the instruction information in the configuration information is used to create the control engine when data belonging to the target data type is acquired, after the terminal acquires the data model and determines that the at least one piece of data included in the data model belongs to the target data type, the terminal creates the control engine based on the instruction information, so as to subsequently run the data model through the control engine.
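Step 203 can be sketched as a type check followed by engine creation. The class and function names below are illustrative assumptions; the disclosure does not prescribe a particular API:

```python
class ControlEngine:
    """Minimal stand-in for the control engine created to run the model."""
    def __init__(self, data_model):
        self.data_model = data_model


def create_engine_if_type_matches(config, data_model):
    """Create the control engine only when each piece of data in the
    model belongs to the target data type defined in the configuration."""
    target = config["target_data_type"]
    if all(piece.get("type") == target for piece in data_model):
        return ControlEngine(data_model)
    return None  # type mismatch: no engine is created
```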
In step 204, a scene picture corresponding to the scene data is displayed by the control engine based on the data model.
In the embodiments of the disclosure, the control engine has the function of running data belonging to the target data type. The acquired data model includes scene data used to render a scene picture, so by running the data model through the control engine, the scene picture can be rendered based on the scene data contained in the data model.
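As a sketch of step 204, an engine that determines the first scene data in the model and "renders" it into a simple description of the picture might look like this. The method name and the returned structure are assumptions for illustration only:

```python
class ControlEngine:
    """Sketch of an engine that runs the data model (illustrative only)."""
    def __init__(self, data_model):
        self.data_model = data_model

    def display_first_scene(self):
        # Determine the first scene data among the pieces in the model,
        # then reduce "rendering" to building a picture description.
        first = self.data_model[0]
        return {
            "scene_id": first["scene_id"],
            "resource_formats": [r["format"] for r in first.get("resources", [])],
        }
```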
In the solution provided by the embodiments of the disclosure, the acquired configuration information defines the target data type, together with instruction information for creating a control engine when data of that type is acquired. Therefore, once a data model of the target data type is acquired, the control engine is created based on the instruction information, and the control engine is run to display the scene picture corresponding to the scene data in the data model. In other words, based on the configuration information, any data model of the target data type can be run. A developer thus only needs to configure a data model of the target data type to display the corresponding scene picture, without developing rendering logic code separately for the data model, which improves development efficiency.
In some embodiments, displaying, by the control engine, a scene picture corresponding to the scene data based on the data model, includes:
determining, by a control engine, first scene data from a plurality of pieces of scene data included in a data model;
the first scene picture is displayed based on the first scene data by the control engine.
In some embodiments, the first scene data includes resource data, and displaying, by the control engine, the first scene picture based on the first scene data includes:
And displaying, by the control engine, a first scene picture including resources corresponding to the resource data based on the first scene data.
In some embodiments, displaying, by the control engine, based on the first scene data, a first scene picture including resources corresponding to the resource data, includes:
Determining a data format to which the resource data belong through a control engine;
And rendering the resource data into a control in the first scene picture according to a control template matched with the data format by a control engine, wherein the control contains resources corresponding to the resource data.
In some embodiments, the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and after displaying, by the control engine, a scene picture corresponding to the scene data based on the data model, the method further includes:
And executing a control instruction corresponding to the event triggering condition under the condition that the event triggering condition is met currently through the control engine.
In some embodiments, the scene is a first scene corresponding to first scene data in the data model, the event trigger condition associated with the first scene data comprises a transition trigger condition, and the control instruction corresponding to the transition trigger condition comprises a transition control instruction indicating to switch to display a second scene corresponding to second scene data in the data model; under the condition that the event triggering condition is met currently, executing a control instruction corresponding to the event triggering condition by a control engine, wherein the control instruction comprises:
And under the condition that the transition trigger condition is met currently, the control engine is used for switching and displaying the first scene picture as a second scene picture based on the transition control instruction.
In some embodiments, the transition control instruction further indicates a transition mode between the first scene and the second scene, and the switching, by the control engine, the first scene to be displayed as the second scene based on the transition control instruction if the transition trigger condition is currently satisfied includes:
And under the condition that the transition trigger condition is met currently, the control engine is used for switching and displaying the first scene picture as a second scene picture according to the transition mode based on the transition control instruction.
In some embodiments, the control engine is configured to switch, based on the transition control instruction, the first scene to be displayed as the second scene in the transition mode when the transition trigger condition is currently satisfied, including any one of the following:
the transition mode is a first type transition mode, and a second scene picture is displayed while the first scene picture is canceled to be displayed based on a transition control instruction under the condition that the transition trigger condition is met currently through a control engine;
The transition mode is a second type transition mode, and the control engine is used for playing the transition animation matched with the second type transition mode under the condition that the transition trigger condition is met currently, and displaying a second scene picture after the transition animation is played.
In some embodiments, in the event that it is determined that the at least one piece of data belongs to the target data type, prior to creating the control engine based on the instruction information, the method further comprises:
For each piece of data in the data model, determining that the data belongs to a target data type when the data comprises a target character string;
wherein the target string is used to represent a target data type.
In some embodiments, the control engine is run in the target application, and in the event that it is determined that the at least one piece of data belongs to the target data type, after creating the control engine based on the instruction information, the method further comprises:
Determining a display area set by a target application through a control engine, wherein the display area is used for displaying a scene picture;
displaying, by the control engine, a scene picture corresponding to the scene data based on the data model, including:
the scene picture is displayed in the display area based on the data model by the control engine.
Based on the embodiment shown in fig. 2, taking as an example a data model that includes a plurality of pieces of scene data, the control engine can switch the display among the scene pictures corresponding to the plurality of pieces of scene data. The specific process is described in the following embodiment.
Fig. 3 is a flowchart illustrating a scene picture display method, see fig. 3, performed by a terminal, according to an exemplary embodiment, comprising the steps of:
In step 301, configuration information is acquired, the configuration information comprising a target data type and instruction information indicating instructions for creating a control engine in case data belonging to the target data type is acquired.
In an embodiment of the present disclosure, the configuration information includes one or more target data types, and the instruction information indicates an instruction for creating the control engine in the case that the acquired data belongs to any one or more of the target data types.
In some embodiments, the target data type is a data type defined in a target protocol. The target protocol is used for defining a data type to which data required for displaying a certain resource belongs, for example, the target protocol is an interactive advertisement protocol, the interactive advertisement protocol indicates the data type to which the data required for displaying an interactive advertisement belongs, and the data types defined in the interactive advertisement protocol are all used as the target data type.
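Under the assumption that the target protocol is an interactive advertisement protocol, the configuration information might take a shape like the following sketch. All field names and type names here are hypothetical, chosen only to illustrate the description above.

```python
# Hypothetical shape of the configuration information; every data
# type defined in the target protocol is treated as a target type.
CONFIG_INFO = {
    "target_data_types": ["iaa.scene", "iaa.resource", "iaa.event"],
    # instruction information: what to do when such data is acquired
    "instruction_info": {"on_target_data": "create_control_engine"},
}


def is_target_type(data_type, config=CONFIG_INFO):
    """True when the data type is one defined in the target protocol."""
    return data_type in config["target_data_types"]
```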
For the acquiring manner of the configuration information, in some embodiments, the configuration information is stored in the terminal, and the terminal acquires the configuration information from the storage area; or the terminal obtains the configuration information sent by the server, which is not limited in this disclosure.
The server is used for providing services for the terminal, the configuration information is stored in the server, and the terminal interacts with the server to obtain the configuration information stored in the server.
In the disclosed embodiments, the configuration information is configured by a developer. After the developer completes configuration of the configuration information, the configuration information is added and stored in the terminal. Or after the developer completes configuration of the configuration information, the configuration information is stored in a server, and the terminal can acquire the configuration information from the server.
In one possible implementation manner of the foregoing embodiment, a process for obtaining, by a terminal, configuration information sent by a server includes: the terminal sends an information acquisition request to the server, the server receives the information acquisition request and returns the configuration information to the terminal.
In one possible implementation manner of the foregoing embodiment, the terminal installs a target application, the configuration information is stored in a server that provides a service for the target application, the terminal sends an information acquisition request to the server when running the target application, and the server sends the configuration information stored in the server to the terminal based on the information acquisition request, and the terminal receives the configuration information sent by the server.
In one possible implementation manner of the foregoing embodiment, the terminal installs a target application, where the configuration information is included in the target application, and obtains the configuration information in the target application when the terminal runs the target application.
In step 302, a data model is acquired.
Wherein the data model comprises at least one piece of data, and the at least one piece of data comprises scene data for rendering a scene picture. For example, the scene data includes a scene identifier, scene content, profile information, or a scene lifecycle. The scene identifier may be represented in any form, for example, as a character string; the scene content is the content contained in the scene picture corresponding to the scene data, for example, the scene content comprises resource data in the scene data; the profile information describes the scene picture corresponding to the scene data; and the scene lifecycle indicates the display time, the disappearance time, or the display duration of the scene picture corresponding to the scene data.
In the embodiment of the disclosure, the data model is a data model corresponding to a resource and is used for showing the resource. For example, the resource is an advertisement, and the data model corresponds to the advertisement, so that the advertisement can be displayed through the data model. In the case where the resource is an advertisement, the scene data in the data model is used to render a scene picture of the advertisement. The advertisement may be, for example, a feed advertisement, a rewarded-video advertisement, a splash advertisement, or another form of advertisement.
In some embodiments, the data model includes a plurality of pieces of data including a plurality of pieces of scene data.
Wherein each piece of scene data is used to render one scene picture. For example, the data model is a data model corresponding to an advertisement, the data model comprising a plurality of scene data, each scene data corresponding to one scene picture, i.e. the advertisement comprises a plurality of scene pictures. When displaying the advertisement, the plurality of scene pictures are displayed according to the data in the data model, that is, the advertisement is displayed.
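As an illustration of the structure just described, a data model for an advertisement with two scene pictures might look like the following sketch. Every key (id, content, profile, lifecycle) is an assumed name based on the description above, not a field defined by the disclosure.

```python
# Illustrative data model for an advertisement with two scene
# pictures; each piece of scene data renders one scene picture.
AD_DATA_MODEL = {
    "scenes": [
        {
            "id": "scene_1",
            "content": {"resources": [{"format": "image", "data": "cover.png"}]},
            "profile": "opening picture of the ad",
            "lifecycle": {"display_duration_s": 5},
        },
        {
            "id": "scene_2",
            "content": {"resources": [{"format": "video", "data": "clip.mp4"}]},
            "profile": "main video of the ad",
            "lifecycle": {"display_duration_s": 15},
        },
    ],
}


def scene_ids(model):
    """Display order: one scene picture per piece of scene data."""
    return [scene["id"] for scene in model["scenes"]]
```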
For the manner in which the data model is obtained, in some embodiments, the terminal receives the data model issued by the server.
The server is used for providing a data model for the terminal, and the server stores a plurality of data models. In the disclosed embodiment, the data model is stored in the server after the developer completes its configuration.
In one possible implementation manner of the foregoing embodiment, the terminal is provided with a target application, the server is configured to provide a service for the target application, the terminal sends a data acquisition request to the server through the target application, the server receives the data acquisition request, sends a data model to the terminal, and the terminal receives the data model sent by the server through the target application.
In the embodiment of the disclosure, the terminal acquires the data model from the server through the target application so as to display a scene picture corresponding to the data model in the target application.
In step 303, in case it is determined that the data in the data model belongs to the target data type, a control engine is created based on the instruction information.
Wherein the control engine is configured to run a data model belonging to the target data type. In the embodiment of the disclosure, after the terminal acquires the configuration information, the acquired data is checked based on the configuration information, and if it is determined that all the data included in the acquired data model belongs to the target data type, a control engine is created for the data model based on the instruction information in the configuration information, so that the data model can subsequently be run through the control engine.
In some embodiments, where the configuration information includes a plurality of target data types, the data model includes a plurality of pieces of data, then the step 303 includes: in the event that each piece of data in the data model is determined to belong to a target data type, a control engine is created based on the instruction information.
In the embodiment of the disclosure, the data model includes a plurality of pieces of data, the configuration information includes a plurality of target data types, and the data types of the plurality of pieces of data included in the data model may be different, for example, a first piece of data in the data model belongs to a first target data type, and a second piece of data in the data model belongs to a second target data type. In case it is determined that each piece of data in the data model belongs to any one of the target data types, i.e. a control engine needs to be created for the data model, a control engine is created based on the instruction information.
In some embodiments, the process of determining whether the data belongs to the target data type includes: for each piece of data in the data model, in the case that the data includes a target string, it is determined that the data belongs to the target data type.
Wherein the target string is used to represent the target data type. For example, the target string is a type name of the target data type or a type identifier of the target data type, etc., which is not limited herein. In the embodiment of the disclosure, data belonging to the target data type includes the target character string, so in the case that any piece of data includes the target character string, the data belongs to the target data type.
In one possible implementation manner of the foregoing embodiment, the configuration information includes multiple target data types, each target data type corresponds to a target string, where the target string is used to represent a corresponding target data type, and then, for each piece of data in the data model, if the data includes any target string, it is determined that the data belongs to the target data type corresponding to the target string. In the scheme provided by the embodiment of the disclosure, a manner of identifying a target data type is provided, and whether the data in the acquired data model belongs to the target data type is determined according to a target character string for representing the target data type, so that the accuracy of identification is ensured.
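The target-string check described above can be sketched as a lookup over the serialized data: each target data type has a target string, and a piece of data belongs to that type when its content contains the string. The target strings below are hypothetical placeholders.

```python
# Hypothetical mapping: target data type -> its target string
# (here, the type name itself serves as the target string).
TARGET_STRINGS = {
    "iaa.scene": "iaa.scene",
    "iaa.event": "iaa.event",
}


def matched_target_type(piece_of_data: str):
    """Return the target data type whose target string appears in
    the serialized data, or None when no target type matches."""
    for data_type, target_string in TARGET_STRINGS.items():
        if target_string in piece_of_data:
            return data_type
    return None
```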
In step 304, first scene data is determined from a plurality of pieces of scene data included in the data model by the control engine.
The first scene data is any one of a plurality of pieces of scene data.
In some embodiments, the data model includes a plurality of pieces of scene data and a display order of each piece of scene data, and the scene data whose display order is the forefront is determined as the first scene data by the display order corresponding to the plurality of pieces of scene data.
In some embodiments, the data model includes an event trigger condition associated with each scene data and a control instruction corresponding to the event trigger condition, and the step 304 includes: when any event triggering condition is met currently and a control instruction corresponding to the event triggering condition indicates to display a scene picture corresponding to the designated scene data, the control engine determines the scene data designated in the control instruction as first scene data.
In the embodiment of the present disclosure, the event trigger condition indicates a condition that needs to be satisfied in order to execute the control instruction corresponding to the event trigger condition, where the control instruction is used to perform a certain operation; for example, the control instruction indicates that certain data is acquired, or that a scene picture corresponding to certain scene data is displayed. Once it is determined that any event trigger condition is currently satisfied, the control instruction corresponding to the event trigger condition is subsequently executed. When the control instruction indicates displaying a scene picture corresponding to specified scene data, the scene data specified in the control instruction is determined as the first scene data, so that the corresponding operation can subsequently be executed according to the control instruction based on the determined scene data.
For example, if the event triggering condition included in the data model is that the control engine is successfully created, and the control instruction corresponding to the event triggering condition indicates that the scene picture corresponding to the first scene data is displayed, when the control engine is successfully created, the first scene data is determined from the plurality of pieces of scene data included in the data model through the control engine according to the control instruction.
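The example above (the event trigger condition is that the control engine was created successfully, and its control instruction specifies the scene picture to display first) can be sketched as follows; all dictionary keys are illustrative assumptions.

```python
def determine_first_scene(data_model, satisfied_events):
    """Pick the first scene data: find a trigger whose condition is
    currently satisfied and whose control instruction indicates
    displaying a specified scene picture."""
    for trigger in data_model["triggers"]:
        instruction = trigger["instruction"]
        if (trigger["condition"] in satisfied_events
                and instruction["op"] == "show_scene"):
            # the scene specified in the control instruction
            for scene in data_model["scenes"]:
                if scene["id"] == instruction["scene_id"]:
                    return scene
    return None
```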
In step 305, a first scene picture is displayed by the control engine based on the first scene data.
In the embodiment of the disclosure, the control engine has a function of running a data model, and then the first scene picture can be rendered based on the first scene data through the control engine. In the embodiment of the disclosure, each piece of scene data included in the data model is used for displaying one scene picture, and after the control engine is created, the first scene data which is preferentially displayed is determined in the data model to display the first scene picture, so that the scene picture display corresponding to the scene data in the data model is controlled by the control engine.
In some embodiments, the first scene data includes resource data, and the step 305 includes: and displaying a first scene picture comprising the resources corresponding to the resource data based on the first scene data through the control engine.
The resource data is used to describe a resource, and the resource is any type of resource, for example, text, image, video, or the like. When the first scene data is rendered into a first scene picture through the control engine, rendering the resource data included in the first scene data into resources, so that the displayed first scene picture includes the resources corresponding to the resource data. In the embodiment of the disclosure, the scene data comprises the resource data, the resource data in the scene data can be rendered into the resource for display, and the content included in the scene picture is enriched, so that the display effect of the scene picture is ensured.
In one possible implementation manner of the foregoing embodiment, the displaying the first scene includes: determining, by the control engine, a data format to which the resource data belongs; and rendering the resource data into a control in the first scene picture according to a control template matched with the data format by the control engine, wherein the control contains resources corresponding to the resource data.
Wherein the data format is any type of data format including, for example, text format, image format, video format, or animation format, etc. The first scene includes a control for carrying resources, where the control is any form of control, for example, the control included in the first scene is a native control.
In the embodiment of the disclosure, each data format corresponds to one control template; for example, the control templates corresponding to multiple data formats include a text control template, an image control template, an animation control template, a horizontal-box control template, and a vertical-box control template. When any resource data is rendered, the rendering is performed according to the control template matched with the data format to which the resource data belongs, so that the resource corresponding to the resource data is displayed in the first scene picture in the form of a control. That is, the control in the first scene picture contains the resource corresponding to the resource data. In the embodiment of the disclosure, considering that the styles required to be presented differ between the data formats to which resource data belong, the resources are rendered by using the control template matched with the data format to which each piece of resource data belongs, so that the display style of each resource in the first scene picture matches the data format to which the resource belongs, thereby ensuring the display effect of the resources in the first scene picture.
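A minimal sketch of matching resource data to a control template by data format, using the template names listed above as stand-ins. The rendered "control" is just a dictionary here, not a native control.

```python
# Hypothetical data-format -> control-template mapping.
CONTROL_TEMPLATES = {
    "text": "TextControl",
    "image": "ImageControl",
    "animation": "AnimationControl",
}


def render_resource(resource):
    """Render one piece of resource data into a control using the
    control template matched with its data format."""
    template = CONTROL_TEMPLATES.get(resource["format"])
    if template is None:
        raise ValueError(f"no control template for format {resource['format']!r}")
    # the control carries the resource corresponding to the resource data
    return {"control": template, "resource": resource["data"]}
```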
In some embodiments, the first scene data comprises a plurality of pieces of resource data, and the step 305 comprises: and displaying, by the control engine, a first scene picture including resources corresponding to each piece of resource data based on the first scene data.
For example, the first scene data includes text data, image data, and video data, and the displayed first scene picture includes text corresponding to the text data, an image corresponding to the image data, and a video corresponding to the video data.
In some embodiments, the first scenario data includes resource data and operation control data, and the step 305 includes: and displaying a first scene picture comprising resources corresponding to the resource data based on the first scene data through the control engine, wherein the first scene picture comprises an operation control corresponding to the operation control data.
The operation control data is used for describing the operation control, and for example, the operation control data comprises the display size, the display style or the information filled in the operation control. For example, the operation control in the first scene is a "next page" button, or the operation control is a button that can be triggered after 10 seconds, and a 10 second countdown or a progress bar with a duration of 10 seconds is displayed in the button.
It should be noted that the embodiment of the present disclosure is described by taking the display of the first scene picture corresponding to the first scene data as an example. In another embodiment, steps 304-305 do not need to be executed; instead, the control engine displays the scene picture corresponding to the scene data based on the data model in another manner.
In step 306, if the transition trigger condition associated with the first scene data is currently satisfied, the control engine switches and displays the first scene as the second scene based on the transition control instruction corresponding to the transition trigger condition.
In an embodiment of the present disclosure, the data model includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, where the event trigger condition indicates a condition that needs to be satisfied in order to execute the corresponding control instruction, and the control instruction is used to perform a certain operation. The first scene data is associated with event trigger conditions, the event trigger conditions associated with the first scene data include a transition trigger condition, and the control instruction corresponding to the transition trigger condition includes a transition control instruction that instructs switching to display a second scene picture corresponding to second scene data in the data model. The transition trigger condition indicates the condition that needs to be satisfied for executing the corresponding transition control instruction, that is, the trigger condition for switching the first scene picture to the second scene picture. After the first scene picture is displayed, once it is determined that the transition trigger condition is currently satisfied, the second scene picture can be rendered based on the second scene data indicated by the transition control instruction, and the first scene picture is switched and displayed as the second scene picture.
In some embodiments, the transition trigger condition is that a display duration of a first scene frame reaches a first duration, or a scene switching operation is detected in the first scene frame, which is not limited by the present disclosure.
The first time period is an arbitrary time period, for example, the first time period is 5 seconds or 10 seconds, etc. For example, at the beginning of displaying the first scene, the display duration of the first scene is recorded, and upon determining that the display duration of the first scene reaches the first duration, the display of the second scene is switched according to step 306 described above. For another example, the first scene includes an operation control to switch the scene, and upon detecting that the operation control is triggered, corresponding to detecting a scene switch operation, the second scene is switched to display according to step 306 described above.
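Both example transition trigger conditions can be sketched as a single predicate; the parameter names are illustrative.

```python
def transition_triggered(display_duration_s, first_duration_s, switch_op_detected):
    """The transition trigger condition is satisfied when the display
    duration of the first scene picture reaches the first duration,
    or a scene switching operation is detected."""
    return display_duration_s >= first_duration_s or switch_op_detected
```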
In some embodiments, the transition control instruction further indicates a transition mode between the first scene and the second scene, then the step 306 includes: and under the condition that the transition trigger condition is currently met, the control engine switches and displays the first scene picture as the second scene picture according to the transition mode based on the transition control instruction.
Wherein a transition mode is used to indicate the manner in which the first scene is switched to the second scene, the transition mode being any type of mode, for example, including a visibility transition mode, a template transition mode, an element sharing transition mode, or an animated transition mode. When the visibility transition mode indicates that two scene images are switched for display, one scene image is canceled for display and the other scene image is displayed at the same time; the template transition mode indicates that two scene pictures are switched and displayed according to a fixed transition template; when the element sharing transition mode indicates that images containing the same element are displayed in a switching manner, the same element is kept unchanged, and only the images except for the same element are switched. And when the animation transition mode indicates that two scene images are switched for display, playing a transition animation. In the embodiment of the disclosure, the transition control instruction comprises a transition mode between scene images, and the scene images are switched and displayed according to the set transition mode, so that the switching modes of the scene images are enriched, and the display effect is ensured.
For example, when the transition mode between the first scene picture and the second scene picture is the visibility transition mode, the first scene picture gradually fades out and the second scene picture gradually fades in when switching to display the second scene picture. As another example, when the transition mode between the first scene picture and the second scene picture is the template transition mode, the switching is performed according to a fixed display template when the scene pictures are switched: for example, the first scene picture moves toward the right side of the display interface and gradually moves out of it, while the second scene picture gradually enters the display interface from the left side, so that the first scene picture is switched and displayed as the second scene picture.
In one possible implementation of the above embodiment, the process of switching the scene according to the transition mode includes any one of the following:
a first item: the transition mode is a first type transition mode, and the control engine is used for displaying the second scene picture while canceling the display of the first scene picture based on the transition control instruction under the condition that the transition trigger condition is currently met.
The second item: the transition mode is a second type transition mode, and the control engine is used for playing the transition animation matched with the second type transition mode under the condition that the transition trigger condition is met currently, and displaying the second scene picture after the transition animation is played.
The second type of transition mode plays transition animation, and the transition animation is contained in the data model. In the embodiment of the disclosure, the transition modes between scene images comprise multiple types, the scene images are directly switched and displayed according to the first type of transition modes, and the transition animation matched with the second type of transition modes needs to be played when the scene images are switched and displayed according to the second type of transition modes. In the embodiment of the disclosure, various types of transition modes are provided, and the scene images are switched and displayed according to the configured transition modes, so that the diversity of scene image switching display can be enriched, and the display effect is ensured.
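The two transition-mode types can be sketched as follows, with an event log standing in for real rendering. The mode names and event strings are assumptions made for illustration.

```python
def switch_scene(mode, transition_animation=None):
    """First-type transition: cancel the first scene picture and show
    the second at the same time. Second-type transition: play the
    matching transition animation (carried in the data model), then
    show the second scene picture after playback finishes."""
    events = []
    if mode == "first_type":
        events.append("hide:first_scene")   # cancel displaying scene 1
        events.append("show:second_scene")  # while displaying scene 2
    elif mode == "second_type":
        events.append(f"play:{transition_animation}")
        events.append("show:second_scene")  # shown after the animation
    return events
```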
In some embodiments, the data model includes scene position relation data, and in the case that a transition trigger condition associated with the first scene data is currently satisfied, the first scene is switched and displayed as the second scene based on a transition control instruction corresponding to the transition trigger condition according to a position relation between the first scene and the second scene indicated by the scene position relation data by the control engine. The scene position relation data are used for describing the relation between the display positions of scene pictures corresponding to the scene data in the data model.
In the scheme provided by the embodiment of the disclosure, the acquired configuration information defines the target data type and the instruction information for creating the control engine in the case that data of the target data type is acquired. Therefore, once a data model of the target data type is acquired after the configuration information is acquired, the control engine is created based on the instruction information, and the control engine is run to display the scene picture corresponding to the scene data in the data model. That is, based on the configuration information, any data model of the target data type can be run, so a developer only needs to configure a data model of the target data type to display the corresponding scene picture, without separately developing rendering logic code for the data model, thereby improving development efficiency.
And the scene data comprises resource data, the resource data in the scene data can be rendered into resources for display, and the content included in the scene picture is enriched, so that the display effect of the scene picture is ensured.
And considering that the styles required to be presented differ between the data formats to which resource data belong, the resources are rendered by using the control template matched with the data format to which each piece of resource data belongs, so that the display style of each resource in the first scene picture matches the data format to which the resource belongs, thereby ensuring the display effect of the resources in the first scene picture.
Further, the transition control instruction includes a transition mode between scene pictures, and the scene pictures are switched according to the set transition mode, which enriches the ways in which scene pictures are switched and ensures the display effect.
It should be noted that the embodiment shown in fig. 3 is described taking as an example the case where, after the first scene picture is displayed, the event trigger condition associated with the first scene data that indicates scene switching is triggered. In another embodiment, the scene data corresponding to the currently displayed scene picture is also associated with other event trigger conditions; after the first scene picture is displayed, step 306 need not be executed, and instead the control engine executes the control instruction corresponding to whichever event trigger condition is currently satisfied.
In some embodiments, the target data model includes a position conversion trigger condition associated with the first scene data and a position conversion control instruction corresponding to that condition. If the position conversion trigger condition is triggered, the control engine changes the display position of the first scene picture according to the position conversion control instruction.
In one possible implementation of the foregoing embodiment, the position conversion control instruction further indicates a position change transition mode, and the display position of the first scene picture is converted from the second display position to the first display position according to the first display position and the position change transition mode indicated by the position conversion control instruction. The second display position is the display position before the position change, and the first display position is the display position after the position change. The position change transition mode is used for presenting the effect of the change in the display position of the first scene picture.
In some embodiments, the first scene data includes a plurality of pieces of sub-scene data, and the first scene picture is displayed based on first sub-scene data in the first scene data. The target data model includes a content update trigger condition associated with the first scene data and a content update control instruction corresponding to that condition. While the first scene picture is displayed, if the content update trigger condition is triggered, the control engine updates the display to a third scene picture corresponding to the first scene data according to second sub-scene data indicated by the content update control instruction.
The third scene picture displayed after the update is displayed based on the second sub-scene data, thereby updating the content of the first scene picture. The second sub-scene data is any piece of sub-scene data, among the plurality of pieces of sub-scene data included in the first scene data, that differs from the first sub-scene data.
In one possible implementation of the foregoing embodiment, the content update control instruction further indicates an in-scene transition mode, and the first scene picture is switched to the third scene picture according to the in-scene transition mode indicated by the content update control instruction. The in-scene transition mode is used for presenting the effect of switching between different scene pictures corresponding to the same scene data: the first scene picture and the third scene picture have the same display position and picture size but different contents, so the in-scene transition mode presents the effect of updating the content of the picture presented by the terminal.
In one possible implementation of the foregoing embodiment, the content update control instruction indicates an in-scene animation transition mode; according to the in-scene animation transition mode indicated by the content update control instruction, the in-scene transition animation corresponding to that mode is played, and after the animation finishes playing, the display is switched to the third scene picture.
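The in-scene content update above can be sketched as follows: the scene picture keeps its display position and size, and only the sub-scene content changes, optionally after an in-scene transition animation. All names and the data layout are illustrative assumptions.

```python
class ScenePicture:
    def __init__(self, scene_data, sub_index=0):
        self.scene_data = scene_data
        self.sub_index = sub_index          # which piece of sub-scene data is shown
        self.position = (0, 0)              # display position stays fixed
        self.size = (320, 480)              # picture size stays fixed
        self.played_animations = []

    def content(self):
        return self.scene_data["sub_scenes"][self.sub_index]

    def update_content(self, instruction):
        # Play the in-scene transition animation first, if the instruction names one.
        if instruction.get("animation"):
            self.played_animations.append(instruction["animation"])
        self.sub_index = instruction["sub_index"]   # position and size unchanged

scene_data = {"sub_scenes": ["trial content", "download content"]}
picture = ScenePicture(scene_data)
picture.update_content({"sub_index": 1, "animation": "fade"})
```

After the update, the same scene picture object presents the second sub-scene data, matching the description of the third scene picture.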
On the basis of the embodiment shown in fig. 2, taking as an example a terminal in which a target application is installed, the terminal displays a scene picture in the target application. As shown in fig. 4, the method is executed by the terminal and includes the following steps:
In step 401, in case of running a target application, configuration information is acquired, the configuration information including a target data type and instruction information indicating an instruction for creating a control engine in case of acquiring data belonging to the target data type.
In step 402, in the case of running a target application, a data model is acquired, the data model comprising at least one piece of data, the at least one piece of data comprising scene data, the scene data being used to render a scene picture.
The steps 401-402 are similar to the steps 201-202 described above and are not described in detail herein.
In step 403, in case it is determined that the data in the data model belongs to the target data type, a control engine is created in the target application based on the instruction information.
In the disclosed embodiments, the created control engine runs in the target application. The configuration information and the data model are acquired while the target application is running; the target application determines whether each piece of data in the data model belongs to the target data type, and in the case that all the data in the data model belongs to the target data type, the control engine is created in the target application based on the instruction information, so that the scene picture is displayed in the target application by the control engine running in it.
In step 404, a display area set by the target application is determined by the control engine.
The display area is used for displaying the scene picture. In the embodiment of the present disclosure, the target application is an arbitrary application that has the function of displaying the scene pictures corresponding to the data model, and a display area is set in the target application for this purpose. Therefore, after the control engine is created in the target application, the control engine determines the display area set by the target application, so that the scene pictures corresponding to the data model are displayed in that display area by the control engine.
In some embodiments, the target application obtains the position information corresponding to the display area and sends it to the control engine; the control engine receives the position information and determines the display area it indicates. The position information describes the position corresponding to the display area, for example, a certain position in the display screen of the terminal.
In step 405, a scene picture corresponding to the scene data is displayed in the display area based on the data model by the control engine.
This step is similar to step 204 described above and will not be described again.
In the scheme provided by the embodiments of the disclosure, the configuration information and the data model are acquired by the running application. Once a data model of the target data type is acquired after the configuration information has been acquired, a control engine is created in the target application based on the instruction information, and the scene picture is displayed, by the control engine running in the target application, in the display area set by the target application. This realizes a way of displaying scene pictures within an application and ensures the display accuracy of the scene pictures.
It should be noted that, on the basis of the embodiments shown in fig. 2 to 4, the embodiments of the present disclosure provide a data model configured by a developer according to a target protocol. The target protocol defines the standardized data types and interfaces required for configuring the data model corresponding to a resource, so that a developer can configure a data model conforming to the target protocol. The target protocol supports rich interactive advertisement definitions: a data model of a rich interactive advertisement can be defined according to the target protocol, including the advertisement's data embedding points, conversion jumps, interactive behavior definitions, and the like.
In some embodiments, the target protocol corresponds to a plurality of data model templates, and a developer obtains a data model by filling the required data into any one of the data model templates. Different data model templates are applicable to different types of resources; for example, taking advertisements as the resources, a first data model template is applicable to game-type advertisements and a second data model template to commodity-type advertisements. Different types of resources are displayed in different ways. For example, a game-type advertisement can display a trial-play scene picture of the game, in which the user can try the game, followed by a download scene picture of the game; a commodity-type advertisement displays a scene picture playing a video that introduces the commodity, and after the user triggers a certain operation in that scene picture, a commodity exchange page is displayed. Setting different data model templates for different types of resources therefore allows a developer configuring any type of resource to obtain the corresponding data model from the template matched with that type, which improves development efficiency.
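The per-type template idea can be sketched as a template listing the fields a developer must fill, with filling the template yielding a data model. The template contents below are assumptions based on the game and commodity examples above, not fields defined by the disclosure.

```python
# Hypothetical data model templates keyed by resource type.
DATA_MODEL_TEMPLATES = {
    "game":      ["trial_scene", "download_scene"],
    "commodity": ["video_scene", "exchange_page"],
}

def build_data_model(resource_type, values):
    """Fill a data model template with developer-supplied values."""
    template = DATA_MODEL_TEMPLATES[resource_type]
    missing = [field for field in template if field not in values]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return {field: values[field] for field in template}

model = build_data_model("game", {"trial_scene": "playable demo",
                                  "download_scene": "download page"})
```

A template rejects incomplete configurations, which mirrors how a standardized protocol lets tooling validate a developer's data model before delivery.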
In some embodiments, the data model includes a plurality of pieces of data including scene data and at least one of lifecycle data, scene location relationship data, a set of triggers, or default condition data.
The lifecycle data indicates the lifecycle of the data model; for example, it indicates the display time, disappearance time, or display duration of the scene pictures corresponding to the data model. The scene position relation data describes the relation between the display positions of the scene pictures corresponding to the pieces of scene data in the data model. For example, if the data model includes two pieces of scene data, the scene position relation data may indicate that a first display position is moved leftwards by 10 pixel distances relative to a second display position, where the first display position is the display position of the scene picture corresponding to the first piece of scene data, the second display position is the display position of the scene picture corresponding to the second piece of scene data, and a pixel distance is the size of one pixel (if pixels are represented as square boxes, the side length of each box is one pixel distance). The trigger set includes a plurality of trigger identifications and the corresponding triggers, each trigger representing an event trigger condition. The default condition data describes the state when the scene pictures corresponding to the data model are displayed and can serve as a basis for judging whether triggers fire. Taking the data model of an advertisement as an example, table 1 lists a plurality of pieces of data included in the data model of the advertisement, together with the type, the label, and the related description of each piece of data.
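The scene position relation example above (a first display position shifted 10 pixel distances left of a second) can be written as a small computation. The coordinate convention (x grows rightwards) and the relation encoding are assumptions for the sketch.

```python
def apply_position_relation(base_position, relation):
    """Compute a display position relative to base_position."""
    x, y = base_position
    dx = {"left": -1, "right": 1}[relation["direction"]] * relation["pixels"]
    return (x + dx, y)

second_display_position = (100, 40)
first_display_position = apply_position_relation(
    second_display_position, {"direction": "left", "pixels": 10})
```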
TABLE 1
In some embodiments, the target protocol defines a plurality of target data types required for configuring the data model, for example, the plurality of target data types defined by the target protocol include a control engine type, a display area type, a scene data type, a lifecycle type, a control template type, a trigger type, a transition mode type, a control instruction type, and a condition type. In the embodiment of the disclosure, the data of the multiple target data types are contained in the data model in a sub-data model manner.
In the embodiment of the disclosure, the control engine can convert the data model into scene pictures for display and can also realize interaction with the user based on the user's triggering operations. The content contained in the sub data model corresponding to the control engine type (the control engine lifecycle model) is shown in table 2: the control engine lifecycle model includes a trigger identifier 1 and a trigger identifier 2, where trigger identifier 1 indicates the trigger to be fired when the control engine is created successfully, trigger identifier 2 indicates the trigger to be fired when the control engine disappears, and both identifiers are represented as character strings. In the embodiment of the disclosure, during the display of scene pictures according to the data model, in response to an operation closing the scene pictures, which indicates that the scene pictures corresponding to the data model are no longer to be displayed, the control engine created for the data model is deleted; the moment of deletion is the moment at which the control engine disappears.
TABLE 2
Field | Type | Label | Description
Trigger identification 1 | Character string | Repeatable | Trigger to be fired when the control engine is displayed (created successfully)
Trigger identification 2 | Character string | Repeatable | Trigger to be fired when the control engine disappears
The contents included in the sub data model (scene data model) corresponding to the scene data type are shown in table 3.
TABLE 3 Table 3
Field | Type | Description
Scene identification | Character string | (none)
Scene content | Content rendering model | Indicates the content included in the scene
Profile information | Character string | Describes the use or effect of the scene
Scene lifecycle | Scene lifecycle | Indicates the scene lifecycle
For the contents contained in the sub data model (transition mode model) corresponding to the transition mode type, as shown in table 4, the transition mode model includes a plurality of transition modes, the type of each transition mode, and the related description of each transition mode.
TABLE 4 Table 4
It should be noted that, on the basis of the embodiment shown in fig. 3, the event trigger condition is implemented by a trigger included in the data model; that is, the data model includes a trigger associated with the scene data and a control instruction corresponding to the trigger. The trigger may be of any type, for example a single trigger, a conditional trigger, a heartbeat trigger, or a delay trigger. A single trigger is triggered when an operation is detected; a conditional trigger is triggered when a certain condition is satisfied; a heartbeat trigger is triggered once every first time interval, for example once every 3 seconds; and a delay trigger is triggered after a second time interval has elapsed since the triggering operation was detected. For the contents contained in the sub data model (trigger model) corresponding to the trigger type, as shown in table 5, the trigger model includes a plurality of types of triggers, the type of each trigger, and the related description of each trigger.
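The four trigger kinds named above can be sketched as follows, driven by a logical clock so behaviour is deterministic. The class and method names are assumptions, not part of the disclosed trigger model.

```python
class SingleTrigger:
    def fires(self, event, now):
        # Fires when the triggering operation is detected.
        return event == "operation_detected"

class ConditionalTrigger:
    def __init__(self, condition):
        self.condition = condition
    def fires(self, event, now):
        # Fires when the configured condition is satisfied.
        return self.condition(event)

class HeartbeatTrigger:
    def __init__(self, interval):
        self.interval = interval            # first time interval, e.g. 3 seconds
    def fires(self, event, now):
        # Fires once every interval ticks of the clock.
        return now % self.interval == 0

class DelayTrigger:
    def __init__(self, delay):
        self.delay = delay                  # second time interval
        self.detected_at = None
    def fires(self, event, now):
        # Starts counting when the operation is detected, fires after the delay.
        if event == "operation_detected" and self.detected_at is None:
            self.detected_at = now
            return False
        return self.detected_at is not None and now - self.detected_at >= self.delay

heartbeat = HeartbeatTrigger(interval=3)
delay = DelayTrigger(delay=2)
delay.fires("operation_detected", now=1)    # operation detected, countdown starts
```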
TABLE 5
In the embodiment of the disclosure, the control instruction includes various types of control instructions, for example a transition control instruction, a buried point control instruction, a video playing control instruction, a page address jump control instruction, a condition change control instruction, a timed-trigger cancellation instruction, a custom control instruction, a control instruction for firing a trigger, a control instruction for conversion jumps, and a control instruction for step execution. For the contents contained in the sub data model (control instruction model) corresponding to the control instruction type, as shown in table 6, the control instruction model includes a plurality of types of control instructions, the type of each control instruction, and the related description of each control instruction.
TABLE 6
In the embodiment of the present disclosure, a data model configured according to the target protocol may include the data in tables 1 to 6 above. For example, an advertisement data model can be configured according to the target protocol and the data described in tables 1 to 6, after which the scene pictures included in the advertisement can be displayed according to the embodiments shown in fig. 2 to 4. The data model provided by the embodiments of the disclosure enables scene picture display of interactive resources; for example, a configured advertisement data model enables scene picture display of rich interactive advertisements.
On the basis of the embodiments shown in fig. 2 to 4 and the data models provided above, the embodiments of the present disclosure further provide a flow of a scene picture display method. Taking an advertisement data model as an example, a target application is installed in a terminal, and while the target application is running, scene pictures are displayed using the acquired data model. As shown in fig. 5, the method is executed by the terminal and includes the following steps:
in step 501, configuration information is obtained, the configuration information including a target data type and instruction information.
In step 502, an advertisement data model issued by a server is acquired.
In step 503, in the event that it is determined that the data in the advertisement data model all belong to the target data type, a control engine is created for the advertisement data model based on the instruction information.
In step 504, a display area set by the target application is determined by the control engine.
In step 505, in the case that a first trigger in the advertisement data model is triggered, the control engine determines, from a plurality of pieces of scene data included in the advertisement data model, first scene data indicated by a first control instruction corresponding to the first trigger.
The first trigger indicates a trigger triggered when the control engine is successfully created, and the first control instruction indicates to display a scene picture corresponding to the first scene data.
In step 506, the control engine displays a first scene picture in the display area based on the first scene data and the display position of the first scene data indicated by the scene position relation data in the advertisement data model.
In step 507, when a second trigger associated with the first scene data is triggered, the control engine switches the display from the first scene picture to a second scene picture based on a second control instruction corresponding to the second trigger.
The transition trigger condition in the embodiment shown in fig. 3 is implemented by the second trigger, and the second control instruction is the transition control instruction in the embodiment shown in fig. 3. The step 507 is similar to the step 306, and will not be described again.
In step 508, when the third trigger associated with the first scene data is triggered, the control engine obtains the operation data according to the third control instruction corresponding to the third trigger, and reports the operation data to the server.
The third control instruction indicates to acquire operation data and report the operation data, i.e. the third control instruction is a buried point control instruction.
In step 509, when the fourth trigger associated with the first scene data is triggered, the control engine accesses the page corresponding to the page address included in the fourth control instruction according to the fourth control instruction corresponding to the fourth trigger.
The fourth control instruction indicates to access the page corresponding to the page address, that is, the fourth control instruction is a page address jump control instruction.
In step 510, when a fifth trigger associated with the first scene data is triggered, the control engine plays the video indicated by the fifth control instruction in the display area according to the fifth control instruction corresponding to the fifth trigger.
The fifth control command indicates video playing, i.e. the fifth control command is a video playing control command.
The first to fifth triggers may be any of the triggers in table 5; the present disclosure is not limited thereto.
In the embodiment shown in fig. 5, only the case where the plurality of triggers associated with the first scene picture are triggered is described as an example. In another embodiment, when the second scene picture is displayed and a plurality of triggers are associated with the second scene data, those triggers are triggered in the same manner as the triggers associated with the first scene data.
In the scheme provided by the embodiment of the disclosure, a developer only needs to configure the data model corresponding to the resource, and does not need to develop rendering logic codes for the data model independently, so that development efficiency is improved.
In addition, according to the scheme provided by the embodiments of the disclosure, the scene pictures corresponding to a data model are displayed by configuring the data model, without a developer developing the resource itself and then taking it through the complex approval and launch flow that makes the delivery cycle of a resource overly long. The terminal thus gains the capability of rapidly displaying the scene pictures contained in a resource, and so do the applications installed on the terminal, which reduces the access cost of resources and the frequency of releases required to publish them. For example, the terminal can quickly display advertisements in an application. Moreover, the scheme provided by the embodiments of the disclosure is applicable to terminals running various operating systems.
In addition, in the scheme provided by the embodiments of the disclosure, the data size of the data model is small, and PB (Protocol Buffers, a data serialization format) can be adopted for transmission, so the transmission load is small when the data model is transmitted. The engine service architecture can be used to deliver the data model corresponding to an advertisement on a per-advertisement or per-request basis, that is, advertisements can be delivered in as short a time as possible. Furthermore, since development is based on the native environment, when an advertisement is updated only the data model corresponding to the advertisement needs to be updated, without releasing a version update, realizing hot update of the advertisement.
On the basis of the embodiment shown in fig. 5, the disclosure further provides a schematic diagram of scene picture switching. As shown in fig. 6, a first scene picture 602 of a trial-play game advertisement is displayed in a display area 601 set by the target application. The first scene picture 602 includes an image control 603, a text control 604, and an operation control 605: a game image is displayed in the image control 603, text information describing the game is displayed in the text control 604, and a countdown animation including the words "free trial" is displayed in the operation control 605. After the countdown of the operation control 605 ends, the state of the operation control 605 switches from a non-triggerable state to a triggerable state; a trigger is fired when a triggering operation on the operation control 605 is detected, and the control instruction corresponding to that trigger indicates switching the display to a second scene picture 606 according to the set transition mode.
After the countdown ends, the user clicks the operation control 605, the trigger associated with the operation control 605 fires, and the control engine executes the corresponding control instruction: based on the second scene data, the first scene picture 602 is switched to the second scene picture 606. The second scene picture includes an image control 607, a text control 608, and a "download" operation control 609: a game image is displayed in the image control 607, text information describing the game is displayed in the text control 608, and the "download" operation control 609 is used for jumping to the game download page. The "download" operation control 609 is associated with a trigger whose control instruction indicates jumping to the page indicated by the game download address; if the user clicks the "download" operation control 609, the trigger associated with it fires and the display jumps to the game download page.
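The interaction in the figure description can be sketched as a small state machine: the operation control starts non-triggerable during the countdown, becomes triggerable when the countdown ends, and clicking it fires its trigger, which switches the displayed scene picture. All names are illustrative assumptions.

```python
class OperationControl:
    def __init__(self, countdown, on_trigger):
        self.countdown = countdown
        self.triggerable = False
        self.on_trigger = on_trigger

    def tick(self):
        # Advance the countdown animation by one step.
        if self.countdown > 0:
            self.countdown -= 1
        if self.countdown == 0:
            self.triggerable = True      # switch out of the non-triggerable state

    def click(self):
        if self.triggerable:
            self.on_trigger()            # fire the associated trigger
            return True
        return False                     # clicks during the countdown are ignored

displayed = ["first_scene_picture"]
control = OperationControl(
    countdown=2,
    on_trigger=lambda: displayed.append("second_scene_picture"))
control.click()                          # ignored: countdown still running
control.tick(); control.tick()           # countdown reaches zero
control.click()                          # trigger fires, scene picture switches
```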
Fig. 7 is a block diagram of a scene picture display device according to an exemplary embodiment. Referring to fig. 7, the apparatus includes:
An acquisition unit 701 configured to execute acquisition configuration information including a target data type and instruction information indicating an instruction for creating a control engine in the case where data belonging to the target data type is acquired;
an acquisition unit 701 further configured to perform acquiring a data model, the data model comprising at least one piece of data, the at least one piece of data comprising scene data, the scene data being used for rendering a scene;
a creation unit 702 configured to execute creating a control engine based on the instruction information in a case where it is determined that at least one piece of data belongs to the target data type;
a display unit 703 configured to execute displaying, by the control engine, a scene picture corresponding to the scene data based on the data model.
In some embodiments, as shown in fig. 8, the display unit 703 includes:
a determining subunit 7031 configured to perform determining, by the control engine, first scene data from among a plurality of pieces of scene data included in the data model;
the display subunit 7032 is configured to display, by the control engine, the first scene picture based on the first scene data.
In some embodiments, the first scene data includes resource data, and the display subunit 7032 is configured to execute the control engine to display, based on the first scene data, a first scene picture including resources corresponding to the resource data.
In some embodiments, the display subunit 7032 is configured to perform determining, by the control engine, a data format to which the resource data belongs; and rendering the resource data into a control in the first scene picture according to a control template matched with the data format by a control engine, wherein the control contains resources corresponding to the resource data.
In some embodiments, the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, as shown in fig. 8, and the apparatus further includes:
And the execution unit 704 is configured to execute the control instruction corresponding to the event triggering condition when the event triggering condition is currently met through the control engine.
In some embodiments, the scene picture is a first scene picture corresponding to first scene data in the data model, the event trigger condition associated with the first scene data includes a transition trigger condition, and the control instruction corresponding to the transition trigger condition includes a transition control instruction indicating switching the display to a second scene picture corresponding to second scene data in the data model; and an execution unit 704 configured to execute switching, by the control engine, the display of the first scene picture to the second scene picture based on the transition control instruction in the case that the transition trigger condition is currently satisfied.
In some embodiments, the transition control instruction further indicates a transition mode between the first scene and the second scene, and the execution unit 704 is configured to execute the switching of the first scene to be displayed as the second scene in the transition mode based on the transition control instruction by the control engine in case the transition trigger condition is currently met.
In some embodiments, the execution unit 704 is configured to perform any one of the following:
the transition mode is a first type transition mode, and the control engine, in the case that the transition trigger condition is currently satisfied, cancels the display of the first scene picture and simultaneously displays the second scene picture based on the transition control instruction;
the transition mode is a second type transition mode, and the control engine, in the case that the transition trigger condition is currently satisfied, plays the transition animation matched with the second type transition mode and displays the second scene picture after the transition animation finishes playing.
In some embodiments, the apparatus further comprises:
a determining unit 705 configured to execute, for each piece of data in the data model, determining that the data belongs to the target data type in the case that the data includes a target character string.
In some embodiments, the control engine operates in a target application, as shown in fig. 8, the apparatus further comprises:
a determining unit 705 configured to perform determination of a display area set by the target application by the control engine, the display area being used for displaying the scene picture;
A display unit 703 configured to display a scene picture in the display area based on the data model by the control engine.
The specific manner in which the individual units perform the operations in relation to the apparatus of the above embodiments has been described in detail in relation to the embodiments of the method and will not be described in detail here.
In an exemplary embodiment, there is also provided an electronic device including:
one or more processors;
volatile or non-volatile memory for storing one or more processor-executable instructions;
wherein the one or more processors are configured to perform the steps performed by the terminal in the scene picture display method described above.
In some embodiments, the electronic device is provided as a terminal. Fig. 9 is a block diagram illustrating the structure of a terminal 900 according to an exemplary embodiment. The terminal 900 may be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 900 may also be referred to by other names such as user device, portable terminal, laptop terminal, or desktop terminal.
The terminal 900 includes: a processor 901 and a memory 902.
Processor 901 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 901 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). Processor 901 may also include a main processor and a coprocessor; the main processor, also referred to as a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 901 may integrate a GPU (Graphics Processing Unit) for rendering and drawing the content to be displayed by the display screen. In some embodiments, the processor 901 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 902 may include one or more computer-readable storage media, which may be non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 902 is used to store at least one program code for execution by processor 901 to implement the scene picture display method provided by the method embodiments in the present disclosure.
In some embodiments, the terminal 900 may further optionally include: a peripheral interface 903 and at least one peripheral device. The processor 901, memory 902, and peripheral interface 903 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral interface 903 via a bus, signal line, or circuit board. Specifically, the peripheral devices include: at least one of a radio frequency circuit 904, a display 905, a camera assembly 906, an audio circuit 907, a positioning component 908, and a power supply 909.
The peripheral interface 903 may be used to connect at least one I/O (Input/Output) related peripheral device to the processor 901 and the memory 902. In some embodiments, the processor 901, memory 902, and peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902, and the peripheral interface 903 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 904 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 904 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 904 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 904 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may also include NFC (Near Field Communication) related circuits, which is not limited by the present disclosure.
The display 905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 905 is a touch display, the display 905 also has the ability to capture touch signals at or above its surface. The touch signal may be input as a control signal to the processor 901 for processing. At this time, the display 905 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 905, disposed on the front panel of the terminal 900; in other embodiments, there may be at least two displays 905, respectively disposed on different surfaces of the terminal 900 or in a folded design; in still other embodiments, the display 905 may be a flexible display disposed on a curved surface or a folded surface of the terminal 900. The display 905 may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly-shaped screen. The display 905 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 906 is used to capture images or video. Optionally, the camera assembly 906 includes a front camera and a rear camera. The front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function, a VR (Virtual Reality) shooting function, or other fused shooting functions. In some embodiments, camera assembly 906 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuit 907 may include a microphone and a speaker. The microphone is used to collect sound waves of the user and the environment, convert the sound waves into electrical signals, and input them to the processor 901 for processing, or input them to the radio frequency circuit 904 for voice communication. For stereo acquisition or noise reduction, there may be a plurality of microphones disposed at different portions of the terminal 900. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. The speaker may be a conventional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 907 may also include a headphone jack.
The positioning component 908 is used to determine the current geographic location of the terminal 900 to enable navigation or LBS (Location Based Service). The positioning component 908 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 909 is used to supply power to the various components in the terminal 900. The power supply 909 may use alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 909 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also be used to support fast-charge technology.
In some embodiments, terminal 900 can further include one or more sensors 910. The one or more sensors 910 include, but are not limited to: acceleration sensor 911, gyroscope sensor 912, pressure sensor 913, fingerprint sensor 914, optical sensor 915, and proximity sensor 916.
The acceleration sensor 911 can detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 900. For example, the acceleration sensor 911 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 901 may control the display 905 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 911. The acceleration sensor 911 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 912 may detect a body direction and a rotation angle of the terminal 900, and the gyro sensor 912 may collect a 3D motion of the user on the terminal 900 in cooperation with the acceleration sensor 911. The processor 901 may implement the following functions according to the data collected by the gyro sensor 912: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 913 may be disposed at a side frame of the terminal 900 and/or at a lower layer of the display 905. When the pressure sensor 913 is disposed at a side frame of the terminal 900, a grip signal of the user on the terminal 900 may be detected, and the processor 901 performs left-right hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 913. When the pressure sensor 913 is disposed at the lower layer of the display 905, the processor 901 controls an operability control on the UI according to a pressure operation of the user on the display 905. The operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 914 is used to collect the user's fingerprint, and either the processor 901 identifies the user's identity based on the fingerprint collected by the fingerprint sensor 914, or the fingerprint sensor 914 itself identifies the user's identity based on the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 901 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 914 may be disposed on the front, back, or side of the terminal 900. When a physical key or a vendor logo is provided on the terminal 900, the fingerprint sensor 914 may be integrated with the physical key or the vendor logo.
The optical sensor 915 is used to collect the intensity of ambient light. In one embodiment, the processor 901 may control the display brightness of the display 905 based on the ambient light intensity collected by the optical sensor 915. Specifically, when the ambient light intensity is high, the display brightness of the display 905 is turned up; when the ambient light intensity is low, the display brightness of the display 905 is turned down. In another embodiment, the processor 901 may also dynamically adjust the shooting parameters of the camera assembly 906 based on the ambient light intensity collected by the optical sensor 915.
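The brightness adjustment just described can be sketched as a simple control rule. The lux thresholds and the 0-to-1 brightness scale below are invented for illustration, not values from this disclosure.

```python
# Sketch of the ambient-light brightness adjustment described above; the lux
# thresholds and the 0..1 brightness scale are invented for illustration.

def adjust_brightness(current, ambient_lux, low=50.0, high=500.0, step=0.1):
    if ambient_lux > high:
        return min(1.0, current + step)   # bright environment: turn brightness up
    if ambient_lux < low:
        return max(0.0, current - step)   # dark environment: turn brightness down
    return current                        # comfortable range: leave unchanged
```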
A proximity sensor 916, also referred to as a distance sensor, is provided on the front panel of the terminal 900. Proximity sensor 916 is used to collect the distance between the user and the front of terminal 900. In one embodiment, when the proximity sensor 916 detects that the distance between the user and the front face of the terminal 900 gradually decreases, the processor 901 controls the display 905 to switch from the bright screen state to the off screen state; when the proximity sensor 916 detects that the distance between the user and the front surface of the terminal 900 gradually increases, the processor 901 controls the display 905 to switch from the off-screen state to the on-screen state.
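The proximity-driven screen switching can likewise be sketched as a comparison of successive distance readings. The state names and the comparison rule are illustrative assumptions, not the patented implementation.

```python
# Sketch of the proximity-driven screen switching described above; the state
# names and the comparison rule are illustrative assumptions.

def next_screen_state(previous_distance, current_distance, state):
    if current_distance < previous_distance:
        return "off"   # user approaching the front face: switch the screen off
    if current_distance > previous_distance:
        return "on"    # user moving away: switch the screen back on
    return state       # unchanged distance: keep the current state
```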
Those skilled in the art will appreciate that the structure shown in fig. 9 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided; when the instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the steps performed by the terminal in the above-described scene picture display method. Alternatively, the storage medium may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product is also provided; when the computer program product is executed by a processor of an electronic device, the electronic device is enabled to perform the steps performed by the terminal in the above-described scene picture display method.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (22)

1. A scene picture display method, the method comprising:
acquiring configuration information, wherein the configuration information comprises a target data type and instruction information, and the instruction information indicates an instruction for creating a control engine when data belonging to the target data type is acquired; the control engine is used for running data belonging to the target data type, the target data type is defined in a target protocol, the target protocol is used for defining the data type of data required for displaying a certain resource, the target protocol corresponds to a plurality of data model templates, and different data model templates are applicable to different types of resources;
acquiring a data model, wherein the data model comprises at least one piece of data, the at least one piece of data comprises scene data, and the scene data is used for rendering a scene picture; the data model is obtained by filling data in any data model template;
creating the control engine based on the instruction information in case it is determined that the at least one piece of data belongs to the target data type;
and displaying, by the control engine, a scene picture corresponding to the scene data based on the data model.
2. The method of claim 1, wherein displaying, by the control engine, a scene picture corresponding to the scene data based on the data model, comprises:
determining, by the control engine, first scene data from a plurality of pieces of scene data included in the data model;
and displaying, by the control engine, a first scene picture based on the first scene data.
3. The method of claim 2, wherein the first scene data comprises resource data, wherein displaying, by the control engine, a first scene picture based on the first scene data comprises:
and displaying, by the control engine based on the first scene data, a first scene picture comprising a resource corresponding to the resource data.
4. The method of claim 3, wherein displaying, by the control engine, based on the first scene data, a first scene picture including a resource corresponding to the resource data, comprises:
determining, by the control engine, a data format to which the resource data belongs;
and rendering, by the control engine, the resource data into a control in the first scene picture according to a control template matched with the data format, wherein the control comprises a resource corresponding to the resource data.
5. The method of claim 1, wherein the data model further comprises an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and wherein after displaying, by the control engine, a scene picture corresponding to the scene data based on the data model, the method further comprises:
and executing a control instruction corresponding to the event triggering condition by the control engine under the condition that the event triggering condition is currently met.
6. The method of claim 5, wherein the scene is a first scene corresponding to first scene data in the data model, the event trigger condition associated with the first scene data comprises a transition trigger condition, and the control instruction corresponding to the transition trigger condition comprises a transition control instruction indicating to switch to display a second scene corresponding to second scene data in the data model; executing, by the control engine, a control instruction corresponding to the event trigger condition when the event trigger condition is currently satisfied, including:
and under the condition that the transition trigger condition is currently met, the control engine switches and displays the first scene picture as the second scene picture based on the transition control instruction.
7. The method of claim 6, wherein the transition control instruction further instructs a transition mode between the first scene and the second scene, the switching, by the control engine, the first scene to be displayed as the second scene based on the transition control instruction if the transition trigger condition is currently satisfied, comprising:
and under the condition that the transition trigger condition is currently met, the control engine switches and displays the first scene picture as the second scene picture according to the transition mode based on the transition control instruction.
8. The method according to claim 7, wherein the switching, by the control engine, the first scene picture to be displayed as the second scene picture in the transition mode based on the transition control instruction in the case where the transition trigger condition is currently satisfied, includes any one of:
the transition mode is a first type transition mode, and the control engine is used for displaying the second scene picture while canceling the display of the first scene picture based on the transition control instruction under the condition that the transition trigger condition is currently met;
The transition mode is a second-class transition mode, and the control engine is used for playing a transition animation matched with the second-class transition mode under the condition that the transition trigger condition is met currently, and displaying the second scene picture after the transition animation is played.
9. The method according to any one of claims 1-8, wherein, in case it is determined that the at least one piece of data belongs to the target data type, before creating the control engine based on the instruction information, the method further comprises:
for each piece of data in the data model, determining that the data belongs to the target data type in the case that the data comprises a target character string;
wherein the target string is used to represent the target data type.
10. The method according to any of claims 1-8, wherein the control engine is running in a target application, the method further comprising, after creating the control engine based on the instruction information in case it is determined that the at least one piece of data belongs to the target data type:
determining a display area set by the target application through the control engine, wherein the display area is used for displaying a scene picture;
The displaying, by the control engine, a scene picture corresponding to the scene data based on the data model, including:
Displaying, by the control engine, the scene picture in the display area based on the data model.
11. A scene picture display apparatus, the apparatus comprising:
an acquisition unit configured to acquire configuration information, the configuration information including a target data type and instruction information, the instruction information indicating an instruction for creating a control engine in a case where data belonging to the target data type is acquired; the control engine is used for running data belonging to the target data type, the target data type is defined in a target protocol, the target protocol is used for defining the data type of data required for displaying a certain resource, the target protocol corresponds to a plurality of data model templates, and different data model templates are applicable to different types of resources;
the acquisition unit being further configured to acquire a data model, the data model comprising at least one piece of data, the at least one piece of data comprising scene data, the scene data being used to render a scene picture; the data model is obtained by filling data in any data model template;
a creation unit configured to create the control engine based on the instruction information in a case where it is determined that the at least one piece of data belongs to the target data type;
and a display unit configured to display, by the control engine, a scene picture corresponding to the scene data based on the data model.
12. The apparatus of claim 11, wherein the display unit comprises:
A determining subunit configured to perform determining, by the control engine, first scene data from a plurality of pieces of scene data included in the data model;
A display subunit configured to display, by the control engine, a first scene picture based on the first scene data.
13. The apparatus of claim 12, wherein the first scene data comprises resource data, and wherein the display subunit is configured to display, via the control engine, a first scene picture comprising a resource to which the resource data corresponds based on the first scene data.
14. The apparatus of claim 13, wherein the display subunit is configured to perform determining, by the control engine, a data format to which the resource data belongs; and rendering the resource data into a control in the first scene picture according to a control template matched with the data format by the control engine, wherein the control comprises resources corresponding to the resource data.
15. The apparatus of claim 11, wherein the data model further comprises event trigger conditions associated with the scene data and control instructions corresponding to the event trigger conditions, the apparatus further comprising:
And the execution unit is configured to execute the control instruction corresponding to the event triggering condition under the condition that the event triggering condition is currently met through the control engine.
16. The apparatus of claim 15, wherein the scene is a first scene corresponding to first scene data in the data model, the event trigger condition associated with the first scene data comprises a transition trigger condition, and the control instruction corresponding to the transition trigger condition comprises a transition control instruction indicating to switch to display a second scene corresponding to second scene data in the data model; the execution unit is configured to execute the switching display of the first scene picture as the second scene picture based on the transition control instruction under the condition that the transition trigger condition is currently met by the control engine.
17. The apparatus of claim 16, wherein the transition control instruction further indicates a transition mode between the first scene and the second scene, and wherein the execution unit is configured to execute, by the control engine, switching the first scene to be displayed as the second scene in the transition mode based on the transition control instruction if the transition trigger condition is currently satisfied.
18. The apparatus of claim 17, wherein the execution unit is configured to perform any one of:
the transition mode is a first type transition mode, and the control engine is used for displaying the second scene picture while canceling the display of the first scene picture based on the transition control instruction under the condition that the transition trigger condition is currently met;
The transition mode is a second-class transition mode, and the control engine is used for playing a transition animation matched with the second-class transition mode under the condition that the transition trigger condition is met currently, and displaying the second scene picture after the transition animation is played.
19. The apparatus according to any one of claims 11-18, wherein the apparatus further comprises:
A determining unit configured to perform, for each piece of data in the data model, determining that the data belongs to the target data type in a case where the data includes a target character string;
wherein the target string is used to represent the target data type.
20. The apparatus of any of claims 11-18, wherein the control engine is operating in a target application, the apparatus further comprising:
a determining unit configured to determine, through the control engine, a display area set by the target application, the display area being used for displaying a scene picture;
the display unit is configured to display the scene picture in the display area based on the data model by the control engine.
21. An electronic device, the electronic device comprising:
A processor;
A memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the scene picture display method of any of claims 1 to 10.
22. A non-transitory computer-readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the scene picture display method of any of claims 1 to 10.
CN202210344596.1A 2022-03-31 2022-03-31 Scene picture display method and device, electronic equipment and storage medium Active CN114816622B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210344596.1A CN114816622B (en) 2022-03-31 2022-03-31 Scene picture display method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114816622A (en) 2022-07-29
CN114816622B (en) 2024-04-30

Family

ID=82532078


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105844694A (en) * 2015-08-24 2016-08-10 鲸彩在线科技(大连)有限公司 Game data generating method, game data uploading method, game data generating device, and game data uploading device
CN110062271A (en) * 2019-04-28 2019-07-26 腾讯科技(成都)有限公司 Method for changing scenes, device, terminal and storage medium
CN110941464A (en) * 2018-09-21 2020-03-31 阿里巴巴集团控股有限公司 Light exposure method, device, system and storage medium
CN111432001A (en) * 2020-03-24 2020-07-17 北京字节跳动网络技术有限公司 Method, apparatus, electronic device, and computer-readable medium for jumping scenes
CN111767503A (en) * 2020-07-29 2020-10-13 腾讯科技(深圳)有限公司 Game data processing method and device, computer and readable storage medium
CN112023399A (en) * 2020-08-21 2020-12-04 上海完美时空软件有限公司 Game scene data processing method and device, storage medium and computer equipment




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant