CN114816622A - Scene picture display method and device, electronic equipment and storage medium - Google Patents

Scene picture display method and device, electronic equipment and storage medium

Info

Publication number
CN114816622A
Authority
CN
China
Prior art keywords
data
scene
control engine
control
scene picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210344596.1A
Other languages
Chinese (zh)
Other versions
CN114816622B (en)
Inventor
李伟鹏
蔡晓华
杨小刚
胡方正
鞠达豪
孙弘法
杨凯丽
朱彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202210344596.1A
Publication of CN114816622A
Application granted
Publication of CN114816622B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/61Scene description

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure relates to a scene picture display method and apparatus, an electronic device, and a storage medium, and belongs to the field of computer technologies. The method includes: acquiring configuration information, where the configuration information includes a target data type and instruction information, the instruction information indicating an instruction to create a control engine when data belonging to the target data type is acquired; acquiring a data model, where the data model includes at least one piece of data, the at least one piece of data includes scene data, and the scene data is used for rendering a scene picture; creating the control engine based on the instruction information when it is determined that the at least one piece of data belongs to the target data type; and displaying, through the control engine, a scene picture corresponding to the scene data based on the data model. In the embodiments of the present disclosure, a developer only needs to configure a data model whose data belongs to the target data type in order to display the scene picture corresponding to that data model; no rendering logic code needs to be developed separately for the data model, which improves development efficiency.

Description

Scene picture display method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a scene picture display method and apparatus, an electronic device, and a storage medium.
Background
With the development of Internet technology, resources are delivered in more and more scenarios. Usually, developers must develop both the resource data and the corresponding rendering logic code, so that a device acquiring the resource can later render the resource data into the resource according to that code when the resource is released. This approach makes the development process complex and development efficiency low.
Disclosure of Invention
The present disclosure provides a scene picture display method and apparatus, an electronic device, and a storage medium that improve development efficiency.
According to an aspect of the embodiments of the present disclosure, there is provided a scene picture display method, including:
acquiring configuration information, wherein the configuration information comprises a target data type and instruction information, and the instruction information indicates an instruction for creating a control engine under the condition that data belonging to the target data type is acquired;
acquiring a data model, wherein the data model comprises at least one piece of data, the at least one piece of data comprises scene data, and the scene data is used for rendering a scene picture;
creating the control engine based on the instruction information if it is determined that the at least one piece of data belongs to the target data type;
and displaying a scene picture corresponding to the scene data based on the data model through the control engine.
In the solution provided by the embodiments of the present disclosure, the acquired configuration information defines a target data type and instruction information for creating a control engine when data of that type is acquired. Once a data model whose data belongs to the target data type is acquired, the control engine is created based on the instruction information, and the data model is run by the control engine to display the scene picture corresponding to the scene data in the data model. In other words, based on the configuration information, any data model whose data belongs to the target data type can be run. A developer therefore only needs to configure such a data model in order to display its corresponding scene picture, without separately developing rendering logic code for the data model, which improves development efficiency.
In some embodiments, the displaying, by the control engine and based on the data model, a scene picture corresponding to the scene data includes:
determining, by the control engine, first scene data from a plurality of pieces of scene data included in the data model;
displaying, by the control engine, a first scene screen based on the first scene data.
In the embodiments of the present disclosure, each piece of scene data included in the data model is used to display one scene picture. The control engine is capable of running the data model; after the control engine is created, it determines in the data model the first scene data to be displayed first and displays the first scene picture accordingly, so that display of the scene pictures corresponding to the scene data in the data model is controlled through the control engine.
In some embodiments, the first scene data includes resource data, and the displaying, by the control engine, a first scene picture based on the first scene data includes:
displaying, by the control engine, a first scene screen including a resource corresponding to the resource data based on the first scene data.
In the embodiment of the disclosure, the scene data includes the resource data, and the resource data in the scene data can be rendered into the resource for display, so that the content included in the scene picture is enriched, and the display effect of the scene picture is ensured.
In some embodiments, the displaying, by the control engine and based on the first scene data, a first scene picture including a resource corresponding to the resource data includes:
determining, by the control engine, a data format to which the resource data belongs;
rendering the resource data into a control in the first scene picture according to a control template matched with the data format through the control engine, wherein the control contains a resource corresponding to the resource data.
In the embodiments of the present disclosure, considering that resource data of different data formats should be presented in different styles, each resource is rendered with the control template matched to the data format of its resource data, so that the display style of the resource in the first scene picture matches that data format, ensuring the display effect of the resource in the first scene picture.
In some embodiments, the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and after the scene picture corresponding to the scene data is displayed based on the data model by the control engine, the method further includes:
and executing a control instruction corresponding to the event trigger condition through the control engine under the condition that the event trigger condition is currently met.
In the embodiment of the present disclosure, the data model includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and the display of the scene picture can be controlled by the control engine in combination with the event trigger condition and the control instruction configured in the data model, that is, the data model is run without developing an additional processing logic code, thereby improving the development efficiency.
In some embodiments, the scene picture is a first scene picture corresponding to first scene data in the data model, the event trigger condition associated with the first scene data includes a transition trigger condition, the control instruction corresponding to the transition trigger condition includes a transition control instruction, and the transition control instruction instructs to switch to display a second scene picture corresponding to second scene data in the data model; the executing, by the control engine, the control instruction corresponding to the event trigger condition under the condition that the event trigger condition is currently satisfied includes:
and switching and displaying the first scene picture into the second scene picture based on the transition control instruction by the control engine under the condition that the transition triggering condition is met currently.
In the embodiments of the present disclosure, once it is determined that the transition trigger condition is currently satisfied, the control engine can render the second scene picture based on the second scene data indicated by the transition control instruction corresponding to that condition, and switch the display from the first scene picture to the second scene picture. That is, the control engine controls the switching of scene pictures by combining the event trigger condition with its control instruction, and no rendering logic code needs to be developed separately for the data model, which improves development efficiency.
In some embodiments, the transition control instruction further indicates a transition mode between the first scene picture and the second scene picture, and the switching, by the control engine, the displaying of the first scene picture as the second scene picture based on the transition control instruction in a case where the transition trigger condition is currently satisfied includes:
and switching and displaying the first scene picture into the second scene picture according to the transition mode based on the transition control instruction under the condition that the transition triggering condition is met currently through the control engine.
In the embodiment of the disclosure, the transition control instruction includes a transition mode between scene pictures, and the scene pictures are switched and displayed according to the set transition mode, so that the switching mode of the scene pictures is enriched, and the display effect is ensured.
In some embodiments, the switching, by the control engine, the first scene picture to be displayed as the second scene picture according to the transition mode based on the transition control instruction in the case that the transition triggering condition is currently satisfied includes any one of:
when the transition mode is a first transition mode, displaying, by the control engine, the second scene picture while canceling display of the first scene picture based on the transition control instruction, in a case where the transition trigger condition is currently satisfied;
when the transition mode is a second transition mode, playing, by the control engine, a transition animation matched with the second transition mode in a case where the transition trigger condition is currently satisfied, and displaying the second scene picture after the transition animation finishes playing.
In the embodiment of the disclosure, multiple types of transition modes are provided, and the scene pictures are switched and displayed according to the configured transition modes, so that the diversity of switching and displaying the scene pictures can be enriched, and the display effect is ensured.
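The two transition modes described above can be sketched as a small dispatch function. This is an illustrative sketch only: the mode names ("immediate", "animated") and the `play_animation` callback are assumptions for the example, not terms defined by the disclosure.

```python
# Illustrative sketch of the two transition modes described above.
# The mode names and the play_animation callback are assumed.
def switch_scene(first_picture, second_picture, mode, play_animation=None):
    if mode == "immediate":
        # First transition mode: cancel display of the first scene
        # picture and display the second one at the same time.
        return second_picture
    if mode == "animated":
        # Second transition mode: play the matching transition
        # animation first, then display the second scene picture.
        if play_animation is not None:
            play_animation(mode)
        return second_picture
    raise ValueError(f"unknown transition mode: {mode}")
```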
In some embodiments, before creating the control engine based on the instruction information in the case where the at least one piece of data is determined to belong to the target data type, the method further comprises:
for each piece of data in the data model, determining that the data belongs to the target data type if the data comprises a target string;
wherein the target string is used to represent the target data type.
In the scheme provided by the embodiment of the disclosure, a method for identifying the target data type is provided, and whether the acquired data in the data model belongs to the target data type is determined according to the target character string used for representing the target data type, so that the identification accuracy is ensured.
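The target-string check of this embodiment could look like the following sketch. The concrete marker value `"scene_dsl"` is a hypothetical placeholder, since the disclosure does not fix a specific string.

```python
# Hypothetical target string representing the target data type;
# the concrete value "scene_dsl" is an assumption for illustration.
TARGET_STRING = "scene_dsl"

def belongs_to_target_type(piece_of_data: str) -> bool:
    # A piece of data is treated as belonging to the target data
    # type when it contains the target string.
    return TARGET_STRING in piece_of_data
```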
In some embodiments, the control engine is running in a target application, and after creating the control engine based on the instruction information if it is determined that the at least one piece of data belongs to the target data type, the method further comprises:
determining a display area set by the target application through the control engine, wherein the display area is used for displaying a scene picture;
the displaying, by the control engine and based on the data model, a scene picture corresponding to the scene data includes:
displaying, by the control engine, the scene picture in the display area based on the data model.
In the scheme provided by the embodiment of the disclosure, the control engine running in the target application displays the scene picture in the display area set by the target application, so that a mode of displaying the scene picture in the application is realized, and the display accuracy of the scene picture is ensured.
According to another aspect of the disclosed embodiments, there is provided a scene screen display apparatus, the apparatus including:
an acquisition unit configured to execute acquisition of configuration information including a target data type and instruction information indicating an instruction for creating a control engine in a case where data belonging to the target data type is acquired;
the obtaining unit is further configured to perform obtaining a data model, wherein the data model comprises at least one piece of data, and the at least one piece of data comprises scene data, and the scene data is used for rendering a scene picture;
a creating unit configured to perform, in a case where it is determined that the at least one piece of data belongs to the target data type, creating the control engine based on the instruction information;
a display unit configured to perform displaying, by the control engine, a scene screen corresponding to the scene data based on the data model.
In some embodiments, the display unit includes:
a determining subunit configured to perform determining, by the control engine, first scene data from a plurality of pieces of scene data included in the data model;
a display subunit configured to perform displaying, by the control engine, a first scene screen based on the first scene data.
In some embodiments, the first scene data includes resource data, and the display subunit is configured to perform displaying, by the control engine, based on the first scene data, a first scene screen including a resource to which the resource data corresponds.
In some embodiments, the display subunit is configured to execute determining, by the control engine, a data format to which the resource data belongs; rendering the resource data into a control in the first scene picture according to a control template matched with the data format through the control engine, wherein the control contains a resource corresponding to the resource data.
In some embodiments, the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and the apparatus further includes:
and an execution unit configured to execute, by the control engine, the control instruction corresponding to the event trigger condition in a case where the event trigger condition is currently satisfied.
In some embodiments, the scene picture is a first scene picture corresponding to first scene data in the data model, the event trigger condition associated with the first scene data includes a transition trigger condition, the control instruction corresponding to the transition trigger condition includes a transition control instruction, and the transition control instruction instructs to switch to display a second scene picture corresponding to second scene data in the data model; the execution unit is configured to execute, by the control engine, switching and displaying the first scene picture as the second scene picture based on the transition control instruction in a case where the transition trigger condition is currently satisfied.
In some embodiments, the transition control instruction further indicates a transition mode between the first scene picture and the second scene picture, and the execution unit is configured to execute switching and displaying, by the control engine, the first scene picture as the second scene picture in the transition mode based on the transition control instruction in a case where the transition trigger condition is currently satisfied.
In some embodiments, the execution unit is configured to perform any one of:
when the transition mode is a first transition mode, displaying, by the control engine, the second scene picture while canceling display of the first scene picture based on the transition control instruction, in a case where the transition trigger condition is currently satisfied;
when the transition mode is a second transition mode, playing, by the control engine, a transition animation matched with the second transition mode in a case where the transition trigger condition is currently satisfied, and displaying the second scene picture after the transition animation finishes playing.
In some embodiments, the apparatus further comprises:
a determining unit configured to perform, for each piece of data in the data model, in a case where the data includes a target character string, determining that the data belongs to the target data type;
wherein the target string is used to represent the target data type.
In some embodiments, the control engine runs in a target application, the apparatus further comprising:
a determination unit configured to perform determining, by the control engine, a display area set by the target application, the display area being used to display a scene screen;
the display unit is configured to perform displaying the scene picture in the display area based on the data model by the control engine.
According to still another aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the scene picture display method according to the above aspect.
According to still another aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium, wherein instructions of the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the scene screen display method according to the above aspect.
According to still another aspect of the embodiments of the present disclosure, there is provided a computer program product, wherein when the instructions of the computer program product are executed by a processor of an electronic device, the electronic device is enabled to execute the scene picture display method according to the above aspect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic diagram illustrating one implementation environment in accordance with an example embodiment.
Fig. 2 is a flowchart illustrating a scene screen display method according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating another scene display method according to an exemplary embodiment.
Fig. 4 is a flowchart illustrating another scene display method according to an exemplary embodiment.
Fig. 5 is a flowchart illustrating still another scene display method according to an exemplary embodiment.
Fig. 6 is a schematic diagram illustrating scene picture switching according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating a scene screen display apparatus according to an exemplary embodiment.
Fig. 8 is a block diagram illustrating a scene screen display apparatus according to an exemplary embodiment.
Fig. 9 is a block diagram illustrating a terminal according to an example embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the description of the above-described figures are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
As used in this disclosure, "at least one" includes one, two, or more than two; "a plurality" includes two or more than two; "each" refers to every one of the corresponding plurality; and "any" refers to any one of the plurality. For example, if the plurality of pieces of scene data includes 3 pieces of scene data, "each piece of scene data" refers to every one of the 3 pieces, and "any piece of scene data" refers to any one of the 3 pieces, which can be the first, the second, or the third piece of scene data.
It should be noted that information (including but not limited to configuration information), data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in this disclosure are all authorized by the user or sufficiently authorized by various parties.
The scene picture display method provided by the embodiments of the present disclosure is executed by an electronic device. In some embodiments, the electronic device is provided as a terminal. FIG. 1 is a schematic diagram of an implementation environment provided according to an exemplary embodiment. The implementation environment includes a terminal 101 and a server 102 connected through a wireless or wired network, which is not limited in the present disclosure.
The terminal 101 is a mobile phone, a tablet computer, a computer, or another type of terminal, but is not limited thereto. The server 102 is a single server, a server cluster composed of several servers, or a cloud computing service center, but is not limited thereto.
The server 102 is configured to provide a data model for the terminal 101, and the terminal 101 is configured to display a scene picture corresponding to scene data in the data model through the data model provided by the server 102.
In some embodiments, the terminal 101 installs a target application served by the server 102, through which the terminal 101 can implement functions such as resource exposure. For example, the target application is a target application in an operating system of the terminal 101 or a target application provided by a third party. For another example, the target application is a resource sharing application, and the resource sharing application has a resource sharing function, and of course, the resource sharing application can also have other functions, such as a comment function, a shopping function, a navigation function, and the like.
The terminal 101 logs in the target application through the account, and interacts with the server 102 through the target application. The server pushes a data model to the terminal 101 where the target application is installed; alternatively, the terminal 101 sends a data request to the server 102 through the target application, and the server 102 sends the data model to the terminal 101 based on the data request. The terminal 101 displays a scene picture corresponding to scene data in the data model in the target application based on the data model sent by the server 102.
Fig. 2 is a flowchart illustrating a scene picture display method according to an exemplary embodiment. Referring to Fig. 2, the method is performed by a terminal and includes the following steps:
in step 201, configuration information is acquired, the configuration information including a target data type and instruction information indicating an instruction for creating a control engine in a case where data belonging to the target data type is acquired.
The target data type is any data type and can be represented in any form; for example, it is represented as a character string, that is, the configuration information includes a character string representing the target data type. In the embodiments of the present disclosure, the configuration information already defines the target data type and the instruction for creating the control engine, and the control engine is configured to run data belonging to the target data type. The instruction information therefore specifies that the control engine is created only when data belonging to the target data type is acquired, ensuring that such data can be run normally once acquired.
In step 202, a data model is obtained, the data model comprising at least one piece of data, the at least one piece of data comprising scene data, the scene data being used for rendering a scene picture.
The data model is a data set that includes at least one piece of data. The scene data among the at least one piece of data includes the data required for rendering a scene picture, including data describing the content to be displayed; for example, the scene data includes resource data describing a resource to be displayed in the scene picture corresponding to the scene data. The scene picture is a picture to be displayed and can be displayed in any form, for example in a pop-up window or in other forms, which the present disclosure does not limit.
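The configuration information of step 201 and the data model of step 202 can be sketched as simple records. All field names below are illustrative assumptions; the disclosure does not prescribe a concrete wire format.

```python
# Illustrative records for the configuration information and the data
# model; every field name here is an assumption, not from the patent.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ConfigurationInfo:
    target_data_type: str                # e.g. a marker such as "scene_dsl"
    create_engine_on_match: bool = True  # the instruction information

@dataclass
class SceneData:
    scene_id: str
    resources: List[str] = field(default_factory=list)  # resource data

@dataclass
class DataModel:
    data_type: str
    scenes: List[SceneData] = field(default_factory=list)  # at least one piece of data
```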
In step 203, in the case where it is determined that the at least one piece of data belongs to the target data type, a control engine is created based on the instruction information.
In the embodiments of the present disclosure, the instruction indicated by the instruction information in the configuration information is used to create the control engine when data belonging to the target data type is acquired. Therefore, after the terminal acquires the data model, once it determines that the at least one piece of data included in the data model belongs to the target data type, it creates the control engine based on the instruction information, so that the data model is subsequently run through the control engine.
In step 204, a scene screen corresponding to the scene data is displayed by the control engine based on the data model.
In the embodiment of the present disclosure, the control engine has a function of executing data belonging to a target data type, the acquired data model includes scene data, and the scene data is used for rendering a scene picture, and then the control engine executes the data model, so that the scene picture can be rendered based on the scene data included in the data model.
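Steps 203 and 204 can be summarized in a minimal sketch: the control engine is created only for a model whose data belongs to the target data type, after which it renders the first scene picture from the model. The `ControlEngine` class and its string-based "rendering" are illustrative stand-ins, not the actual implementation.

```python
# Minimal sketch of steps 203-204; ControlEngine and its string-based
# "rendering" are illustrative stand-ins for a real renderer.
class ControlEngine:
    def __init__(self, scenes):
        self.scenes = scenes

    def display_first_scene(self):
        # Determine the first scene data and render its scene picture.
        first = self.scenes[0]
        return f"scene:{first['id']}"

def create_engine(target_data_type, model):
    # Per the instruction information, create the engine only when the
    # model's data belongs to the target data type.
    if model.get("type") != target_data_type:
        return None
    return ControlEngine(model["scenes"])
```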
In the solution provided by the embodiments of the present disclosure, the acquired configuration information defines a target data type and instruction information for creating a control engine when data of that type is acquired. Once a data model whose data belongs to the target data type is acquired, the control engine is created based on the instruction information, and the data model is run by the control engine to display the scene picture corresponding to the scene data in the data model. In other words, based on the configuration information, any data model whose data belongs to the target data type can be run. A developer therefore only needs to configure such a data model in order to display its corresponding scene picture, without separately developing rendering logic code for the data model, which improves development efficiency.
In some embodiments, displaying, by the control engine, a scene picture corresponding to the scene data based on the data model includes:
determining, by a control engine, first scene data from a plurality of pieces of scene data included in a data model;
displaying, by the control engine, a first scene picture based on the first scene data.
In some embodiments, the first scene data includes resource data, and the displaying, by the control engine, the first scene picture based on the first scene data includes:
and displaying, by the control engine, based on the first scene data, a first scene picture comprising the resource corresponding to the resource data.
In some embodiments, displaying, by the control engine, based on the first scene data, a first scene screen including a resource corresponding to the resource data includes:
determining, by the control engine, a data format to which the resource data belongs;
and rendering, by the control engine, the resource data into a control in the first scene picture according to a control template matched with the data format, wherein the control contains the resource corresponding to the resource data.
In some embodiments, the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and after the scene picture corresponding to the scene data is displayed based on the data model by the control engine, the method further includes:
and executing, by the control engine, the control instruction corresponding to the event trigger condition in a case where the event trigger condition is currently satisfied.
In some embodiments, the scene picture is a first scene picture corresponding to first scene data in the data model, the event trigger condition associated with the first scene data includes a transition trigger condition, the control instruction corresponding to the transition trigger condition includes a transition control instruction, and the transition control instruction instructs to switch and display a second scene picture corresponding to second scene data in the data model; through the control engine, under the condition that the event trigger condition is currently met, executing a control instruction corresponding to the event trigger condition, wherein the control instruction comprises the following steps:
and switching, by the control engine, the first scene picture to the second scene picture based on the transition control instruction in a case where the transition trigger condition is currently satisfied.
In some embodiments, the transition control instruction further indicates a transition mode between the first scene picture and the second scene picture, and the switching, by the control engine, the first scene picture to be displayed as the second scene picture based on the transition control instruction in a case where the transition triggering condition is currently satisfied includes:
and switching, by the control engine, the first scene picture to the second scene picture in the transition mode based on the transition control instruction in a case where the transition trigger condition is currently satisfied.
In some embodiments, switching, by the control engine, the first scene picture to the second scene picture in the transition mode based on the transition control instruction in a case where the transition trigger condition is currently satisfied includes any one of the following:
the transition mode is a first transition mode, and the control engine, in a case where the transition trigger condition is currently satisfied, displays the second scene picture while canceling display of the first scene picture based on the transition control instruction;
the transition mode is a second transition mode, and the control engine, in a case where the transition trigger condition is currently satisfied, plays a transition animation matched with the second transition mode and displays the second scene picture after the transition animation finishes playing.
In some embodiments, in the case that it is determined that the at least one piece of data belongs to the target data type, before creating the control engine based on the instruction information, the method further comprises:
for each piece of data in the data model, determining that the data belongs to a target data type under the condition that the data comprises a target character string;
wherein the target string is used to represent the target data type.
In some embodiments, the control engine runs in the target application, and after creating the control engine based on the instruction information in a case where it is determined that the at least one piece of data belongs to the target data type, the method further comprises:
determining, by the control engine, a display area set by the target application, wherein the display area is used for displaying the scene picture;
displaying, by a control engine, a scene picture corresponding to scene data based on a data model, including:
displaying, by the control engine, the scene picture in the display area based on the data model.
Based on the embodiment shown in fig. 2, taking as an example that the data model includes a plurality of pieces of scene data, the control engine can switch between displaying the scene pictures corresponding to the plurality of pieces of scene data; the specific process is described in detail in the following embodiment.
Fig. 3 is a flowchart illustrating a scene screen display method according to an exemplary embodiment, referring to fig. 3, the method being performed by a terminal and including the steps of:
in step 301, configuration information is acquired, the configuration information including a target data type and instruction information indicating an instruction for creating a control engine in a case where data belonging to the target data type is acquired.
In the embodiment of the present disclosure, the target data type included in the configuration information includes one or more types, and the instruction information indicates an instruction for creating the control engine in a case where the acquired data belongs to any one or more target data types.
In some embodiments, the target data type is a data type defined in the target protocol. The target protocol is used to define a data type to which data required for displaying a certain resource belongs, for example, the target protocol is an interactive advertisement protocol, the interactive advertisement protocol indicates a data type to which data required for displaying an interactive advertisement belongs, and the data types defined in the interactive advertisement protocol are all used as the target data type.
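As an illustrative sketch only, configuration information of the kind described above could be represented as follows; the field names and type strings are hypothetical assumptions, not taken from any real interactive advertisement protocol.

```python
# Hypothetical sketch of configuration information: the target data types and
# the instruction to create a control engine when matching data is acquired.
# All field names and type strings here are illustrative assumptions.
CONFIGURATION = {
    # Data types defined by the target protocol (e.g. an interactive
    # advertisement protocol); all of them serve as target data types.
    "target_data_types": ["interactive_ad.scene", "interactive_ad.trigger"],
    # Instruction information: what to do when acquired data belongs to
    # any target data type.
    "instruction": {"action": "create_control_engine"},
}

def should_create_engine(config, acquired_types):
    """Return True if at least one acquired data type is a target type."""
    targets = set(config["target_data_types"])
    return any(t in targets for t in acquired_types)
```

In this sketch, a terminal would evaluate `should_create_engine` against the types of the acquired data model before executing the configured instruction.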
As for the manner of obtaining the configuration information, in some embodiments, the configuration information is stored in the terminal, and the terminal obtains it from its storage area; alternatively, the terminal obtains the configuration information delivered by a server, which is not limited in this disclosure.
The server is used for providing a service for the terminal; the configuration information is stored in the server, and the terminal acquires the stored configuration information by interacting with the server.
In the embodiment of the present disclosure, the configuration information is configured by a developer. After the developer finishes the configuration, the configuration information is added and stored in the terminal; alternatively, the configuration information is stored in a server, from which the terminal can acquire it.
In a possible implementation manner of the foregoing embodiment, a process of a terminal acquiring configuration information sent by a server includes: the terminal sends an information acquisition request to the server, and the server receives the information acquisition request and returns the configuration information to the terminal.
In a possible implementation manner of the foregoing embodiment, the terminal is installed with a target application, the configuration information is stored in a server providing a service for the target application, the terminal sends an information acquisition request to the server in the case of running the target application, the server sends the configuration information stored in the server to the terminal based on the information acquisition request, and the terminal receives the configuration information sent by the server.
In a possible implementation manner of the foregoing embodiment, the terminal is installed with a target application, the configuration information is included in the target application, and the configuration information in the target application is obtained when the terminal runs the target application.
In step 302, a data model is obtained.
Wherein the data model includes at least one piece of data including scene data used to render a scene. For example, the scene data includes a scene identifier, scene content, profile information, or scene lifecycle, wherein the scene identifier is represented in any form, such as a character string; the scene content is content contained in a scene picture corresponding to the scene data, for example, the scene content includes resource data in the scene data; the profile information is profile information for describing a scene picture corresponding to the scene data; the scene life cycle indicates the scene picture display time, disappearance time or display duration and the like corresponding to the scene data.
In the embodiment of the present disclosure, the data model is a data model corresponding to a resource, and the data model is used for displaying the resource. For example, the resource is an advertisement, the data model is a data model corresponding to the advertisement, and the advertisement can be displayed through the data model. In the case where the resource is an advertisement, the scene data in the data model is used to render scene pictures of the advertisement. The advertisement may take various forms, such as an information-stream advertisement, a rewarded-video advertisement, a splash-screen advertisement, and the like.
In some embodiments, the data model includes a plurality of pieces of data, the plurality of pieces of data including a plurality of pieces of scene data.
Each piece of scene data is used for rendering one scene picture. For example, the data model is a data model corresponding to an advertisement and includes a plurality of pieces of scene data, each corresponding to one scene picture; that is, the advertisement includes a plurality of scene pictures. When the advertisement is presented, the plurality of scene pictures are displayed according to the data in the data model.
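For illustration, a data model of the kind described — an advertisement with multiple pieces of scene data — might be sketched as the following structure; the keys and values are hypothetical, chosen only to mirror the fields named above (scene identifier, scene content, profile information, scene lifecycle).

```python
# Hypothetical data model for an advertisement with two scene pictures.
# Keys mirror the fields described in the text; concrete names are assumptions.
AD_DATA_MODEL = {
    "scenes": [
        {
            "id": "scene-1",                         # scene identifier
            "content": {"resources": [
                {"format": "text", "value": "Limited-time offer"},
            ]},
            "profile": "opening picture of the advertisement",
            "lifecycle": {"display_duration_s": 5},  # display duration
        },
        {
            "id": "scene-2",
            "content": {"resources": [
                {"format": "video", "value": "promo.mp4"},
            ]},
            "profile": "detail picture of the advertisement",
            "lifecycle": {"display_duration_s": 10},
        },
    ],
    # Order in which the scene pictures are displayed.
    "display_order": ["scene-1", "scene-2"],
}
```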
For the manner of obtaining the data model, in some embodiments, the terminal receives the data model delivered by the server.
The server is used for providing data models for the terminal, and the server stores a plurality of data models. In the embodiment of the present disclosure, a data model is stored in the server after being configured by a developer.
In a possible implementation manner of the foregoing embodiment, the terminal is installed with a target application, the server is configured to provide a service for the target application, the terminal sends a data obtaining request to the server through the target application, the server receives the data obtaining request and sends a data model to the terminal, and the terminal receives the data model sent by the server through the target application.
In the embodiment of the present disclosure, the terminal obtains the data model from the server through the target application, so as to display a scene picture corresponding to the data model in the target application in the following.
In step 303, in the event that it is determined that the data in the data model belongs to the target data type, a control engine is created based on the instruction information.
Wherein the control engine is configured to run a data model belonging to the target data type. In this disclosure, after the terminal acquires the configuration information, the terminal checks the acquired data based on the configuration information, and if it is determined that the data included in the acquired data model all belong to the target data type, a control engine is created for the data model based on the instruction information in the configuration information, so that the data model is subsequently run through the control engine.
In some embodiments, the configuration information includes a plurality of target data types, the data model includes a plurality of pieces of data, and step 303 includes: in the event that each piece of data in the data model is determined to be of a target data type, a control engine is created based on the instruction information.
In the embodiment of the present disclosure, the data model includes a plurality of pieces of data, and the configuration information includes a plurality of target data types, and the data types of the plurality of pieces of data included in the data model may be different, for example, a first piece of data in the data model belongs to a first target data type, and a second piece of data in the data model belongs to a second target data type. In the case where each piece of data in the data model is determined to belong to any one of the target data types, i.e., a control engine needs to be created for the data model, a control engine is created based on the instruction information.
In some embodiments, the process of determining whether data is of a target data type includes: for each piece of data in the data model, where the data includes a target string, it is determined that the data belongs to the target data type.
Wherein the target string is used to represent the target data type. For example, the target character string is the type name of the target data type or a type identifier of the data type, which is not limited in this disclosure. In the embodiment of the present disclosure, data belonging to the target data type includes the target character string; therefore, in a case where any piece of data includes the target character string, that data belongs to the target data type.
In a possible implementation manner of the foregoing embodiment, the configuration information includes multiple target data types, each target data type corresponds to one target character string, and each target character string represents its corresponding target data type. For each piece of data in the data model, in a case where the data includes any target character string, it is determined that the data belongs to the target data type corresponding to that character string. In the scheme provided by the embodiment of the present disclosure, a method for identifying the target data type is provided: whether acquired data in the data model belongs to the target data type is determined according to the target character string representing the target data type, which ensures identification accuracy.
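A minimal sketch of this identification step, assuming hypothetical target character strings (the real strings would come from the configured protocol):

```python
# Map from target character string to the target data type it represents.
# Both the strings and the type names here are hypothetical.
TARGET_STRINGS = {
    "interactiveAdScene": "scene data type",
    "interactiveAdTrigger": "trigger data type",
}

def classify(piece_of_data):
    """Return the target data type whose string appears in the data, else None."""
    for target_string, data_type in TARGET_STRINGS.items():
        if target_string in piece_of_data:
            return data_type
    return None

def all_pieces_match(data_model):
    """The control engine is created only if every piece is a target type."""
    return all(classify(piece) is not None for piece in data_model)
```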
In step 304, first scene data is determined from a plurality of scene data included in the data model by the control engine.
The first scene data is any scene data in the plurality of pieces of scene data.
In some embodiments, the data model includes a plurality of pieces of scene data and a display order of each piece of scene data, and the scene data earliest in the display order is determined as the first scene data according to the display order corresponding to the plurality of pieces of scene data.
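Choosing the first scene data by display order could be sketched as follows; the field names (`scenes`, `display_order`, `id`) are illustrative assumptions, not part of the disclosed protocol.

```python
def first_scene_data(data_model):
    """Return the piece of scene data earliest in the display order."""
    by_id = {scene["id"]: scene for scene in data_model["scenes"]}
    return by_id[data_model["display_order"][0]]

# Minimal example model: the scenes are listed out of order, but the
# explicit display order says scene-1 is shown first.
model = {
    "scenes": [{"id": "scene-2"}, {"id": "scene-1"}],
    "display_order": ["scene-1", "scene-2"],
}
```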
In some embodiments, the data model includes an event trigger condition associated with each piece of scene data and a control instruction corresponding to the event trigger condition; step 304 then includes: determining, by the control engine, the scene data specified in the control instruction as the first scene data in a case where any event trigger condition is currently satisfied and the control instruction corresponding to that condition indicates displaying the scene picture corresponding to the specified scene data.
In the embodiment of the present disclosure, the event trigger condition indicates the condition to be satisfied for executing its corresponding control instruction, and the control instruction is used for performing a certain operation; for example, the control instruction indicates acquiring certain data, or indicates displaying the scene picture corresponding to certain scene data. Once it is determined that any event trigger condition is currently satisfied, the control instruction corresponding to that condition is subsequently executed; when the control instruction indicates displaying the scene picture corresponding to specified scene data, the scene data specified in the control instruction is determined, that is, the first scene data is determined, so that the corresponding operation is subsequently performed based on the determined scene data according to the control instruction.
For example, if the data model includes an event trigger condition that the control engine is successfully created, and a control instruction corresponding to the event trigger condition indicates that a scene picture corresponding to the first scene data is displayed, the control engine determines, according to the control instruction, the first scene data from the plurality of pieces of scene data included in the data model when the control engine is successfully created.
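The event-trigger mechanism just described can be sketched as a simple lookup table; the condition names and instruction fields below are hypothetical placeholders.

```python
# Hypothetical trigger table: each entry pairs an event trigger condition
# with the control instruction executed when the condition is satisfied.
TRIGGERS = [
    {"condition": "engine_created",
     "instruction": {"op": "display_scene", "scene_id": "scene-1"}},
    {"condition": "display_duration_reached",
     "instruction": {"op": "switch_scene", "scene_id": "scene-2"}},
]

def instructions_for(event):
    """Collect the control instructions whose trigger condition matches."""
    return [t["instruction"] for t in TRIGGERS if t["condition"] == event]
```

Under this sketch, successful creation of the control engine would yield the instruction to display the first scene picture, mirroring the example in the text.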
In step 305, a first scene screen is displayed based on the first scene data by the control engine.
In the embodiment of the disclosure, the control engine has the function of running a data model, and the control engine can render the first scene picture based on the first scene data. Each piece of scene data included in the data model is used to display one scene picture; after the control engine is created, the first scene data to be displayed preferentially is determined from the data model so as to display the first scene picture, whereby the control engine controls the display of the scene pictures corresponding to the scene data in the data model.
In some embodiments, the first scene data comprises resource data, and step 305 comprises: displaying, by the control engine, based on the first scene data, a first scene picture including the resource corresponding to the resource data.
The resource data is used to describe a resource, which is any type of resource, such as text, image, video, or the like. When the control engine renders the first scene data into the first scene picture, rendering the resource data included in the first scene data into resources, so that the displayed first scene picture includes the resources corresponding to the resource data. In the embodiment of the disclosure, the scene data includes the resource data, and the resource data in the scene data can be rendered into the resource for display, so that the content included in the scene picture is enriched, and the display effect of the scene picture is ensured.
In a possible implementation manner of the foregoing embodiment, the displaying the first scene picture includes: determining a data format to which the resource data belongs through the control engine; rendering the resource data into a control in the first scene picture according to a control template matched with the data format through the control engine, wherein the control contains a resource corresponding to the resource data.
The data format may be any type of data format, for example, a text format, an image format, a video format, or an animation format. The control included in the first scene picture is used for carrying the resource, and the control may take any form; for example, the control included in the first scene picture is a native control.
In the disclosed embodiments, each data format corresponds to a control template; for example, the control templates corresponding to multiple data formats include a text control template, an image control template, an animation control template, a horizontal box control template, and a vertical box control template. When any resource data is rendered, rendering is performed according to the control template matched with the data format to which the resource data belongs, so that the resource corresponding to the resource data is displayed in the first scene picture in the form of a control. That is, the control in the first scene picture contains the resource corresponding to the resource data. In the embodiment of the present disclosure, considering that resource data of different data formats need to be presented in different styles, the control template matched with the data format to which each piece of resource data belongs is used to render the resource, so that the display style of the resource in the first scene picture matches the data format to which the resource belongs, ensuring the display effect of the resource in the first scene picture.
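As a sketch of format-matched control templates, each data format can map to a template that turns resource data into a control; the template names and the rendered-control representation below are assumptions for illustration, not a real rendering API.

```python
# One control template per data format; each renders resource data into a
# control carrying the resource. Names are illustrative only.
CONTROL_TEMPLATES = {
    "text":  lambda value: {"control": "TextControl",  "resource": value},
    "image": lambda value: {"control": "ImageControl", "resource": value},
    "video": lambda value: {"control": "VideoControl", "resource": value},
}

def render_resource(resource_data):
    """Render resource data into a control via the template for its format."""
    template = CONTROL_TEMPLATES[resource_data["format"]]
    return template(resource_data["value"])
```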
In some embodiments, the first scene data comprises a plurality of pieces of resource data, and step 305 comprises: displaying, by the control engine, based on the first scene data, a first scene picture including the resource corresponding to each piece of resource data.
For example, the first scene data includes text data, image data, and video data, and the displayed first scene picture includes text corresponding to the text data, an image corresponding to the image data, and a video corresponding to the video data.
In some embodiments, the first scene data includes resource data and operation control data; step 305 then includes: displaying, by the control engine, based on the first scene data, a first scene picture comprising the resource corresponding to the resource data, wherein the first scene picture further comprises an operation control corresponding to the operation control data.
The operation control data is used to describe an operation control, and for example, the operation control data includes a display size, a display style, or information filled in the operation control. For example, the operation control in the first scene picture is a "next page" button, or the operation control is a button that can be triggered after 10 seconds, and a progress bar for counting down for 10 seconds or displaying the duration of 10 seconds is displayed in the button.
It should be noted that the embodiment of the present disclosure takes displaying the first scene picture corresponding to the first scene data as an example; in another embodiment, steps 304 and 305 need not be executed, and the scene picture corresponding to the scene data is instead displayed by the control engine based on the data model in another manner.
In step 306, in a case where the transition trigger condition associated with the first scene data is currently satisfied, the control engine switches the displayed first scene picture to the second scene picture based on the transition control instruction corresponding to the transition trigger condition.
In the embodiment of the present disclosure, the data model includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, where the event trigger condition indicates the condition that needs to be satisfied for executing the corresponding control instruction, and the control instruction is used to perform a certain operation. The first scene data is associated with an event trigger condition; the event trigger condition associated with the first scene data includes a transition trigger condition, the control instruction corresponding to the transition trigger condition includes a transition control instruction, and the transition control instruction instructs switching to display the second scene picture corresponding to the second scene data in the data model. The transition trigger condition indicates the condition that needs to be satisfied for executing the corresponding transition control instruction, that is, the trigger condition for switching the first scene picture to the second scene picture. After the first scene picture is displayed, once it is determined that the transition trigger condition is currently satisfied, the control engine renders the second scene picture based on the transition control instruction corresponding to the transition trigger condition and the second scene data indicated by that instruction, and switches the displayed first scene picture to the second scene picture.
In some embodiments, the transition triggering condition is that the display duration of the first scene picture reaches the first duration, or a scene switching operation is detected in the first scene picture, and the like, which is not limited by the present disclosure.
The first duration is an arbitrary duration, for example, 5 seconds or 10 seconds. For example, when the first scene picture starts to be displayed, its display duration is recorded, and once the display duration is determined to reach the first duration, the second scene picture is switched to and displayed according to the above step 306. For another example, the first scene picture includes an operation control for switching scene pictures; once it is detected that the operation control is triggered, which is equivalent to detecting a scene switching operation, the second scene picture is switched to and displayed according to step 306.
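The two example transition trigger conditions above can be sketched as a single check; the parameter names are hypothetical.

```python
def transition_triggered(display_duration_s, first_duration_s,
                         switch_operation_detected):
    """The transition fires when the first scene picture has been displayed
    for the first duration, or a scene-switching operation was detected
    (e.g. the user triggered a switching control in the picture)."""
    return display_duration_s >= first_duration_s or switch_operation_detected
```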
In some embodiments, the transition control instruction further indicates a transition mode between the first scene picture and the second scene picture, then step 306 comprises: and switching and displaying the first scene picture into the second scene picture according to the transition mode based on the transition control instruction by the control engine under the condition that the transition triggering condition is currently met.
Wherein the transition mode is used to indicate the manner of switching the first scene picture to the second scene picture, and the transition mode is any type of mode; for example, the transition mode includes a visibility transition mode, a template transition mode, an element-sharing transition mode, or an animation transition mode. The visibility transition mode indicates that when two scene pictures are switched, one scene picture is canceled from display while the other is displayed; the template transition mode indicates that two scene pictures are switched according to a fixed transition template; the element-sharing transition mode indicates that when switching between scene pictures containing the same element, the shared element remains unchanged and only the content other than that element is switched; the animation transition mode indicates that a transition animation is played when two scene pictures are switched. In the embodiment of the present disclosure, the transition control instruction includes the transition mode between scene pictures, and the scene pictures are switched according to the set transition mode, which enriches the manner of switching scene pictures and ensures the display effect.
For example, if the transition mode between the first scene picture and the second scene picture is the visibility transition mode, when the second scene picture is switched to be displayed, the first scene picture gradually disappears and the second scene picture gradually appears according to the visibility transition mode. For another example, the transition mode between the first scene picture and the second scene picture is a template transition mode, that is, when the scene pictures are switched, switching is performed according to a fixed display template, for example, the first scene picture moves to the right side of the display interface and gradually moves out of the display interface, and the second scene picture is displayed and gradually moves from the left side of the display interface and enters the display interface, so that switching from the first scene picture to the second scene picture is realized.
In a possible implementation manner of the foregoing embodiment, the process of switching the scene picture according to the transition mode includes any one of the following:
the first item: the transition mode is the first transition mode, and in a case where the transition trigger condition is currently satisfied, the control engine displays the second scene picture while canceling display of the first scene picture based on the transition control instruction;
the second item: the transition mode is the second transition mode, and in a case where the transition trigger condition is currently satisfied, the control engine plays the transition animation matched with the second transition mode and displays the second scene picture after the transition animation finishes playing.
Wherein the transition animation played in the second transition mode is included in the data model. In the embodiment of the present disclosure, there are multiple types of transition modes between scene pictures: according to the first transition mode, the scene pictures are simply switched directly, whereas according to the second transition mode, the matched transition animation needs to be played when the scene pictures are switched. Providing multiple types of transition modes and switching the scene pictures according to the configured transition mode enriches the variety of scene-picture switching and ensures the display effect.
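The two transition modes can be sketched as a small dispatch; the mode names and action strings below are placeholders, not a real rendering API.

```python
def switch_scene(transition_mode, transition_animation=None):
    """Return the ordered display actions for switching to the second picture."""
    if transition_mode == "first":
        # First mode: cancel the first picture and show the second at once.
        return ["hide first scene picture", "show second scene picture"]
    if transition_mode == "second":
        # Second mode: play the matched transition animation, then show the
        # second picture once the animation has finished.
        return ["play " + (transition_animation or "transition animation"),
                "show second scene picture"]
    raise ValueError("unknown transition mode: " + transition_mode)
```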
In some embodiments, the data model includes scene position relationship data. In a case where the transition trigger condition associated with the first scene data is currently satisfied, the control engine switches the displayed first scene picture to the second scene picture, based on the transition control instruction corresponding to the transition trigger condition, according to the position relationship between the first scene picture and the second scene picture indicated by the scene position relationship data. The scene position relationship data describes the relationship between the display positions of the scene pictures corresponding to the pieces of scene data in the data model.
In the solution provided by the embodiment of the present disclosure, the acquired configuration information defines a target data type and instruction information for creating a control engine when data of that type is acquired. Therefore, after the configuration information is acquired, once a data model whose data belongs to the target data type is acquired, the control engine is created based on the instruction information, and the data model is run by the control engine to display the scene picture corresponding to the scene data in the data model. In other words, based on the configuration information, any data model whose data belongs to the target data type can be run, so that a developer only needs to configure such a data model to display its corresponding scene picture, without separately developing rendering logic code for the data model, thereby improving development efficiency.
Moreover, the scene data includes resource data that can be rendered into resources for display, which enriches the content of the scene picture and thereby ensures its display effect.
In addition, considering that pieces of resource data differ in data format and in the styles in which they need to be presented, each piece of resource data is rendered with the control template matched with its data format, so that the display style of the resource in the first scene picture matches the data format of the resource, which guarantees the display effect of the resource in the first scene picture.
The transition control instruction also specifies a transition mode between scene pictures, and the scene pictures are switched and displayed according to the set transition mode, which enriches the switching manners of scene pictures and ensures the display effect.
It should be noted that the embodiment shown in fig. 3 is described by taking as an example an event trigger condition that is associated with the first scene picture, instructs switching of the scene picture, and is triggered after the first scene picture is displayed. In another embodiment, other event trigger conditions are associated with the scene data corresponding to the currently displayed scene picture; after the first scene picture is displayed, step 306 need not be executed, and instead the control engine executes the control instruction corresponding to whichever event trigger condition is currently satisfied.
In some embodiments, the target data model includes a position change trigger condition associated with the first scene data and a position change control instruction corresponding to that condition. When the first scene picture is displayed and the position change trigger condition is triggered, the control engine changes the display position of the first scene picture according to the position change control instruction.
In a possible implementation manner of the foregoing embodiment, the position change control instruction further indicates a position change transition mode, and the display position of the first scene picture is changed from the second display position to the first display position according to the first display position and the position change transition mode indicated by the position change control instruction. The second display position is the display position before the change, and the first display position is the display position after the change. The position change transition mode is used to present the effect of the display position of the first scene picture being changed.
In some embodiments, the first scene data includes a plurality of pieces of sub-scene data, and the first scene picture is displayed based on the first piece of sub-scene data in the first scene data. The target data model includes a content update trigger condition associated with the first scene data and a content update control instruction corresponding to that condition. When the first scene picture is displayed and the content update trigger condition is triggered, the control engine updates and displays a third scene picture corresponding to the first scene data according to the second sub-scene data indicated by the content update control instruction.
The content of the updated third scene picture is displayed based on the second sub-scene data, so the content of the first scene picture is updated. The second sub-scene data is any piece of sub-scene data, among the plurality of pieces of sub-scene data included in the first scene data, that differs from the first sub-scene data.
In a possible implementation manner of the foregoing embodiment, the content update control instruction further indicates an intra-scene transition mode, and the first scene picture is switched and displayed as the third scene picture according to the intra-scene transition mode indicated by the content update control instruction. The intra-scene transition mode presents the effect of switching between different scene pictures corresponding to the same scene data: the first scene picture and the third scene picture have the same display position and picture size but different content, so the intra-scene transition mode presents the effect of updating the content of the picture presented by the terminal.
In a possible implementation manner of the foregoing embodiment, the content update control instruction indicates an intra-scene animation transition mode: an intra-scene animation corresponding to that mode is played according to the content update control instruction, and after the intra-scene animation has finished playing, the first scene picture is switched and displayed as the third scene picture.
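As a purely illustrative sketch of this content-update behavior (names are hypothetical), the third scene picture keeps the display position and picture size of the first scene picture while taking its content from the sub-scene data named by the content update control instruction:

```python
# Hypothetical sketch: re-render the picture at the same display position
# and size from the second sub-scene data named by the instruction.

def update_content(picture, scene_data, instruction):
    target = instruction["sub_scene"]
    if target not in scene_data["sub_scenes"]:
        raise KeyError("unknown sub-scene data: " + target)
    updated = dict(picture)                                 # same position/size
    updated["content"] = scene_data["sub_scenes"][target]   # new content
    updated["transition"] = instruction.get("intra_scene_transition")
    return updated

scene_data = {"sub_scenes": {"sub-1": "trial play", "sub-2": "download prompt"}}
first = {"position": (0, 0), "size": (320, 180),
         "content": scene_data["sub_scenes"]["sub-1"]}
third = update_content(first, scene_data,
                       {"sub_scene": "sub-2", "intra_scene_transition": "fade"})
```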
On the basis of the embodiment shown in fig. 2, taking as an example a terminal on which a target application is installed and in which the scene picture is displayed, as shown in fig. 4, the method is executed by the terminal and includes the following steps:
In step 401, while the target application is running, configuration information is acquired, the configuration information including a target data type and instruction information, the instruction information indicating an instruction for creating a control engine when data belonging to the target data type is acquired.
In step 402, while the target application is running, a data model is acquired, the data model including at least one piece of data, the at least one piece of data including scene data, and the scene data being used for rendering a scene picture.
The steps 401 and 402 are similar to the steps 201 and 202, and are not described herein again.
In step 403, in a case where it is determined that the data in the data model all belong to the target data type, a control engine is created in the target application based on the instruction information.
In the embodiments of the present disclosure, the created control engine runs in the target application. While the target application is running, the configuration information and the data model are acquired, and the target application determines whether each piece of data in the data model belongs to the target data type. When all the pieces of data in the data model belong to the target data type, the control engine is created in the target application based on the instruction information, so that the scene picture is subsequently displayed in the target application by the control engine running there.
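The creation gate described in step 403 might be sketched as follows. The membership test by target character string follows the embodiment mentioned later in this disclosure; the marker string itself is a hypothetical example:

```python
# Hypothetical sketch: the control engine is created only when every
# piece of data in the model belongs to the target data type. Here,
# membership is tested by the presence of an assumed marker string.

TARGET_STRING = "interactive.v1"  # assumed marker; not specified by the disclosure

def belongs_to_target_type(piece: str) -> bool:
    return TARGET_STRING in piece

def maybe_create_engine(data_model, create):
    """Call `create()` (per the instruction information) only if all data qualifies."""
    if all(belongs_to_target_type(p) for p in data_model):
        return create()
    return None

engine = maybe_create_engine(
    ["interactive.v1/scene", "interactive.v1/lifecycle"],
    create=lambda: "control-engine",
)
```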
In step 404, the display area set by the target application is determined by the control engine.
The display area is used for displaying the scene picture. In the embodiments of the present disclosure, the target application is an arbitrary application that has the function of displaying the scene picture corresponding to the data model, and a display area is set in the target application for this purpose. After the control engine is created in the target application, the control engine determines the display area set by the target application, so that it can subsequently display the scene picture corresponding to the data model in that area.
In some embodiments, the target application acquires the location information corresponding to the display area and sends it to the control engine; the control engine receives the location information and determines the display area it indicates. The location information describes the position corresponding to the display area, for example, a certain position in the display screen of the terminal.
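This hand-off of location information might be sketched as follows (a minimal illustration; the class and field names are hypothetical):

```python
# Hypothetical sketch: the target application resolves the location
# information of its display area and passes it to the control engine,
# which keeps it for all subsequent drawing of scene pictures.

class ControlEngine:
    def __init__(self):
        self.display_area = None

    def receive_location(self, location):
        # `location` describes a position in the terminal's display screen.
        self.display_area = location

engine = ControlEngine()
# Position reported by the target application (values illustrative only).
engine.receive_location({"x": 0, "y": 120, "width": 360, "height": 200})
```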
In step 405, a scene screen corresponding to the scene data is displayed in the display area by the control engine based on the data model.
This step is similar to step 204, and will not be described herein again.
In the solution provided by the embodiments of the present disclosure, the configuration information and the data model are acquired by the running application. Once a data model whose data belongs to the target data type is acquired, a control engine is created in the target application based on the instruction information, and the scene picture is displayed, by the control engine running in the target application, in the display area set by the target application. This provides a way of displaying scene pictures inside an application and ensures the display accuracy of the scene picture.
It should be noted that, on the basis of the embodiments shown in figs. 2 to 4, the embodiments of the present disclosure provide a data model that is configured by a developer according to the target protocol. The target protocol defines the standardized data types and interfaces required when configuring the data model corresponding to a resource, so that a developer can configure a data model conforming to the target protocol. The target protocol supports the definition of rich interactive advertisements; that is, a data model of a rich interactive advertisement can be defined according to the target protocol, including the advertisement's data buried points, conversion jumps, interactive behavior definitions, and so on.
In some embodiments, the target protocol corresponds to a plurality of data model templates, and a developer fills the required data into any data model template to obtain a data model. Different data model templates suit different types of resources; for example, taking advertisements as the resources, a first data model template suits game-type advertisements and a second data model template suits commodity-type advertisements. Different types of resources are also displayed in different manners. For example, a game-type advertisement may display a trial-play scene picture in which the user can try the game, followed by a download scene picture after the trial; a commodity-type advertisement displays a scene picture that plays a video introducing the commodity, and after the user triggers a certain operation in that scene picture, a commodity exchange page is displayed. Therefore, different data model templates are set for different types of resources, and when configuring a resource of any type, the developer can obtain the data model corresponding to the resource from the data model template matched with that type, which improves development efficiency.
In some embodiments, the data model includes a plurality of pieces of data, the plurality of pieces of data including at least one of lifecycle data, scene position relationship data, a trigger set, default condition data, and scene data.
The lifecycle data indicates the lifecycle of the data model; for example, it indicates the display time, disappearance time, or display duration of the scene picture corresponding to the data model. The scene position relationship data describes the relationship between the display positions of the scene pictures corresponding to the pieces of scene data in the data model. For example, if the data model includes two pieces of scene data, the scene position relationship data may indicate that a first display position is moved a distance of 10 pixels to the left relative to a second display position, where the first display position is the display position of the scene picture corresponding to the first piece of scene data, the second display position is the display position of the scene picture corresponding to the second piece of scene data, and a distance of one pixel is the side length of one (square) pixel. The trigger set includes a plurality of trigger identifiers and their corresponding triggers, each trigger representing an event trigger condition. The default condition data describes the state used when the scene picture corresponding to the data model is displayed and can serve as a basis for judging whether a trigger fires. Taking the data model of an advertisement as an example, table 1 lists the pieces of data contained in the data model, together with the type, label, and description of each piece.
TABLE 1 (table provided as an image in the original publication)
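As a purely illustrative sketch of the pieces of data listed above (the field names are hypothetical; the actual fields are defined by the target protocol), the data model might be held in a structure such as:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the data model pieces described above: lifecycle
# data, scene position relationship data, trigger set, default condition
# data, and scene data. All names and values are illustrative.

@dataclass
class DataModel:
    lifecycle: dict            # e.g. display time, disappearance time, duration
    scene_position: dict       # offsets between scene pictures, in pixels
    triggers: dict             # trigger identifier -> trigger definition
    default_conditions: dict   # initial state used when evaluating triggers
    scenes: list = field(default_factory=list)  # one entry per piece of scene data

model = DataModel(
    lifecycle={"display_duration_s": 30},
    # second scene picture shown 10 px to the left of the first, as in the example
    scene_position={("scene-1", "scene-2"): {"dx_px": -10}},
    triggers={"t1": {"type": "one-shot"}},
    default_conditions={"muted": True},
    scenes=[{"id": "scene-1"}, {"id": "scene-2"}],
)
```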
In some embodiments, a plurality of target data types required for configuring the data model are defined in the target protocol; for example, the target data types defined by the target protocol include a control engine type, a display area type, a scene data type, a lifecycle type, a control template type, a trigger type, a transition mode type, a control instruction type, and a condition type. In the embodiments of the present disclosure, the data of each of these target data types is included in the data model in the form of a sub data model.
In the embodiments of the present disclosure, the control engine can convert the data model into a scene picture for display and can also interact with the user based on the user's trigger operations. The content included in the sub data model corresponding to the control engine type (the control engine lifecycle model) is shown in table 2. The control engine lifecycle model includes trigger identifier 1 and trigger identifier 2: trigger identifier 1 indicates a trigger to be triggered when the control engine is successfully created, trigger identifier 2 indicates a trigger to be triggered when the control engine disappears, and both identifiers are represented as character strings. In the embodiments of the present disclosure, during display of a scene picture according to the data model, a closing operation on the scene picture indicates that the scene picture corresponding to the data model is no longer displayed, and the control engine created for the data model is deleted; this is the moment when the control engine disappears.
TABLE 2
Field                     Type              Label       Description
Trigger identification 1  Character string  Repeatable  Trigger to be triggered when the control engine is displayed
Trigger identification 2  Character string  Repeatable  Trigger to be triggered when the control engine disappears
The content contained in the sub data model corresponding to the scene data type (the scene data model) is shown in table 3.
TABLE 3
Field                 Type                     Description
Scene identification  Character string
Scene content         Content rendering model  Indicates the content contained in the scene
Brief introduction    Character string         Describes the purpose or effect of the scene
Scene lifecycle       Scene lifecycle          Indicates the lifecycle of the scene
The content included in the sub data model corresponding to the transition mode type (the transition mode model) is shown in table 4, which lists the multiple types of transition modes, the type of each transition mode, and the associated description of each.
TABLE 4 (table provided as an image in the original publication)
It should be noted that, on the basis of the embodiment shown in fig. 3, the event trigger condition is implemented by a trigger included in the data model; that is, the data model includes a trigger associated with the scene data and a control instruction corresponding to the trigger. The trigger may be of any type, for example a one-shot trigger, a conditional trigger, a heartbeat trigger, or a delay trigger. A one-shot trigger fires when an operation is detected; a conditional trigger fires when a certain condition is satisfied; a heartbeat trigger fires once every first time period, for example once every 3 seconds; and a delay trigger fires after a second time period has elapsed since the trigger operation was detected. The content contained in the sub data model corresponding to the trigger type (the trigger model) is shown in table 5, which lists the multiple types of triggers, the type of each trigger, and the associated description of each.
TABLE 5 (table provided as an image in the original publication)
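The four trigger types described above might be evaluated as in the following illustrative sketch (field names are hypothetical; time is expressed in seconds):

```python
# Hypothetical sketch of the four trigger types: one-shot (fires on a
# detected operation), conditional (fires when a condition holds),
# heartbeat (fires once per period, e.g. every 3 seconds), and delay
# (fires once a delay has elapsed since the trigger was armed).

def should_fire(trigger, *, now, event=None, conditions=None):
    kind = trigger["kind"]
    if kind == "one-shot":
        return event == trigger["event"]
    if kind == "conditional":
        return bool((conditions or {}).get(trigger["condition"]))
    if kind == "heartbeat":
        return now - trigger.get("last_fired", 0) >= trigger["period"]
    if kind == "delay":
        return now - trigger["armed_at"] >= trigger["delay"]
    return False

heartbeat = {"kind": "heartbeat", "period": 3, "last_fired": 0}
```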
In the embodiments of the present disclosure, there are various types of control instructions, for example: a transition control instruction, a buried point control instruction, a video playing control instruction, a page address jump control instruction, a condition change control instruction, a timed trigger cancellation instruction, a custom control instruction, a control instruction for triggering a trigger, a conversion jump control instruction, and a step-by-step execution control instruction. The content included in the sub data model corresponding to the control instruction type (the control instruction model) is shown in table 6, which lists the multiple types of control instructions, the type of each control instruction, and the associated description of each.
TABLE 6 (table provided as an image in the original publication)
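A few of the control instruction types listed above might be dispatched as in the following illustrative sketch (each handler is a stub that records what the control engine would do; all names are hypothetical):

```python
# Hypothetical dispatch over some of the control-instruction types:
# transition, buried point (collect and report operation data),
# video playing, and page address jump.

def execute(instruction, log):
    kind = instruction["kind"]
    if kind == "transition":
        log.append(("switch-scene", instruction["target_scene"]))
    elif kind == "buried-point":
        # collect operation data and report it to the server
        log.append(("report", instruction["data"]))
    elif kind == "video-play":
        log.append(("play", instruction["video"]))
    elif kind == "page-jump":
        log.append(("open", instruction["url"]))
    else:
        log.append(("unsupported", kind))
    return log

log = []
execute({"kind": "transition", "target_scene": "scene-2"}, log)
execute({"kind": "page-jump", "url": "https://example.com/download"}, log)
```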
In the embodiments of the present disclosure, the data model configured according to the target protocol may include the data in tables 1 to 6 above; for example, an advertisement data model may be configured according to the target protocol and the data described in tables 1 to 6, and the scene pictures contained in an advertisement may then be displayed according to the embodiments shown in figs. 2 to 4. With the data model provided by the embodiments of the present disclosure, scene picture display of interactive resources can be realized; for example, scene picture display of a rich interactive advertisement can be realized according to the configured advertisement data model.
On the basis of the embodiments shown in figs. 2 to 4 and the data model provided above, the embodiments of the present disclosure further provide a process of displaying a scene picture, taking an advertisement data model as an example: a target application is installed in the terminal and, while the target application is running, the acquired data model is used to display the scene picture. As shown in fig. 5, the process is executed by the terminal and includes:
in step 501, configuration information is obtained, the configuration information including a target data type and instruction information.
In step 502, an advertisement data model issued by a server is obtained.
In step 503, in the event that it is determined that the data in the advertisement data model all belong to the target data type, a control engine is created for the advertisement data model based on the instruction information.
In step 504, the display area set by the target application is determined by the control engine.
In step 505, in a case where a first trigger in the advertisement data model is triggered, the control engine determines, from the plurality of pieces of scene data included in the advertisement data model, the first scene data indicated by the first control instruction corresponding to the first trigger.
The first trigger indicates a trigger triggered when the control engine is successfully created, and the first control instruction indicates that a scene picture corresponding to the first scene data is displayed.
In step 506, the control engine displays the first scene picture in the display area based on the first scene data and the display position of the first scene data indicated by the scene position relationship data in the advertisement data model.
In step 507, in a case where a second trigger associated with the first scene data is triggered, the control engine switches and displays the first scene picture as the second scene picture based on the second control instruction corresponding to the second trigger.
The transition trigger condition in the embodiment shown in fig. 3 is implemented by the second trigger, and the second control instruction is the transition control instruction in the embodiment shown in fig. 3. Step 507 is similar to step 306 and is not described again here.
In step 508, through the control engine, when the third trigger associated with the first scene data is triggered, the operation data is obtained according to the third control instruction corresponding to the third trigger, and the operation data is reported to the server.
The third control instruction indicates to acquire operation data and report the operation data, namely the third control instruction is a buried point control instruction.
In step 509, when the fourth trigger associated with the first scene data is triggered, the control engine accesses the page corresponding to the page address included in the fourth control instruction according to the fourth control instruction corresponding to the fourth trigger.
And the fourth control instruction indicates to access a page corresponding to the page address, namely the fourth control instruction is a page address jump control instruction.
In step 510, when a fifth trigger associated with the first scene data is triggered, the control engine plays a video indicated by the fifth control instruction in the display area according to the fifth control instruction corresponding to the fifth trigger.
The fifth control instruction indicates video playing, that is, the fifth control instruction is a video playing control instruction.
The first to fifth triggers are all triggers listed in table 5, and the present disclosure is not limited thereto.
In the embodiment shown in fig. 5, only the triggering of the plurality of triggers associated with the first scene picture is described as an example. In another embodiment, when the second scene picture is displayed and a plurality of triggers are associated with it, the triggers associated with the second scene picture can be triggered in the same manner as those associated with the first scene picture.
In the scheme provided by the embodiment of the disclosure, developers only need to configure the data model corresponding to the resources, and do not need to develop rendering logic codes for the data model independently, so that the development efficiency is improved.
Moreover, with a data model configured according to the solution provided by the embodiments of the present disclosure, the scene picture corresponding to the data model can be displayed without the developer having to develop the resource separately or go through the complex approval and release process after development, which solves the problem of an overly long resource delivery cycle. The terminal thus has the capability of rapidly displaying the scene pictures contained in a resource, the applications installed on the terminal likewise gain this capability, the access cost of the resource is reduced, and the number of times the resource needs to be released is reduced. For example, the terminal can quickly display advertisements in an application. Moreover, the solution provided by the embodiments of the present disclosure can be applied to terminals running various operating systems.
In addition, in the solution provided by the embodiments of the present disclosure, the data size of the data model is small, so PB (Protocol Buffers, a data serialization format) can be used for data transmission with a small transmission load, and with the engine service architecture the data model corresponding to an advertisement can be transmitted per advertisement or per request, so that the advertisement can be delivered in as short a time as possible. Furthermore, the development is based on the native environment: when an advertisement is updated, only the data model corresponding to the advertisement is updated, and no application version update is required, thereby realizing hot updates of advertisements.
Based on the embodiment shown in fig. 5, the present disclosure also provides a schematic diagram of scene picture switching. As shown in fig. 6, a first scene picture 602 of a trial-play advertisement is displayed in a display area 601 set by the target application. The first scene picture 602 includes an image control 603, a text control 604, and an operation control 605: the image control 603 displays a game image, the text control 604 displays text information describing the game, and the operation control 605 displays a countdown animation containing the words "free trial". After the countdown of the operation control 605 ends, the state of the operation control 605 switches from non-triggerable to triggerable. The operation control 605 is associated with a trigger that fires when a trigger operation on the operation control 605 is detected, and the control instruction corresponding to that trigger instructs switching to and displaying the second scene picture 606 according to the set transition mode.
After the countdown ends, the user clicks the operation control 605, the trigger associated with the operation control 605 fires, and the control engine executes the corresponding control instruction: according to the transition mode indicated by the control instruction and based on the second scene data, the first scene picture 602 is switched and displayed as the second scene picture 606. The second scene picture includes an image control 607, a text control 608, and a "download" operation control 609: a game image is displayed in the image control 607, text information describing the game is displayed in the text control 608, and the "download" operation control 609 is used to jump to the game download page. A trigger is associated with the "download" operation control 609, and the control instruction corresponding to that trigger indicates jumping to the page indicated by the game download address; when the user clicks the "download" operation control 609, the associated trigger fires and the terminal jumps to the game download page.
Fig. 7 is a block diagram illustrating a scene screen display apparatus according to an exemplary embodiment. Referring to fig. 7, the apparatus includes:
an acquisition unit 701 configured to perform acquisition of configuration information, the configuration information including a target data type and instruction information, the instruction information indicating an instruction for creating a control engine in a case where data belonging to the target data type is acquired;
an obtaining unit 701, further configured to perform obtaining a data model, where the data model includes at least one piece of data, and the at least one piece of data includes scene data, and the scene data is used for rendering a scene picture;
a creating unit 702 configured to perform, in a case where it is determined that at least one piece of data belongs to the target data type, creating a control engine based on the instruction information;
and a display unit 703 configured to perform displaying, by the control engine, a scene screen corresponding to the scene data based on the data model.
In some embodiments, as shown in fig. 8, the display unit 703 includes:
a determining subunit 7031 configured to perform determining, by the control engine, first scene data from among the plurality of pieces of scene data included in the data model;
a display subunit 7032 configured to perform displaying, by the control engine, the first scene screen based on the first scene data.
In some embodiments, the first scene data includes resource data, and the display subunit 7032 is configured to execute displaying, by the control engine, based on the first scene data, a first scene screen including a resource corresponding to the resource data.
In some embodiments, the display subunit 7032 is configured to perform determining, by the control engine, a data format to which the resource data belongs; and rendering the resource data into a control in the first scene picture by a control engine according to a control template matched with the data format, wherein the control contains a resource corresponding to the resource data.
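The matching of a control template to the data format of the resource data might be sketched as follows (the formats and template shapes are hypothetical examples, not the claimed implementation):

```python
# Hypothetical sketch: the control engine inspects the data format of a
# piece of resource data and renders it with the control template matched
# to that format, producing a control for the first scene picture.

TEMPLATES = {
    "image": lambda d: {"control": "image", "source": d},
    "text":  lambda d: {"control": "text", "value": d},
    "video": lambda d: {"control": "video", "source": d},
}

def render_resource(resource_data, data_format):
    template = TEMPLATES.get(data_format)
    if template is None:
        raise ValueError("no control template for format " + repr(data_format))
    return template(resource_data)

control = render_resource("game.png", "image")
```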
In some embodiments, the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, as shown in fig. 8, the apparatus further includes:
and the execution unit 704 is configured to execute, by the control engine, in the case that the event trigger condition is currently satisfied, a control instruction corresponding to the event trigger condition.
In some embodiments, the scene picture is a first scene picture corresponding to first scene data in the data model, the event trigger condition associated with the first scene data includes a transition trigger condition, the control instruction corresponding to the transition trigger condition includes a transition control instruction, and the transition control instruction instructs to switch and display a second scene picture corresponding to second scene data in the data model; and an execution unit 704 configured to execute switching and displaying, by the control engine, the first scene picture as the second scene picture based on the transition control instruction in a case where the transition trigger condition is currently satisfied.
In some embodiments, the transition control instruction further indicates a transition mode between the first scene picture and the second scene picture, and the execution unit 704 is configured to execute switching and displaying, by the control engine, the first scene picture as the second scene picture in the transition mode based on the transition control instruction in a case where the transition trigger condition is currently satisfied.
In some embodiments, the execution unit 704 is configured to perform any of:
the transition mode is a first transition mode, and the control engine, in a case where the transition trigger condition is currently satisfied, displays the second scene picture while ceasing to display the first scene picture, based on the transition control instruction;
the transition mode is a second transition mode, and the control engine, in a case where the transition trigger condition is currently satisfied, plays the transition animation matched with the second transition mode and displays the second scene picture after the transition animation has finished playing.
In some embodiments, the apparatus further comprises:
a determining unit 705 configured to perform, for each piece of data in the data model, determining that the data belongs to the target data type in a case where the data includes a target character string.
in some embodiments, the control engine runs in the target application, as shown in fig. 8, the apparatus further comprises:
a determining unit 705 configured to determine, by the control engine, a display area set by the target application, the display area being used to display the scene picture;
a display unit 703 configured to display, by the control engine, the scene picture in the display area based on the data model.
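The units above describe an end-to-end flow: detect the target data type via a target character string, create the control engine per the instruction information, then display the scene picture in the area set by the target application. A hypothetical sketch under those assumptions (the target string, engine interface, and dictionary keys are all illustrative):

```python
# Illustrative sketch of the flow described by the units above.
# TARGET_STRING, the config/data-model keys, and the engine interface
# are assumptions, not the actual disclosed format.

TARGET_STRING = "scene"  # hypothetical target character string

def belongs_to_target_type(piece):
    """A piece of data belongs to the target data type when it
    contains the target character string."""
    return TARGET_STRING in str(piece)

def display_scene(config, data_model, create_engine):
    """Create the control engine only if the data model contains data
    of the target data type, then display the scene picture in the
    display area set by the target application."""
    if any(belongs_to_target_type(p) for p in data_model["data"]):
        engine = create_engine(config["instruction_info"])
        area = engine.get_display_area()   # area set by the target application
        engine.display(data_model, area)   # render the scene picture there
        return engine
    return None
```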
With regard to the apparatuses in the above embodiments, the specific manner in which each unit performs its operations has been described in detail in the embodiments of the method and is not repeated here.
In an exemplary embodiment, there is also provided an electronic device including:
one or more processors;
volatile or non-volatile memory for storing one or more processor-executable instructions;
wherein the one or more processors are configured to perform the steps performed by the terminal in the above-described scene picture display method.
In some embodiments, the electronic device is provided as a terminal. Fig. 9 is a block diagram illustrating a structure of a terminal 900 according to an exemplary embodiment. The terminal 900 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 900 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
The terminal 900 includes: a processor 901 and a memory 902.
The processor 901 may include one or more processing cores, for example, a 4-core processor or an 8-core processor. The processor 901 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 901 may also include a main processor and a coprocessor: the main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 901 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 902 may include one or more computer-readable storage media, which may be non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 902 is used to store at least one program code for execution by the processor 901 to implement the scene picture display method provided by the method embodiments in the present disclosure.
In some embodiments, terminal 900 can also optionally include: a peripheral interface 903 and at least one peripheral. The processor 901, memory 902, and peripheral interface 903 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 903 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 904, a display screen 905, a camera assembly 906, an audio circuit 907, a positioning assembly 908, and a power supply 909.
The peripheral interface 903 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 901 and the memory 902. In some embodiments, the processor 901, memory 902, and peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902 and the peripheral interface 903 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The radio frequency circuit 904 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 904 communicates with communication networks and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 904 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may also include NFC (Near Field Communication) related circuits, which is not limited by the present disclosure.
The display screen 905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 905 is a touch display screen, it also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 901 as a control signal for processing. In this case, the display screen 905 may also be used to provide virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 905, disposed on the front panel of the terminal 900; in other embodiments, there may be at least two display screens 905, each disposed on a different surface of the terminal 900 or in a foldable design; in still other embodiments, the display screen 905 may be a flexible display disposed on a curved or folded surface of the terminal 900. The display screen 905 may even be arranged in a non-rectangular irregular shape, that is, an irregularly-shaped screen. The display screen 905 may be an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, or the like.
The camera assembly 906 is used to capture images or video. Optionally, the camera assembly 906 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fusion shooting functions. In some embodiments, the camera assembly 906 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 907 may include a microphone and a speaker. The microphone is used to collect sound waves of the user and the environment, convert the sound waves into electrical signals, and input them to the processor 901 for processing, or to the radio frequency circuit 904 for voice communication. For stereo acquisition or noise reduction purposes, there may be multiple microphones disposed at different locations of the terminal 900. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. The speaker may be a traditional diaphragm speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert an electrical signal not only into sound waves audible to humans but also into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 907 may also include a headphone jack.
The positioning component 908 is used to locate the current geographic location of the terminal 900 for navigation or LBS (Location Based Service). The positioning component 908 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 909 is used to supply power to the various components in the terminal 900. The power supply 909 may use alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 909 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charge technology.
In some embodiments, terminal 900 can also include one or more sensors 910. The one or more sensors 910 include, but are not limited to: acceleration sensor 911, gyro sensor 912, pressure sensor 913, fingerprint sensor 914, optical sensor 915, and proximity sensor 916.
The acceleration sensor 911 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the terminal 900. For example, the acceleration sensor 911 may be used to detect the components of the gravitational acceleration on the three coordinate axes. The processor 901 can control the display screen 905 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 911. The acceleration sensor 911 may also be used to collect game or user motion data.
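As a minimal illustration of the landscape/portrait decision above (not the device's actual firmware logic, which would also apply hysteresis and use the z axis), the orientation can be picked from the gravity components:

```python
def choose_view(gravity_x, gravity_y):
    """Pick portrait vs. landscape from the gravity components along the
    device's short (x) and long (y) edges. A simplifying sketch: real
    systems also consider the z axis and apply hysteresis."""
    return "portrait" if abs(gravity_y) >= abs(gravity_x) else "landscape"
```

When the device is upright, gravity acts mostly along the long edge, so the portrait view is chosen; tilting the device sideways flips the dominant component and yields landscape.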
The gyro sensor 912 may detect a body direction and a rotation angle of the terminal 900, and the gyro sensor 912 may cooperate with the acceleration sensor 911 to acquire a 3D motion of the user on the terminal 900. The processor 901 can implement the following functions according to the data collected by the gyro sensor 912: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 913 may be disposed on a side bezel of the terminal 900 and/or beneath the display screen 905. When the pressure sensor 913 is disposed on the side bezel of the terminal 900, a user's holding signal on the terminal 900 can be detected, and the processor 901 performs left-hand/right-hand recognition or shortcut operations according to the holding signal collected by the pressure sensor 913. When the pressure sensor 913 is disposed beneath the display screen 905, the processor 901 controls the operability controls on the UI according to the user's pressure operation on the display screen 905. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 914 is used to collect the user's fingerprint, and the processor 901 identifies the user according to the fingerprint collected by the fingerprint sensor 914, or the fingerprint sensor 914 identifies the user according to the collected fingerprint. Upon recognizing the user's identity as a trusted identity, the processor 901 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 914 may be disposed on the front, back, or side of the terminal 900. When a physical button or vendor logo is provided on the terminal 900, the fingerprint sensor 914 may be integrated with the physical button or vendor logo.
The optical sensor 915 is used to collect ambient light intensity. In one embodiment, the processor 901 may control the display brightness of the display screen 905 based on the ambient light intensity collected by the optical sensor 915. Specifically, when the ambient light intensity is high, the display brightness of the display screen 905 is increased; when the ambient light intensity is low, the display brightness of the display screen 905 is reduced. In another embodiment, the processor 901 can also dynamically adjust the shooting parameters of the camera assembly 906 according to the ambient light intensity collected by the optical sensor 915.
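The brightness adjustment described above can be sketched as a simple monotonic mapping from ambient illuminance to a brightness fraction. The constants (floor, ceiling, saturation point) are illustrative assumptions, not values from the disclosure:

```python
def display_brightness(lux, min_b=0.05, max_b=1.0, max_lux=1000.0):
    """Map ambient illuminance (lux) to a display brightness fraction:
    brighter surroundings -> brighter screen. All constants are
    illustrative; real devices use tuned, often nonlinear, curves."""
    clamped = min(max(lux, 0.0), max_lux)  # clamp to [0, max_lux]
    return min_b + (max_b - min_b) * (clamped / max_lux)
```

The floor `min_b` keeps the screen readable in darkness, and the clamp saturates the brightness once ambient light exceeds `max_lux`.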
The proximity sensor 916, also called a distance sensor, is disposed on the front panel of the terminal 900. The proximity sensor 916 is used to collect the distance between the user and the front face of the terminal 900. In one embodiment, when the proximity sensor 916 detects that the distance between the user and the front face of the terminal 900 gradually decreases, the processor 901 controls the display screen 905 to switch from the screen-on state to the screen-off state; when the proximity sensor 916 detects that the distance gradually increases, the processor 901 controls the display screen 905 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 9 does not constitute a limitation of terminal 900, and may include more or fewer components than those shown, or may combine certain components, or may employ a different arrangement of components.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium. When instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the steps performed by the terminal in the above-described scene picture display method. For example, the storage medium may be a ROM, a RAM (Random Access Memory), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, there is also provided a computer program product. When instructions of the computer program product are executed by a processor of an electronic device, the electronic device is enabled to perform the steps performed by the terminal in the above-described scene picture display method.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A scene picture display method, the method comprising:
acquiring configuration information, wherein the configuration information comprises a target data type and instruction information, and the instruction information indicates an instruction for creating a control engine under the condition that data belonging to the target data type is acquired;
acquiring a data model, wherein the data model comprises at least one piece of data, the at least one piece of data comprises scene data, and the scene data is used for rendering a scene picture;
creating the control engine based on the instruction information if it is determined that the at least one piece of data belongs to the target data type;
and displaying a scene picture corresponding to the scene data based on the data model through the control engine.
2. The method according to claim 1, wherein the displaying, by the control engine, the scene picture corresponding to the scene data based on the data model comprises:
determining, by the control engine, first scene data from a plurality of pieces of scene data included in the data model;
displaying, by the control engine, a first scene screen based on the first scene data.
3. The method of claim 2, wherein the first scene data includes resource data, and wherein displaying, by the control engine, a first scene screen based on the first scene data comprises:
displaying, by the control engine, a first scene screen including a resource corresponding to the resource data based on the first scene data.
4. The method of claim 3, wherein the displaying, by the control engine and based on the first scene data, a first scene picture including a resource corresponding to the resource data comprises:
determining, by the control engine, a data format to which the resource data belongs;
rendering the resource data into a control in the first scene picture according to a control template matched with the data format through the control engine, wherein the control contains a resource corresponding to the resource data.
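The format-to-template matching in claim 4 can be sketched as a lookup from the determined data format to a control template. The format names, template identifiers, and registry structure are all hypothetical:

```python
# Hypothetical sketch of claim 4's control rendering: determine the
# resource data's format, then render via the matched control template.
# Formats, template names, and the registry are illustrative assumptions.

CONTROL_TEMPLATES = {
    "image": "ImageControlTemplate",
    "text": "TextControlTemplate",
    "video": "VideoControlTemplate",
}

def render_resource(resource_data):
    """Determine the data format, then describe the control that the
    matched template would produce in the first scene picture."""
    fmt = resource_data["format"]
    template = CONTROL_TEMPLATES.get(fmt)
    if template is None:
        raise ValueError(f"no control template matched with format {fmt!r}")
    # A real engine would instantiate the template; here we just return
    # the control description containing the resource.
    return {"template": template, "resource": resource_data["resource"]}
```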
5. The method according to claim 1, wherein the data model further includes an event trigger condition associated with the scene data and a control instruction corresponding to the event trigger condition, and after the scene picture corresponding to the scene data is displayed based on the data model by the control engine, the method further comprises:
and executing a control instruction corresponding to the event trigger condition through the control engine under the condition that the event trigger condition is currently met.
6. The method according to claim 5, wherein the scene picture is a first scene picture corresponding to first scene data in the data model, the event trigger condition associated with the first scene data comprises a transition trigger condition, the control instruction corresponding to the transition trigger condition comprises a transition control instruction, and the transition control instruction instructs switching to a second scene picture corresponding to second scene data in the data model; and the executing, by the control engine, the control instruction corresponding to the event trigger condition in a case where the event trigger condition is currently satisfied comprises:
switching, by the control engine, from the first scene picture to the second scene picture based on the transition control instruction in a case where the transition trigger condition is currently satisfied.
7. A scene picture display apparatus, comprising:
an acquisition unit configured to execute acquisition of configuration information including a target data type and instruction information indicating an instruction for creating a control engine in a case where data belonging to the target data type is acquired;
the obtaining unit is further configured to perform obtaining a data model, wherein the data model comprises at least one piece of data, and the at least one piece of data comprises scene data, and the scene data is used for rendering a scene picture;
a creating unit configured to perform, in a case where it is determined that the at least one piece of data belongs to the target data type, creating the control engine based on the instruction information;
a display unit configured to perform displaying, by the control engine, a scene screen corresponding to the scene data based on the data model.
8. An electronic device, characterized in that the electronic device comprises:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the scene picture display method according to any one of claims 1 to 6.
9. A non-transitory computer-readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the scene picture display method according to any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the scene picture display method according to any one of claims 1 to 6 when executed by a processor.
CN202210344596.1A 2022-03-31 2022-03-31 Scene picture display method and device, electronic equipment and storage medium Active CN114816622B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210344596.1A CN114816622B (en) 2022-03-31 2022-03-31 Scene picture display method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114816622A true CN114816622A (en) 2022-07-29
CN114816622B CN114816622B (en) 2024-04-30

Family

ID=82532078

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210344596.1A Active CN114816622B (en) 2022-03-31 2022-03-31 Scene picture display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114816622B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105844694A (en) * 2015-08-24 2016-08-10 鲸彩在线科技(大连)有限公司 Game data generating method, game data uploading method, game data generating device, and game data uploading device
CN110062271A (en) * 2019-04-28 2019-07-26 腾讯科技(成都)有限公司 Method for changing scenes, device, terminal and storage medium
CN110941464A (en) * 2018-09-21 2020-03-31 阿里巴巴集团控股有限公司 Light exposure method, device, system and storage medium
CN111432001A (en) * 2020-03-24 2020-07-17 北京字节跳动网络技术有限公司 Method, apparatus, electronic device, and computer-readable medium for jumping scenes
CN111767503A (en) * 2020-07-29 2020-10-13 腾讯科技(深圳)有限公司 Game data processing method and device, computer and readable storage medium
CN112023399A (en) * 2020-08-21 2020-12-04 上海完美时空软件有限公司 Game scene data processing method and device, storage medium and computer equipment

Also Published As

Publication number Publication date
CN114816622B (en) 2024-04-30

Similar Documents

Publication Publication Date Title
CN109982102B (en) Interface display method and system for live broadcast room, live broadcast server and anchor terminal
CN109660855B (en) Sticker display method, device, terminal and storage medium
CN110097428B (en) Electronic order generation method, device, terminal and storage medium
CN109327608B (en) Song sharing method, terminal, server and system
CN108694073B (en) Control method, device and equipment of virtual scene and storage medium
CN109646944B (en) Control information processing method, control information processing device, electronic equipment and storage medium
CN108897597B (en) Method and device for guiding configuration of live broadcast template
CN108900925B (en) Method and device for setting live broadcast template
CN112016941A (en) Virtual article pickup method, device, terminal and storage medium
CN110321126B (en) Method and device for generating page code
CN110732136B (en) Method, device, terminal and storage medium for previewing in-office behavior in out-office environment
CN112181573A (en) Media resource display method, device, terminal, server and storage medium
CN111880888B (en) Preview cover generation method and device, electronic equipment and storage medium
CN108717365B (en) Method and device for executing function in application program
CN113395566B (en) Video playing method and device, electronic equipment and computer readable storage medium
CN113613028B (en) Live broadcast data processing method, device, terminal, server and storage medium
CN113409427B (en) Animation playing method and device, electronic equipment and computer readable storage medium
CN110750734A (en) Weather display method and device, computer equipment and computer-readable storage medium
CN114116053A (en) Resource display method and device, computer equipment and medium
CN111083526B (en) Video transition method and device, computer equipment and storage medium
CN112257006A (en) Page information configuration method, device, equipment and computer readable storage medium
CN111368114A (en) Information display method, device, equipment and storage medium
CN112023403A (en) Battle process display method and device based on image-text information
CN108228052B (en) Method and device for triggering operation of interface component, storage medium and terminal
CN113538633B (en) Animation playing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant