CN112306324A - Information processing method, apparatus, device and medium - Google Patents

Info

Publication number
CN112306324A
Authority
CN
China
Prior art keywords
information
interaction
template
user interface
action
Prior art date
Legal status
Pending
Application number
CN202011197443.6A
Other languages
Chinese (zh)
Inventor
刘凯
李长顺
陈靖
李泽琛
韩青暖
张矗
刘伟民
方绍晟
Current Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Original Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Didi Infinity Technology and Development Co Ltd
Priority to CN202011197443.6A
Publication of CN112306324A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Abstract

Embodiments of the present disclosure relate to methods, apparatuses, devices, and media for information processing. The method comprises the following steps: obtaining description information related to a user interface to be presented at a terminal device; generating a template of the user interface based on the descriptive information, the template comprising: view information associated with a view presented on a user interface, interaction information associated with an interaction event supported by the user interface, and action information associated with an action triggered by the interaction event; and providing the template to the terminal device to cause the terminal device to present a user interface based on the template. In this manner, a user interface may be generated flexibly and efficiently.

Description

Information processing method, apparatus, device and medium
Technical Field
Embodiments of the present disclosure relate generally to information processing, and more particularly, to an information processing method, apparatus, electronic device, and computer storage medium.
Background
Applications on terminal devices often need to present various user interfaces or pages of low complexity. For example, applications providing carpooling (hitch) services have many operational scenarios that require presenting the user with interfaces such as a security task page, a subsidy activity page, or a pull-up activity page, for instance on the home page, the waiting-list page, or the order detail page. Such applications also need to present a large number of similar, relatively simple user interfaces, for example a novice guide page introducing new functionality to the user, or a penalty result page showing that the user has been penalized for a violation.
Because the operational scenarios differ, the presented content also differs to match the characteristics of each scenario, and that content is typically delivered through a simple user interface. Traditionally, however, the ability to generate such user interfaces has been limited.
Disclosure of Invention
According to an embodiment of the present disclosure, an information processing scheme is provided.
In a first aspect of the present disclosure, an information processing method is provided. The method comprises the following steps: obtaining description information related to a user interface to be presented at a terminal device; generating a template of the user interface based on the descriptive information, the template comprising: view information associated with a view presented on a user interface, interaction information associated with an interaction event supported by the user interface, and action information associated with an action triggered by the interaction event; and providing the template to the terminal device to cause the terminal device to present a user interface based on the template.
In a second aspect of the present disclosure, an information processing method is provided. The method comprises the following steps: obtaining a template of a user interface to be presented at a terminal device, the template comprising: view information associated with a view presented on a user interface, interaction information associated with an interaction event supported by the user interface, and action information associated with an action triggered by the interaction event; parsing the template to extract the view information, the interaction information, and the action information; and causing the terminal device to present a user interface based on the view information, the interaction information, and the action information.
In a third aspect of the present disclosure, an apparatus for information processing is provided. The device includes: a description information acquisition module configured to acquire description information related to a user interface to be presented at a terminal device; a generation module configured to generate a template of the user interface based on the description information, the template including: view information associated with a view presented on a user interface, interaction information associated with an interaction event supported by the user interface, and action information associated with an action triggered by the interaction event; and a providing module configured to provide the template to the terminal device to cause the terminal device to present a user interface based on the template.
In a fourth aspect of the present disclosure, an apparatus for information processing is provided. The device includes: a template acquisition module configured to acquire a template of a user interface to be presented at a terminal device, the template comprising: view information associated with a view presented on a user interface, interaction information associated with an interaction event supported by the user interface, and action information associated with an action triggered by the interaction event; a parsing module configured to parse the template to extract view information, interaction information, and action information; and a presentation module configured to cause the terminal device to present a user interface based on the view information, the interaction information, and the action information.
In a fifth aspect of the present disclosure, an electronic device is provided. The electronic device includes: one or more processors; and memory for storing one or more programs that, when executed by the one or more processors, cause the electronic device to implement the method according to the first aspect of the disclosure.
In a sixth aspect of the present disclosure, an electronic device is provided. The electronic device includes: one or more processors; and memory for storing one or more programs that, when executed by the one or more processors, cause the electronic device to implement a method according to the second aspect of the disclosure.
In a seventh aspect of the present disclosure, a computer readable medium is provided, on which a computer program is stored, which program, when executed by a processor, performs the method according to the first aspect of the present disclosure.
In an eighth aspect of the present disclosure, a computer-readable medium is provided, on which a computer program is stored which, when executed by a processor, implements a method according to the second aspect of the present disclosure.
It should be understood that the statements herein reciting aspects are not intended to limit the critical or essential features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, like or similar reference characters designate like or similar elements, and wherein:
FIG. 1 illustrates a schematic diagram of an exemplary environment in which embodiments of the present disclosure can be implemented;
FIG. 2 illustrates a flow diagram of a method for generating and providing a template according to some embodiments of the present disclosure;
FIG. 3 illustrates a schematic diagram of an example of a nesting relationship of view information, interaction information, and action information, in accordance with some embodiments of the present disclosure;
FIG. 4 illustrates a schematic diagram of another example of a nested relationship of view information, interaction information, and action information, in accordance with some embodiments of the present disclosure;
FIG. 5 illustrates a flow diagram of a method for presenting a user interface, in accordance with some embodiments of the present disclosure;
FIG. 6 shows a schematic diagram of an example of a process for presenting a user interface, in accordance with some embodiments of the present disclosure;
FIG. 7 shows a schematic diagram of an example of a user interface according to some embodiments of the present disclosure;
FIG. 8 shows a schematic diagram of an example of a process for optimizing debugging in accordance with some embodiments of the present disclosure;
FIG. 9 illustrates a block diagram of an apparatus for generating and providing templates, according to some embodiments of the present disclosure;
FIG. 10 illustrates a block diagram of an apparatus for presenting a user interface, in accordance with some embodiments of the present disclosure; and
FIG. 11 illustrates a block diagram of an electronic device capable of implementing embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
In describing embodiments of the present disclosure, the term "include" and its variants are to be read as open-ended, that is, "including but not limited to". The term "based on" is to be read as "based at least in part on". The terms "one embodiment" and "the embodiment" are to be read as "at least one embodiment". The terms "first", "second", and the like may refer to different or to the same objects. Other explicit and implicit definitions may also be included below.
As described above, the conventional ability to generate a user interface that meets the requirements of an operational scenario is limited. Common user interfaces consist of basic elements such as images, text, and buttons, but the layout, the content, and the actions triggered by interacting with those elements may vary.
Generally, such a user interface can be generated in three ways. In the first way, if only images need to be presented, image presentation functionality can be pre-embedded at certain locations in the user interface to support providing different images to the user. However, this approach has very limited capabilities and can hardly meet new demands that arise during subsequent use. For example, if only one button is pre-embedded in the user interface but a new requirement calls for adding another button, the requirement cannot be satisfied. Likewise, if the new requirement (e.g., placing a phone call) is not in the pre-built function list, it cannot be satisfied.
In the second way, an HTML5 (HyperText Markup Language 5) container may be pre-embedded at some location in the user interface to present more complex content to the user. Although pre-embedded H5 containers support more complex requirements, the loading speed of H5 is typically lower than that of native client development, and its consumption of storage resources is higher. Especially when two or more H5 containers are present in one user interface, this may cause stuttering or crashes, thereby compromising the user experience.
In the third way, developers perform native client development according to specific needs. However, native client code can only be released together with the application itself. Updating the user interface therefore takes too long (e.g., a month) and cannot satisfy the need to update the user interface in a timely manner.
It can be seen that the conventional approaches all have drawbacks. Ideally, on the one hand, the user interface should not be constrained by the application release schedule and should be updatable in time even for applications that have already been released; on the other hand, it should offer good performance without degrading the user experience.
There are other ways in conventional client technology to achieve the above. For example, a framework such as React Native or Weex may be used. However, such frameworks demand a high level of skill from developers, and because their technical implementation is complex, maintenance costs are also high. Moreover, these frameworks rely on a built-in JavaScript (JS) engine for real-time dynamic parsing, so their performance is lower than that of native client development.
Furthermore, open-source projects such as Tangram may also be used. However, the dynamic capability of such open-source projects only supports changing the view of the user interface, not changing the interactions and the corresponding actions. For example, while a button icon may be changed dynamically, the action triggered by clicking the button cannot be changed.
In summary, the conventional methods either cannot support timely updates of the user interface in a simple, fast, and maintainable manner, or suffer from high development labor cost, an inability to anticipate future requirements, or a degraded user experience.
To this end, embodiments of the present disclosure provide a scheme for information processing. In this approach, a computing device of a developer may obtain descriptive information about a user interface to be presented at a terminal device of a user and generate a template of the user interface based on the descriptive information. The template includes: view information associated with a view presented on a user interface, interaction information associated with an interaction event supported by the user interface, and action information associated with an action triggered by the interaction event. Thus, the computing device can provide the template to the terminal device to cause the terminal device to present a user interface based on the template. The user's terminal device may acquire and parse the template to extract the view information, interaction information, and action information. Thus, the terminal device can present a user interface based on the view information, the interaction information, and the action information.
In this way, in the present solution, timely updating of the user interface can be supported in a simple, fast, and maintainable manner without reducing the user experience. Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
FIG. 1 illustrates a schematic diagram of an exemplary environment 100 in which embodiments of the present disclosure can be implemented. Included in the exemplary environment 100 are a computing device 120 and a terminal device 140. Computing device 120 may be used, for example, by a developer to develop user interface 145 to be presented on terminal device 140, and terminal device 140 may be used, for example, by a user to present user interface 145. Computing device 120 and terminal device 140 may contain at least a processor, memory, and other components typically found in a general purpose computer to implement computing, storage, communication, control, and the like functions. For example, the computing device 120 and the terminal device 140 may be smart phones, tablet computers, personal computers, desktop computers, notebook computers, servers, mainframes, distributed computing systems, and the like.
In particular, the computing device 120 may obtain descriptive information 110 related to a user interface 145 to be presented at the user's terminal device 140 and generate a template 130 of the user interface 145 based on the descriptive information 110. The template 130 includes: view information associated with a view (e.g., button, picture, text, etc.) presented on the user interface 145, interaction information associated with an interaction event (e.g., click, double-click, slide, drag, zoom, etc.) supported by the user interface 145, and action information associated with an action (e.g., place a call, navigate to a page, present detailed information, etc.) triggered by the interaction event.
Thus, computing device 120 can provide the template 130 to the terminal device 140 to cause the terminal device 140 to present the user interface 145 based on the template 130. Specifically, the user's terminal device 140 may obtain and parse the template 130 to extract the view information, interaction information, and action information. The terminal device 140 may then present the user interface 145 based on the view information, the interaction information, and the action information.
Hereinafter, the operation of computing device 120 will be described in conjunction with fig. 2-4, and the operation of terminal device 140 will be described in conjunction with fig. 5-8.
Fig. 2 illustrates a flow diagram of a method 200 for generating and providing the template 130, according to some embodiments of the present disclosure. The method 200 may be implemented by the computing device 120 as shown in FIG. 1. Alternatively, method 200 may be implemented by subjects other than computing device 120. It should be understood that method 200 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
At 210, computing device 120 obtains descriptive information 110 relating to user interface 145 to be presented at terminal device 140. In some embodiments, the computing device 120 may determine the views, interaction events, and actions based on the functionality to be provided by the user interface 145. For example, the function to be provided may be "make phone call when clicking a button", in which case the view is "button", the interaction event is "click", and the action is "make phone call".
The computing device 120 may then generate the descriptive information 110 in accordance with a predetermined protocol based on the views, interaction events, and actions. The descriptive information 110 may include view information associated with views (e.g., buttons, pictures, text, etc.) presented on the user interface 145, interaction information associated with interaction events (e.g., click, double-click, slide, drag, zoom, etc.) supported by the user interface 145, and action information associated with actions (e.g., placing a call, navigating to a page, presenting detailed information, etc.) triggered by the interaction events.
The predetermined protocol may indicate a subordination relationship among the view information, the interaction information, and the action information in the description information 110. In some embodiments, the subordination relationship may be a nesting relationship. Fig. 3 illustrates a schematic diagram 300 of an example of a nesting relationship of view information, interaction information, and action information, according to some embodiments of the present disclosure. As shown in FIG. 3, action information 330 is nested within interaction information 320, and interaction information 320 is nested within view information 310.
FIG. 4 illustrates a schematic diagram 400 of another example of a nested relationship of view information, interaction information, and action information, according to some embodiments of the present disclosure. As shown in FIG. 4, the description information 110 may include a view set 410 consisting of one or more pieces of view information (e.g., view information 1 420 through view information N 470, where N is an integer greater than 1). Each piece of view information may have one or more attributes (e.g., attribute 1 430 and attribute 2 440), such as the size, identifier, or fetch address of the view. Furthermore, each piece of view information may nest or bind one or more pieces of interaction information (e.g., interaction information 450), and each piece of interaction information may nest or bind one or more pieces of action information (e.g., action information 460). It should be understood that the numbers of view information, attributes, interaction information, and action information shown in FIG. 4 are only an example; the description information 110 may include more or fewer of each.
Thus, the functions to be provided by the user interface 145 are organized as data having a hierarchical nesting relationship, namely the description information 110. In some embodiments, the description information 110 may be written in a language such as XML, JSON, or YAML. In the description information 110, the view information, the interaction information, and the action information may be regarded as nodes. For example, in XML-style description information 110, view nodes (e.g., a view node delimited by <text> and </text>, or a self-closing view node such as <view .../>) are contained under a <render> node; an interaction node (e.g., one delimited by <onClick> and </onClick>) is contained under a view node; and an action node (e.g., a self-closing node such as <dialog .../>) is contained under the interaction node.
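As an illustration, a minimal sketch of such description information in its XML form, using the node names mentioned above; the attribute names (id, content, src, and so on) and the <call> action node are assumptions made for this example rather than a schema defined by the disclosure:

    <!-- Hypothetical description information 110. Only <render>, <text>, <view>,
         <onClick> and <dialog> come from the text above; everything else is illustrative. -->
    <render>
      <!-- A text view; clicking it triggers a pop-up dialog action. -->
      <text id="greeting" content="Hello DreamBox" size="16">
        <onClick>
          <dialog title="Notice" message="Welcome"/>
        </onClick>
      </text>
      <!-- An image view; clicking it triggers a place-a-call action (hypothetical node). -->
      <view id="callButton" src="https://example.com/icon.png" width="48" height="48">
        <onClick>
          <call number="..."/>
        </onClick>
      </view>
    </render>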
Referring back to FIG. 2, at 220, the computing device 120 generates the template 130 of the user interface 145 based on the descriptive information 110. Since the description information 110 includes view information, interaction information, and action information, the template 130 also includes view information, interaction information, and action information.
The template 130 is compiled and optimized from the description information 110, and may have the following advantages:
an optimized data structure, which improves the parsing speed and processing performance of the terminal device 140;
verification information, which improves the stability and security of the terminal device 140;
version information of a parser capable of parsing the template 130, which guarantees compatibility; and
automatic compression, which saves network traffic and speeds up rendering.
To this end, in some embodiments, computing device 120 may determine a hash value (e.g., an MD5 value) of the description information 110 as the verification information to provide the verification functionality. Furthermore, the computing device 120 may determine version information of a parser, associated with the terminal device 140, that is capable of parsing the template 130, e.g., the minimum parser version that can parse the template 130. The computing device 120 may then generate the template 130 based on the hash value, the version information, and the description information 110.
For example, the generated template 130 may package the hash value, the parser version information, and the compressed description information 110 together in a single format.
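A minimal Kotlin sketch of such packaging follows; the envelope field names (checksum, minParserVersion, payload) are illustrative assumptions, not the format actually used by the disclosure:

    import java.io.ByteArrayOutputStream
    import java.security.MessageDigest
    import java.util.zip.Deflater

    // Hypothetical template envelope; the field names are assumptions for this sketch.
    data class Template(
        val checksum: String,        // MD5 hash of the description information (verification info)
        val minParserVersion: Int,   // lowest parser version capable of parsing this template
        val payload: ByteArray       // compressed description information 110
    )

    fun md5Hex(data: ByteArray): String =
        MessageDigest.getInstance("MD5").digest(data).joinToString("") { "%02x".format(it) }

    fun compress(data: ByteArray): ByteArray {
        val deflater = Deflater().apply { setInput(data); finish() }
        val out = ByteArrayOutputStream()
        val buffer = ByteArray(1024)
        while (!deflater.finished()) out.write(buffer, 0, deflater.deflate(buffer))
        deflater.end()
        return out.toByteArray()
    }

    // Sketch of template generation at the computing device 120 (step 220).
    fun generateTemplate(descriptionInfo: String, minParserVersion: Int): Template {
        val bytes = descriptionInfo.toByteArray(Charsets.UTF_8)
        return Template(
            checksum = md5Hex(bytes),
            minParserVersion = minParserVersion,
            payload = compress(bytes)
        )
    }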
at 230, computing device 120 can provide template 130 to end device 140 to cause end device 140 to present user interface 145 based on template 130. The operations performed after the terminal device 140 receives the template 130 will be described in conjunction with fig. 5-7.
Fig. 5 illustrates a flow diagram of a method 500 for presenting a user interface 145 according to some embodiments of the present disclosure. Method 500 may be implemented by terminal device 140 as shown in fig. 1. Alternatively, method 500 may be implemented by a subject other than terminal device 140. It should be understood that method 500 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
At 510, the terminal device 140 obtains a template 130 of the user interface 145 to be presented at the terminal device 140. As described above, the template 130 includes: view information associated with a view presented on the user interface 145, interaction information associated with interaction events supported by the user interface 145, and action information associated with actions triggered by the interaction events. At 520, the terminal device 140 parses the template 130 to extract the view information, the interaction information, and the action information. At 530, the terminal device 140 presents the user interface 145 based on the view information, the interaction information, and the action information.
In some embodiments, the operations performed by the terminal device 140 may be performed by a parser in the terminal device 140. Fig. 6 shows a schematic diagram of an example of a process 600 for presenting a user interface, according to some embodiments of the present disclosure.
The parser 610 in fig. 6 may obtain the template 130 of the user interface 145 to be presented at the terminal device 140 and then parse the template 130 to extract the view information, the interaction information, and the action information. In some embodiments, the parser 610 may first extract the description information 110 from the template 130. For example, the parser 610 may extract from the template 130 the version information of a parser, associated with the terminal device 140, that is capable of parsing the template 130, and may then check whether its own version is higher than or equal to the minimum version indicated by that version information, to ensure that it can parse the description information 110 and thereby guarantee compatibility.
Furthermore, in some cases the template 130 received by the parser 610 may contain errors, for example due to poor network conditions. To address this, the parser 610 may extract from the template 130 the hash value associated with the description information 110 as well as the candidate description information. The parser 610 may then compute the hash value of the candidate description information and compare it with the extracted hash value to determine whether the two match. In the case of a match, the parser 610 may consider the candidate description information to be correct and take it as the description information 110, thereby ensuring security and stability.
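A rough Kotlin sketch of these two checks on the parser side, reusing the hypothetical envelope fields (checksum, minParserVersion, payload) assumed in the earlier template sketch:

    import java.io.ByteArrayOutputStream
    import java.security.MessageDigest
    import java.util.zip.Inflater

    const val PARSER_VERSION = 3  // hypothetical version of the parser 610 on this terminal device

    // Returns the description information 110 if the template passes the compatibility and
    // integrity checks described above, or null otherwise.
    fun extractDescription(checksum: String, minParserVersion: Int, payload: ByteArray): String? {
        // Compatibility check: refuse templates that require a newer parser.
        if (minParserVersion > PARSER_VERSION) return null

        // Integrity check: recompute the hash of the candidate description information
        // and compare it with the hash value carried in the template.
        val candidate = inflate(payload)
        val digest = MessageDigest.getInstance("MD5")
            .digest(candidate).joinToString("") { "%02x".format(it) }
        return if (digest == checksum) String(candidate, Charsets.UTF_8) else null
    }

    fun inflate(data: ByteArray): ByteArray {
        val inflater = Inflater().apply { setInput(data) }
        val out = ByteArrayOutputStream()
        val buffer = ByteArray(1024)
        while (!inflater.finished()) {
            val n = inflater.inflate(buffer)
            if (n == 0 && inflater.needsInput()) break  // malformed input; stop rather than spin
            out.write(buffer, 0, n)
        }
        inflater.end()
        return out.toByteArray()
    }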
Then, the parser 610 may parse the description information 110 according to the predetermined protocol to extract the view information, the interaction information, and the action information. Thus, the parser 610 may cause the terminal device 140 to present the user interface 145 based on the view information, the interaction information, and the action information.
As described above, in the descriptive information 110, action information is nested in interaction information, and interaction information is nested in view information. Parser 610 may preserve the nested relationship between view information, interaction information, and action information. Further, the parser 610 may also retain the binding relationship between the interaction information and the action information to ensure that the desired action can be triggered when the related interaction event occurs.
Further, the parser 610 may generate native view information based on the view information according to the rules of the native view processor 630 in the terminal device 140, thereby constructing the view information in a format that the native view processor 630 can accept. The parser 610 may also bind the native view information and the interaction information according to such rules, so that when the user triggers an interaction event the expected response can be produced.
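A platform-neutral Kotlin sketch of this binding step; the data classes and the NativeView abstraction below are hypothetical stand-ins for the parsed description information and for whatever types the native view processor 630 actually accepts:

    // Hypothetical parsed structures mirroring the nesting in the description information 110.
    data class ActionInfo(val type: String, val params: Map<String, String>)
    data class InteractionInfo(val event: String, val actions: List<ActionInfo>)
    data class ViewInfo(
        val type: String,                       // e.g. "text" or "view"
        val attributes: Map<String, String>,    // e.g. size, identifier, fetch address
        val interactions: List<InteractionInfo>
    )

    // Hypothetical native-side abstraction; a real implementation would map onto the
    // platform's own view classes and event listeners.
    class NativeView(val type: String, val attributes: Map<String, String>) {
        var onEvent: ((String) -> Unit)? = null
    }

    // Generate native view information from the view information and bind it to the
    // interaction information, preserving the view / interaction / action nesting.
    fun buildNativeView(view: ViewInfo, performAction: (ActionInfo) -> Unit): NativeView {
        val native = NativeView(view.type, view.attributes)
        native.onEvent = { event ->
            view.interactions
                .filter { it.event == event }   // e.g. an "onClick" interaction
                .flatMap { it.actions }         // the action information bound to it
                .forEach(performAction)         // e.g. pop up a dialog, place a call
        }
        return native
    }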
Thus, the parser 610 may provide the bound native view information 620 to the view processor 630, so that the view processor 630 renders the view and the terminal device 140 presents the user interface 145. Fig. 7 shows a schematic diagram 700 of an example of a user interface 145 according to some embodiments of the present disclosure. As shown in FIG. 7, user interface 710 is an example of user interface 145 when no interaction event has occurred.
In some embodiments, there may be multiple views in the user interface 145. In this case, the parser 610 may also combine the bound native view information for the multiple views according to the rules of the view processor 630, for example, to determine the order, layout, etc. of the multiple views. The parser 610 then provides the combined, bound native view information 620 to the view processor 630.
In addition to rendering the view, the view processor 630 may also listen for interaction events and, upon detecting one, send an interaction event notification to the parser 610. The interaction event notification indicates an interaction by the user that occurred on the user interface 145. The parser 610 may receive the interaction event notification from the view processor 630, determine the interaction information associated with the interaction based on the notification, and determine the action information associated with (e.g., bound to) that interaction information. The parser 610 may then perform the action associated with the action information.
For example, when the view processor 630 detects that the text "Hello DreamBox" is clicked, it sends an interaction event notification to the parser 610. Upon receiving the notification, the parser 610 may determine the interaction information associated with the click and the pop-up-dialog action bound to that interaction information. Thus, the parser 610 may pop up a dialog window, thereby presenting the user interface 720.
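Continuing with the hypothetical types from the previous sketch, the 'Hello DreamBox' flow could be exercised roughly as follows; the println stands in for whatever native dialog API the platform provides:

    fun main() {
        // View information for the "Hello DreamBox" text, with a dialog action bound to click.
        val greeting = ViewInfo(
            type = "text",
            attributes = mapOf("content" to "Hello DreamBox"),
            interactions = listOf(
                InteractionInfo(
                    event = "onClick",
                    actions = listOf(ActionInfo("dialog", mapOf("message" to "Welcome")))
                )
            )
        )

        // The parser builds and binds the native view; the lambda executes the bound action.
        val native = buildNativeView(greeting) { action ->
            if (action.type == "dialog") println("pop up dialog: ${action.params["message"]}")
        }

        // The view processor would invoke this callback when it detects the click;
        // here the interaction event notification is simulated directly.
        native.onEvent?.invoke("onClick")   // prints: pop up dialog: Welcome
    }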
In this manner, the present solution can support timely updating of the user interface in a simple, fast, and maintainable manner without degrading the user experience, relying solely on the descriptive information 110 and without using a runtime language parsing engine such as a JS engine.
In the above, the process of developing at the computing device 120 and presenting the user interface 145 at the terminal device 140 has been described. Furthermore, the computing device 120 may also optimize development debugging. Conventionally, each time a new data or code file is written, the terminal-device code project must be recompiled before the effect can be seen. A complete build of the code project typically takes a long time (e.g., several minutes), and implementing the intended function usually requires several such builds, so a lot of time is wasted in the debugging phase.
Fig. 8 shows a schematic diagram of an example of a process 800 for optimizing debugging in accordance with some embodiments of the present disclosure. The computing device 120 may listen to the description information 110 to determine whether it has changed relative to the previous description information. If the description information 110 has changed, the computing device 120 may regenerate the template 130 based on the changed description information 110.
The computing device 120 may present the changed description information 110. The description information 110 may be developer-friendly and readable, to facilitate checking the authored content and debugging problems.
Further, the computing device 120 can also present an address 820 (e.g., a QR code address) for obtaining the regenerated template 130, to enable another terminal device 830 used for debugging (e.g., a developer's terminal device such as a smartphone) to obtain the regenerated template 130 via the address 820.
In some embodiments, the terminal device 830 used for debugging may be bound, via the presented address, to the template 130 corresponding to the description information 110. After regenerating the template 130, the computing device 120 can provide the regenerated template 130 to the bound terminal device 830. For example, the computing device 120 may have a broadcaster. The broadcaster may attempt to deliver the regenerated template 130 to one or more of the bound terminal devices. The broadcaster may also respond to query requests received from one or more bound terminal devices by passing the latest template 130 to the requesting terminal device.
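A Kotlin sketch of this debugging loop, assuming the description information lives in a single file and reusing the hypothetical Template type and generateTemplate function from the earlier sketch; pushToBoundDevices stands in for the broadcaster, whose transport is not detailed here:

    import java.nio.file.FileSystems
    import java.nio.file.Path
    import java.nio.file.StandardWatchEventKinds.ENTRY_MODIFY

    // Watch the description information file; when it changes, regenerate the template
    // and deliver it to the bound debugging terminal devices.
    fun watchAndRebuild(descriptionFile: Path, pushToBoundDevices: (Template) -> Unit) {
        val watcher = FileSystems.getDefault().newWatchService()
        descriptionFile.parent.register(watcher, ENTRY_MODIFY)

        while (true) {
            val key = watcher.take()   // blocks until the directory reports a change
            val changed = key.pollEvents().any { (it.context() as? Path) == descriptionFile.fileName }
            key.reset()
            if (!changed) continue

            val descriptionInfo = descriptionFile.toFile().readText()   // changed description information 110
            val template = generateTemplate(descriptionInfo, minParserVersion = 1)
            pushToBoundDevices(template)   // e.g. the broadcaster delivering to bound devices
        }
    }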
In this way, after the description information 110 is changed, the corresponding regenerated template 130 may be provided directly to the terminal device 830 used for debugging, and that device can render based on the regenerated template 130, thereby providing a real-time preview. Debugging time can thus be significantly reduced (e.g., to the order of seconds), and the latest effect can be previewed in real time as the description information 110 changes, without recompiling the product code project.
Fig. 9 illustrates a block diagram of an apparatus 900 for information processing according to some embodiments of the present disclosure. For example, the apparatus 900 may be disposed in the computing device 120. As shown in fig. 9, the apparatus 900 includes a description information obtaining module 910 configured to obtain description information related to a user interface to be presented at the terminal device; a generating module 920 configured to generate a template of the user interface based on the description information, the template including: view information associated with a view presented on a user interface, interaction information associated with an interaction event supported by the user interface, and action information associated with an action triggered by the interaction event; and a providing module 930 configured to provide the template to the terminal device to cause the terminal device to present a user interface based on the template.
In some embodiments, the description information obtaining module 910 includes: a determination module configured to determine views, interaction events and actions based on a function to be provided by a user interface; and the description information generation module is configured to generate the description information according to a predetermined protocol based on the view, the interaction event and the action, wherein in the description information corresponding to the template, the action information is nested in the interaction information, and the interaction information is nested in the view information.
In some embodiments, the generation module 920 includes: a hash value determination module configured to determine a hash value of the description information; a version information determination module configured to determine version information of a parser, associated with the terminal device, that is capable of parsing the template; and a template generation module configured to generate the template based on the hash value, the version information, and the description information.
In some embodiments, the apparatus 900 further comprises: a change determination module configured to determine whether the description information changes with respect to the previous description information; and the description information presenting module is configured to present the changed description information if the description information changes.
In some embodiments, the apparatus 900 further comprises: a regeneration module configured to regenerate the template based on the changed description information; an address presenting module configured to present an address for acquiring the regenerated template to enable another terminal device to acquire the regenerated template via the address.
Fig. 10 illustrates a block diagram of an apparatus 1000 for information processing according to some embodiments of the present disclosure. For example, the apparatus 1000 may be provided in the terminal device 140. As shown in fig. 10, the apparatus 1000 includes a template obtaining module 1010 configured to obtain a template of a user interface to be presented at a terminal device, the template including: view information associated with a view presented on a user interface, interaction information associated with an interaction event supported by the user interface, and action information associated with an action triggered by the interaction event; a parsing module 1020 configured to parse the template to extract view information, interaction information, and action information; and a presentation module 1030 configured to cause the terminal device to present a user interface based on the view information, the interaction information, and the action information.
In some embodiments, parsing module 1020 includes: an extraction module configured to extract the description information from the template; and a description information parsing module configured to parse the description information according to a predetermined protocol to extract the view information, the interaction information, and the action information, wherein in the description information the action information is nested in the interaction information and the interaction information is nested in the view information.
In some embodiments, the extraction module comprises: a template extraction module configured to extract from the template a hash value associated with the description information, version information of a parser, associated with the terminal device, capable of parsing the template, and candidate description information; and a description information determination module configured to determine the candidate description information as the description information if the version indicated by the version information is not higher than the version of the parser on the terminal device and the hash value matches the hash value of the candidate description information.
In some embodiments, the presentation module 1030 comprises: a native view information generation module configured to generate native view information based on the view information according to a rule of a native view processor in the terminal device; a binding module configured to bind the native view information and the interaction information according to a rule; and the native view information providing module is configured to provide the bound native view information to the view processor, so that the view processor renders the view, and the terminal device presents the user interface.
In some embodiments, the apparatus 1000 further comprises: a receiving module configured to receive an interactivity event notification from a view processor, the interactivity event notification indicating an interaction by a user occurring on a user interface; an interaction information determination module configured to determine interaction information associated with the interaction based on the interaction event notification; an action information determination module configured to determine action information associated with the interaction information based on the interaction information; and an execution module configured to execute the action associated with the action information.
FIG. 11 shows a schematic block diagram of an electronic device 1100 that may be used to implement embodiments of the present disclosure. Device 1100 may be used to implement apparatus 900 of fig. 9 and apparatus 1000 of fig. 10. As shown, device 1100 includes a Central Processing Unit (CPU) 1101 that can perform various appropriate actions and processes in accordance with computer program instructions stored in a Read Only Memory (ROM) 1102 or loaded from a storage unit 1108 into a Random Access Memory (RAM) 1103. In the RAM 1103, various programs and data necessary for the operation of the device 1100 may also be stored. The CPU 1101, ROM 1102, and RAM 1103 are connected to each other by a bus 1104. An input/output (I/O) interface 1105 is also connected to bus 1104.
A number of components in device 1100 connect to I/O interface 1105, including: an input unit 1106 such as a keyboard, a mouse, and the like; an output unit 1107 such as various types of displays, speakers, and the like; a storage unit 1108 such as a magnetic disk, optical disk, or the like; and a communication unit 1109 such as a network card, a modem, a wireless communication transceiver, and the like. The communication unit 1109 allows the device 1100 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The various processes and methods described above, such as methods 200 and/or 500, may be performed by processing unit 1101. For example, in some embodiments, methods 200 and/or 500 may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 1108. In some embodiments, part or all of the computer program may be loaded and/or installed onto device 1100 via ROM 1102 and/or communication unit 1109. When the computer program is loaded into RAM 1103 and executed by CPU 1101, one or more steps of methods 200 and/or 500 described above may be performed. Alternatively, in other embodiments, CPU 1101 may be configured to perform methods 200 and/or 500 in any other suitable manner (e.g., by way of firmware).
The present disclosure may be methods, apparatus, systems, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for carrying out various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, the electronic circuitry that can execute the computer-readable program instructions implements aspects of the present disclosure by utilizing the state information of the computer-readable program instructions to personalize the electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA).
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (24)

1. An information processing method comprising:
obtaining description information related to a user interface to be presented at a terminal device;
generating a template of the user interface based on the descriptive information, the template comprising:
view information associated with a view presented on the user interface,
interaction information associated with interaction events supported by the user interface, an
Action information associated with an action triggered by the interaction event; and
providing the template to the terminal device to cause the terminal device to present the user interface based on the template.
2. The method of claim 1, wherein obtaining the description information comprises:
determining the view, the interaction event, and the action based on a function to be provided by the user interface;
generating the description information according to a predetermined protocol based on the view, the interaction event, and the action, the action information being nested in the interaction information and the interaction information being nested in the view information in the description information corresponding to the template.
3. The method of claim 1, wherein generating the template comprises:
determining a hash value of the description information;
determining version information of a parser which is associated with the terminal device and is capable of parsing the template; and
generating the template based on the hash value, the version information, and the description information.
4. The method of claim 1, further comprising:
determining whether the description information changes with respect to previous description information;
and if the description information changes, presenting the changed description information.
5. The method of claim 4, further comprising:
regenerating the template based on the changed description information;
presenting an address for acquiring the regenerated template to enable another terminal device to acquire the regenerated template via the address.
6. An information processing method comprising:
obtaining a template of a user interface to be presented at a terminal device, the template comprising:
view information associated with a view presented on the user interface,
interaction information associated with interaction events supported by the user interface, an
Action information associated with an action triggered by the interaction event;
parsing the template to extract the view information, the interaction information, and the action information; and
causing the terminal device to present the user interface based on the view information, the interaction information, and the action information.
7. The method of claim 6, wherein parsing the template to extract the view information, the interaction information, and the action information comprises:
extracting the description information from the template; and
parsing the description information according to a predetermined protocol to extract the view information, the interaction information, and the action information, wherein in the description information the action information is nested in the interaction information and the interaction information is nested in the view information.
8. The method of claim 7, wherein extracting the description information from the template comprises:
extracting from the template:
a hash value associated with the descriptive information,
version information of a parser associated with the terminal device capable of parsing the template, an
Candidate description information;
and determining the candidate description information as the description information if the version indicated by the version information is not higher than the version of a parser on the terminal device and the hash value matches the hash value of the candidate description information.
9. The method of claim 6, wherein causing the terminal device to present the user interface comprises:
generating native view information based on the view information according to rules of a native view processor in the terminal device;
binding the native view information and the interaction information according to the rule; and
providing the bound native view information to the view processor, so that the view processor renders the view and the terminal device presents the user interface.
10. The method of claim 9, further comprising:
receiving an interactivity event notification from the view processor, the interactivity event notification indicating the interaction by a user that occurred on the user interface;
determining the interaction information associated with the interaction based on the interaction event notification;
determining the action information associated with the interaction information based on the interaction information; and
performing the action associated with the action information.
11. An apparatus for information processing, comprising:
a description information acquisition module configured to acquire description information related to a user interface to be presented at a terminal device;
a generation module configured to generate a template of the user interface based on the description information, the template comprising:
view information associated with a view presented on the user interface,
interaction information associated with interaction events supported by the user interface, and
action information associated with an action triggered by the interaction event; and
a providing module configured to provide the template to the terminal device to cause the terminal device to present the user interface based on the template.
12. The apparatus of claim 11, wherein the description information acquisition module comprises:
a determination module configured to determine the view, the interaction event, and the action based on a function to be provided by the user interface; and
a description information generation module configured to generate the description information according to a predetermined protocol based on the view, the interaction event, and the action, wherein, in the description information corresponding to the template, the action information is nested in the interaction information and the interaction information is nested in the view information.
13. The apparatus of claim 11, wherein the generation module comprises:
a hash value determination module configured to determine a hash value of the description information;
a version information determination module configured to determine version information of a parser associated with the terminal device capable of parsing the template; and
a template generation module configured to generate the template based on the hash value, the version information, and the description information.
14. The apparatus of claim 11, further comprising:
a change determination module configured to determine whether the description information changes relative to previous description information; and
a description information presentation module configured to present the changed description information if the description information changes.
15. The apparatus of claim 14, further comprising:
a regeneration module configured to regenerate the template based on the changed description information; and
an address presenting module configured to present an address for acquiring the regenerated template to enable another terminal device to acquire the regenerated template via the address.
16. An apparatus for information processing, comprising:
a template acquisition module configured to acquire a template of a user interface to be presented at a terminal device, the template comprising:
view information associated with a view presented on the user interface,
interaction information associated with interaction events supported by the user interface, and
action information associated with an action triggered by the interaction event;
a parsing module configured to parse the template to extract the view information, the interaction information, and the action information; and
a presentation module configured to cause the terminal device to present the user interface based on the view information, the interaction information, and the action information.
17. The apparatus of claim 16, wherein the parsing module comprises:
an extraction module configured to extract the description information from the template; and
a description information parsing module configured to parse the description information according to a predetermined protocol to extract the view information, the interaction information, and the action information, in which the action information is nested in the interaction information and the interaction information is nested in the view information.
18. The apparatus of claim 17, wherein the extraction module comprises:
a template extraction module configured to extract from the template:
a hash value associated with the description information,
version information of a parser associated with the terminal device capable of parsing the template, and
candidate description information; and
a description information determination module configured to determine the candidate description information as the description information if the version indicated by the version information is not higher than the version of the parser on the terminal device and the hash value matches a hash value of the candidate description information.
19. The apparatus of claim 16, wherein the presentation module comprises:
a native view information generation module configured to generate native view information based on the view information according to rules of a native view processor in the terminal device;
a binding module configured to bind the native view information and the interaction information according to the rule; and
a native view information providing module configured to provide the bound native view information to the view processor, so that the view processor renders the view to cause the terminal device to present the user interface.
20. The apparatus of claim 19, further comprising:
a receiving module configured to receive an interaction event notification from the view processor, the interaction event notification indicating an interaction by a user that occurred on the user interface;
an interaction information determination module configured to determine the interaction information associated with the interaction based on the interaction event notification;
an action information determination module configured to determine the action information associated with the interaction information based on the interaction information; and
an execution module configured to execute the action associated with the action information.
21. An electronic device, the electronic device comprising:
one or more processors; and
memory storing one or more programs that, when executed by the one or more processors, cause the electronic device to implement the method of any of claims 1-5.
22. An electronic device, the electronic device comprising:
one or more processors; and
memory storing one or more programs that, when executed by the one or more processors, cause the electronic device to implement the method of any of claims 6-10.
23. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of claims 1-5.
24. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of claims 6-10.
CN202011197443.6A 2020-10-30 2020-10-30 Information processing method, apparatus, device and medium Pending CN112306324A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011197443.6A CN112306324A (en) 2020-10-30 2020-10-30 Information processing method, apparatus, device and medium

Publications (1)

Publication Number Publication Date
CN112306324A (en) 2021-02-02

Family

ID=74333544

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011197443.6A Pending CN112306324A (en) 2020-10-30 2020-10-30 Information processing method, apparatus, device and medium

Country Status (1)

Country Link
CN (1) CN112306324A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination