WO2022042162A1 - 用户接口界面实现方法及装置 (Method and Apparatus for Implementing a User Interface) - Google Patents

用户接口界面实现方法及装置 (Method and Apparatus for Implementing a User Interface)

Info

Publication number
WO2022042162A1
WO2022042162A1 · PCT/CN2021/108273 · CN2021108273W
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
interface
description file
application
control
Prior art date
Application number
PCT/CN2021/108273
Other languages
English (en)
French (fr)
Inventor
熊石一
汤博
陈晓晓
殷志华
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Priority claimed from CN202011475517.8A (CN114115870A)
Application filed by 华为技术有限公司
Priority to EP21860002.1A (EP4191400A4)
Priority to US18/042,929 (US20230325209A1)
Publication of WO2022042162A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 — Arrangements for executing specific programs
    • G06F 9/451 — Execution arrangements for user interfaces
    • G06F 8/00 — Arrangements for software engineering
    • G06F 8/30 — Creation or generation of source code
    • G06F 8/38 — Creation or generation of source code for implementing user interfaces
    • G06F 8/60 — Software deployment
    • G06F 8/61 — Installation

Definitions

  • The present application relates to the technical field of terminals, and in particular, to a method and apparatus for implementing a user interface.
  • A developer usually develops an application (App) based on a certain operating system (OS) platform.
  • In App development, a very important task is developing the user interface (UI) of the App.
  • The developer uses a software development kit (SDK) provided by the OS platform to develop the UI of the App.
  • UI development mainly includes interface description and interface behavior definition.
  • Interface description refers to the use of an interface description language to describe the layout of the UI, the controls used, and the visual style of the layout and controls.
  • Interface behavior definition refers to using interface description language to define interface behavior; interface behavior includes dynamic changes of UI and responses of electronic devices to dynamic changes of UI (such as responses to user operations on UI).
  • Each OS platform has its corresponding interface description language; for example, one platform uses the extensible markup language (xml) format, while another uses an embedded domain-specific language (EDSL) built into Swift for interface description and interface behavior definition.
  • The UI engine provided by the OS platform can interpret and execute the interface description language of the UI, and render the UI for presentation to the user.
  • Each OS platform also has corresponding programming languages for implementing interface behavior, realizing dynamic changes of the UI, and responding to user operations on the UI; for example, one platform implements interface behavior using JAVA, another using the Swift programming language.
  • The embodiments of the present application provide a user interface implementation method and apparatus, which can provide rich UI programming capabilities and make it convenient for developers to develop a UI that is compatible with the operating system and rich in functions.
  • the application adopts the following technical solutions:
  • The present application provides a user interface implementation method, including: an electronic device obtains an application installation package of a first application, where the application installation package includes a first description file and a second description file; the first description file and the second description file are used to perform interface description and interface behavior definition on a first user interface (UI) of the first application; the first description file adopts a first interface description language, and the second description file adopts a second interface description language; the first interface description language is different from the second interface description language.
  • The electronic device runs the first application; wherein the first UI engine of the electronic device reads, parses, and executes the first description file to generate a first part of the first UI; the second UI engine of the electronic device reads, parses, and executes the second description file to generate a second part of the first UI; and the electronic device displays the first UI.
  • the operating system of the electronic device includes two UI engines, which respectively parse and execute two different interface description languages.
  • One of the UI engines can be the UI engine of a generic OS, which parses a common interface description language; the other is an extended UI engine independent of the OS platform, which parses the DSL.
  • developers can use the basic interface description language to describe the UI layout, included controls, etc.; and optionally use DSL to apply custom UI programming capabilities to some controls, add some animation effects to the UI, and so on.
  • the extended UI engine provided by the embodiment of the present application is not related to the OS platform, so it can adapt to various OS platforms, the technical realization difficulty is low, and it is convenient for developers to use.
  • generating the first part of the first UI by the first UI engine of the electronic device includes: the first UI engine of the electronic device generating one or more first controls in the first UI according to the first description file; Wherein, one or more first controls have the first UI programming capability.
  • For example, the first control is a control generated by a general-purpose OS.
  • For example, the first UI programming capability is a UI programming capability supported by a general-purpose OS; for example, setting the length, width, height, spacing, and color of the control; selecting the control; and entering text on the control.
  • The second UI engine of the electronic device generating the second part of the first UI includes: the second UI engine of the electronic device applies the second UI programming capability to the one or more first controls according to the second description file. That is, in the second description file, the developer can use the custom interface description language to apply custom UI programming capabilities to controls generated by the general-purpose OS, so as to expand the capabilities of general-purpose OS controls and enrich their effects.
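The patent text publishes no source code; the cooperation described above can be sketched as follows, with all file formats, field names, and function names invented purely for illustration:

```python
# Illustrative sketch only: the first (generic) UI engine instantiates
# controls from the first description file, then the second (extended)
# UI engine decorates those same controls with custom capabilities
# declared in the second description file.

def first_engine_build(first_desc):
    """Generic engine: create controls with their basic attributes."""
    return {c["id"]: dict(c) for c in first_desc["controls"]}

def second_engine_extend(controls, second_desc):
    """Extended engine: apply custom capabilities to existing controls."""
    for ext in second_desc["extensions"]:
        controls[ext["id"]].update(ext["capabilities"])
    return controls

# Hypothetical parsed contents of the two description files.
first_desc = {"controls": [{"id": "btn", "width": 120, "height": 48}]}
second_desc = {"extensions": [{"id": "btn",
                               "capabilities": {"animation": "click_rebound"}}]}

ui = second_engine_extend(first_engine_build(first_desc), second_desc)
```

Note that the second engine only decorates controls the first engine already built, mirroring the claim that the custom capability is applied on top of general-purpose OS controls rather than replacing them.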
  • The second UI engine of the electronic device generating the second part of the first UI includes: the second UI engine of the electronic device generates one or more second controls in the first UI according to the second description file; wherein the one or more second controls have the second UI programming capability.
  • The second control is a custom control provided by the OEM OS in this embodiment of the application; it supports the customized second UI programming capability and rich control effects.
  • the second UI programming capability includes at least one of visual attribute capability, layout capability, unified interaction capability and dynamic effect capability.
  • the layout capability is used to describe the layout of the controls in the UI; for example, the shape, position, and size of the controls.
  • the visual property capability is used to describe the visual properties of the control; for example, the color, grayscale and other visual effects of the control.
  • Unified Interaction capabilities are used to provide control responses based on user actions; such as performing searches based on the user's "confirmation" action.
  • The dynamic effect capability is used to display animation effects on the control; for example, displaying a click-rebound animation effect on the control.
  • The layout capability includes at least one of stretching, hiding, wrapping, equal division, proportion, and extension.
  • Stretching refers to the ability of a control's width and height to be enlarged or reduced in different proportions.
  • Hiding refers to the ability of a control to be visible or invisible in the display interface.
  • Wrapping refers to the ability of a control's content to wrap onto multiple lines in the display interface.
  • Equal division refers to the ability of controls to be evenly distributed in the display interface.
  • Proportion refers to the ability of a control to occupy a specified percentage of the total layout in a specified direction.
  • Extension refers to the ability of a control's content to extend in the display interface.
  • The second UI engine triggers the first UI engine to read the first description file, and the second UI engine reads the second description file. That is, the distribution process is controlled by the second UI engine, which triggers the first UI engine and the second UI engine to parse and execute their respective description files.
  • the first description file and the second description file have different paths in the application installation package.
  • the first UI engine and the second UI engine respectively read description files in different paths according to preset rules.
  • the first description file and the second description file are preset with different tags.
  • the first UI engine and the second UI engine respectively read the corresponding description files according to the preset tags.
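As a hedged illustration of the two routing conventions above (the actual paths and tag syntax are not disclosed in the patent; the `layout/` and `dsl/` directories and the `#dsl` marker are assumptions for this sketch):

```python
# Illustrative sketch: dispatch description files between the two UI
# engines, either by path prefix or by a preset tag in the file itself.

def route_description_files(files):
    """Split description files between the two UI engines.

    Rule 1 (by path): files under "dsl/" go to the second (extended)
    UI engine; other files go to the first (generic) UI engine.
    Rule 2 (by tag): a file whose first line is the "#dsl" tag goes to
    the second engine regardless of its path.
    """
    first_engine, second_engine = [], []
    for path, content in files.items():
        first_line = content.splitlines()[0] if content else ""
        if first_line.strip() == "#dsl" or path.startswith("dsl/"):
            second_engine.append(path)
        else:
            first_engine.append(path)
    return first_engine, second_engine

# Hypothetical installation-package contents.
files = {
    "layout/main.xml": "<LinearLayout>...</LinearLayout>",
    "dsl/main_ext.dsl": "#dsl\nTextView { animation: click_rebound }",
}
first, second = route_description_files(files)
```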
  • the second UI engine also performs syntax verification on the second interface description language; if the syntax verification passes, the second UI engine parses and executes the second description file.
  • the second UI engine of the electronic device implements the mapping between the device event and the user behavior in the second description file; in response to the device event, executes the control action corresponding to the user behavior in the second description file.
  • In this way, the OEM OS can map events triggered on different electronic devices to the same user behavior (for example, mapping a mouse double-click event on a PC and a finger tap event on a mobile phone both to the "confirm" behavior). This avoids developers having to define the correspondence between device events and user behaviors separately for each form of electronic device, which would duplicate effort; it makes the same description file applicable to electronic devices of various forms, reducing development difficulty and bringing convenience to developers.
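A minimal sketch of this event-to-behavior mapping, with the device types, event names, and behavior vocabulary all invented for illustration:

```python
# Illustrative sketch: device-specific input events are normalized to a
# unified user behavior, and the control action bound to that behavior
# in the description file is then invoked.

DEVICE_EVENT_TO_BEHAVIOR = {
    ("pc", "mouse_double_click"): "confirm",
    ("phone", "finger_tap"): "confirm",
    ("tv", "remote_ok_press"): "confirm",
}

def dispatch(device_type, event, handlers):
    """Translate a raw device event into a user behavior, then run the
    control action bound to that behavior (e.g. {"confirm": do_search})."""
    behavior = DEVICE_EVENT_TO_BEHAVIOR.get((device_type, event))
    if behavior is None:
        return None  # unmapped event: no control action
    action = handlers.get(behavior)
    return action() if action else None

# A double-click on a PC triggers the same "confirm" action a tap would.
result = dispatch("pc", "mouse_double_click",
                  {"confirm": lambda: "search executed"})
```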
  • the second UI engine includes a set of syntax and semantic specifications for fields in the second description file.
  • developers can develop UIs on the OEM OS platform according to the OEM OS syntax and semantics specifications.
  • The first interface description language is the extensible markup language (xml), and the second interface description language is a domain-specific language (DSL).
  • The present application provides a user interface implementation method, including: displaying a development interface of a first application, where the development interface of the first application includes a first description file and a second description file; the first description file and the second description file are used to perform interface description and interface behavior definition for a first user interface (UI) of the first application; the first description file adopts a first interface description language, and the second description file adopts a second interface description language; the first interface description language is different from the second interface description language; in response to a first operation input by the user, a description of a first part of the first UI is added to the first description file; in response to a second operation input by the user, a description of a second part of the first UI is added to the second description file; and an application installation package of the first application is generated according to the first description file and the second description file.
  • developers can jointly develop UI using two different interface description languages.
  • One of the languages is the basic interface description language supported by a general-purpose OS, and the other language is the custom interface description language.
  • Developers can use the basic interface description language to describe the UI layout, included controls, etc.; and optionally use DSL to apply custom UI programming capabilities to some controls, add some animation effects to the UI, and so on.
  • Since the custom interface description language is not related to the OS platform, it can adapt to a variety of OS platforms; the technical implementation difficulty is low, and it is convenient for developers to use.
  • Adding a description of the first part of the first UI to the first description file includes: adding, to the first description file, a description of one or more first controls in the first UI, and applying the first UI programming capability to the one or more first controls.
  • For example, the first control is a control supported by a general-purpose OS.
  • For example, the first UI programming capability is a UI programming capability supported by a general-purpose OS; for example, setting the length, width, height, spacing, and color of the control; selecting the control; and entering text on the control.
  • Adding a description of the second part of the first UI to the second description file includes: adding, to the second description file, a description that applies the second UI programming capability to the one or more first controls.
  • the developer can use the custom interface description language to apply the custom UI programming capability to the controls generated by the general OS in the second description file, so as to expand the capabilities of the general OS controls and enrich the use effects of the general OS controls.
  • Adding a description of the second part of the first UI to the second description file includes: adding, to the second description file, a description of one or more second controls, and applying the second UI programming capability to the one or more second controls.
  • The second control is a custom control provided by the OEM OS in this embodiment of the application; it supports the customized second UI programming capability and rich control effects.
  • the second UI programming capability includes at least one of visual attribute capability, layout capability, unified interaction capability and dynamic effect capability.
  • The layout capability includes at least one of stretching, hiding, wrapping, equal division, proportion, and extension.
  • the first description file and the second description file have different paths in the application installation package.
  • the first UI engine and the second UI engine of the OEM OS can respectively read files in different paths according to preset rules to obtain corresponding description files.
  • the first description file and the second description file are preset with different tags.
  • the first UI engine and the second UI engine of the OEM OS can respectively read the corresponding description files according to the preset tags.
  • The present application provides a computer-readable storage medium, including computer instructions for performing interface description and interface behavior definition on a first user interface (UI) of a first application, wherein the computer instructions include a first instruction stored in a first description file and a second instruction stored in a second description file; the first description file adopts a first interface description language, and the second description file adopts a second interface description language; the first interface description language is different from the second interface description language; the first instruction is used to describe a first part of the first UI, and the second instruction is used to describe a second part of the first UI.
  • developers use two different interface description languages to jointly develop UI.
  • One of the interface description languages is the basic interface description language supported by a general-purpose OS, and the other interface description language is the custom interface description language.
  • Developers use the basic interface description language to describe the UI layout, included controls, etc., and optionally use the DSL to apply custom UI programming capabilities to some controls, add animation effects to the UI, and so on. Since the custom interface description language is not related to the OS platform, it can adapt to a variety of OS platforms; the technical implementation difficulty is low, and it is convenient for developers to use.
  • the first instruction is specifically used to: describe one or more first controls in the first UI, and apply the first UI programming capability to the one or more first controls.
  • For example, the first control is a control supported by a general-purpose OS.
  • For example, the first UI programming capability is a UI programming capability supported by a general-purpose OS; for example, setting the length, width, height, spacing, and color of the control; selecting the control; and entering text on the control.
  • the second instruction is specifically used to: apply the second UI programming capability to one or more first controls.
  • the second instruction is specifically used to: describe one or more second controls in the first UI, and apply the second UI programming capability to the one or more second controls.
  • In this way, developers can use a custom interface description language to apply custom UI programming capabilities to the controls generated by the general-purpose OS in the second description file, expanding the capabilities of general-purpose OS controls; they can also add custom controls with rich control effects.
  • the second UI programming capability includes at least one of visual attribute capability, layout capability, unified interaction capability and dynamic effect capability.
  • Layout capabilities include at least one of: stretching, hiding, wrapping, equal division, proportion, and extension.
  • the first description file and the second description file have different paths in the computer-readable storage medium.
  • the first UI engine and the second UI engine of the OEM OS can respectively read files in different paths according to preset rules to obtain corresponding description files.
  • the first description file and the second description file are preset with different tags.
  • the first UI engine and the second UI engine of the OEM OS can respectively read the corresponding description files according to the preset tags.
  • The present application provides a computer-readable storage medium, for example, an application development tool. The application development tool may include computer instructions which, when run on the above-mentioned electronic device, cause the electronic device to execute the method of the above-mentioned first aspect.
  • The present application provides an electronic device, comprising: a display screen, an input device, one or more processors, one or more memories, and one or more computer programs; wherein the processor is coupled to the input device, the display screen, and the memory; the above-mentioned one or more computer programs are stored in the memory; and when the electronic device runs, the processor can execute the one or more computer programs stored in the memory, so that the electronic device executes the method described in any one of the above first aspect.
  • The present application provides an electronic device, comprising: a display screen, one or more processors, one or more memories, and one or more computer programs; wherein the processor is coupled to both the display screen and the memory; the above-mentioned one or more computer programs are stored in the memory; and when the electronic device runs the above-mentioned first application, the processor can execute the one or more computer programs stored in the memory, so that the electronic device executes the method described in any one of the above second aspect.
  • The embodiments of the present application provide a user interface implementation method and apparatus, which can realize one-time development and multi-device deployment; that is, a single set of interface description files can be developed and applied to various types of electronic devices, reducing the development difficulty for developers.
  • the application adopts the following technical solutions:
  • The present application provides a user interface implementation method, including: a first electronic device and a second electronic device respectively download an application installation package of a first application from a server, and respectively install the application installation package.
  • the application installation package includes a description file and a resource file; wherein, the description file is used to perform interface description and interface behavior definition on the first UI of the first application; the resource file includes the resources used to generate the UI of the first application.
  • The first electronic device reads a first code corresponding to the device type of the first electronic device in the description file, and uses the resources of the resource file to generate the first UI of the first electronic device according to the definition of the first code; the second electronic device reads a second code corresponding to the device type of the second electronic device in the description file, and uses the resources of the resource file to generate the first UI of the second electronic device according to the definition of the second code; the device type of the first electronic device is different from that of the second electronic device.
  • the device type of the electronic device may include a mobile phone, a smart TV, a smart watch, a tablet computer, a notebook computer, a netbook, a large screen, a car computer, and the like.
  • The method further includes: the first electronic device generates a first control in the first UI of the first electronic device according to the definition of a third code in the description file, and the first control in the first UI of the first electronic device has a control attribute customized by the operating system of the first electronic device; a third electronic device generates the first control in the first UI of the third electronic device according to the definition of the third code in the description file, and the first control in the first UI of the third electronic device has a control attribute of a general-purpose operating system; wherein the third code is part or all of the first code.
  • the first control is defined in the description file to support the control property customized by the operating system.
  • That is, the operating system of the first electronic device provides custom control attributes, and the first control in the first UI of the first electronic device has the control attribute customized by the operating system of the first electronic device; the third electronic device supports a general-purpose operating system, and the first control in the first UI of the third electronic device has the control attribute of the general-purpose operating system.
  • In this way, the same description file can run successfully on different operating systems, achieving cross-OS-platform operation and reducing the development difficulty for developers.
  • The present application provides a user interface implementation method, comprising: a first electronic device downloads an application installation package of a first application and installs the application installation package; wherein the application installation package includes a description file and a resource file; the description file is used for interface description and interface behavior definition for a first user interface (UI) of the first application; and the resource file includes resources used to generate the UI of the first application.
  • the first electronic device reads the first code corresponding to the device type of the first electronic device in the description file, and generates the first UI of the first electronic device by using the resources of the resource file according to the definition of the first code.
  • the electronic device reads the code corresponding to the device type of the electronic device in the description file.
  • In this way, different electronic devices can present different UI layouts by reading the same description file; a single set of description files serves various types of electronic devices, reducing the development difficulty for developers.
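The per-device-type selection described above might look like the following sketch; the section keys and layout fields are assumptions for illustration, not the patent's actual description-file format:

```python
# Illustrative sketch: one description file carries per-device-type
# sections; each device reads only the section matching its own type,
# falling back to a default section when no device-specific code exists.

DESCRIPTION_FILE = {
    "phone":  {"columns": 1, "font_size": 16},
    "watch":  {"columns": 1, "font_size": 12},
    "tablet": {"columns": 2, "font_size": 18},
    "default": {"columns": 1, "font_size": 14},
}

def load_ui_spec(device_type):
    """Return the UI layout definition for this device type."""
    return DESCRIPTION_FILE.get(device_type, DESCRIPTION_FILE["default"])

spec = load_ui_spec("tablet")  # a tablet gets the two-column layout
```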
  • The first electronic device generates a first control in the first UI of the first electronic device according to the definition of a third code in the description file, and the first control in the first UI of the first electronic device has a control attribute customized by the operating system of the first electronic device; wherein the third code is part or all of the first code.
  • the first control is defined in the description file to support the control property customized by the operating system.
  • the operating system of the first electronic device provides customized control attributes, and the first control in the first UI of the first electronic device has the control attributes customized by the operating system of the first electronic device.
  • The operating system of the first electronic device includes a custom UI programming capability, and the custom UI programming capability is used to provide the control attributes customized by the operating system of the first electronic device.
  • the custom control properties include at least one of visual properties, layout properties, interaction properties, dynamic properties, and software and hardware dependency properties.
  • The layout attributes include at least one of stretching, hiding, wrapping, equal division, proportion, and extension.
  • The first UI of the first electronic device includes a second control, and the second control has a control attribute of a general-purpose operating system. That is to say, the first UI generated by the first electronic device according to the description file may include controls with control attributes customized by the operating system of the first electronic device, as well as controls with control attributes of a general-purpose operating system, providing controls in more forms.
  • the first UI of the first electronic device includes a third control, and the third control has a custom control attribute in the first application.
  • the developer can customize the properties of the controls belonging to the first application in the files of the installation package, so that the UI is richer.
  • The description file includes a fourth code, which is used to define the correspondence between the control attribute of the fourth control in the first UI of the first electronic device and the first data in the operating system of the first electronic device.
  • the method further includes: the first electronic device receives a first input of the user on the fourth control; and modifying the value of the first data according to the first input.
  • That is, the developer defines, in the description file, the correspondence between the control attribute of the control and the background data in the operating system; the UI engine of the electronic device implements the function of modifying the background data according to user input. This avoids requiring the developer to describe in the description file how background data is modified according to user input, reducing the development difficulty.
  • The method further includes: the control attribute of the fourth control in the first UI of the first electronic device changes as the first data in the operating system of the first electronic device changes.
  • That is, the developer defines, in the description file, the correspondence between the control attribute of the control and the background data in the operating system; the UI engine of the electronic device makes the control attribute change as the background data in the operating system changes.
  • In this way, controls in the UI can change as parameters of the electronic device change, without the developer having to describe in the description file how the control attributes follow those parameter changes, reducing the development difficulty.
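The two-way binding described in the preceding paragraphs can be sketched as follows (the class and attribute names are illustrative; the patent does not specify an API):

```python
# Illustrative sketch: the description file declares a correspondence
# between a control attribute and a piece of backend (operating-system)
# data; the UI engine keeps the two in sync in both directions.

class Binding:
    """Keeps one control attribute and one backend datum in sync."""

    def __init__(self, backend, key, control, attr):
        self.backend, self.key = backend, key
        self.control, self.attr = control, attr
        # initial sync: the control reflects the backend value
        self.control[self.attr] = self.backend[self.key]

    def on_user_input(self, value):
        # user edits the control -> the engine writes the backend data
        self.control[self.attr] = value
        self.backend[self.key] = value

    def on_backend_change(self, value):
        # backend data changes (e.g. system volume) -> the control follows
        self.backend[self.key] = value
        self.control[self.attr] = value

backend = {"volume": 30}   # hypothetical OS-side datum
control = {}               # hypothetical slider control
b = Binding(backend, "volume", control, "progress")
b.on_user_input(55)        # the user drags the slider
b.on_backend_change(80)    # the volume changes elsewhere in the OS
```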
  • The present application provides a user interface implementation method, including: displaying a development interface of a first application, where the development interface of the first application includes a description file used for interface description and interface behavior definition of a first user interface (UI) of the first application; in response to a first operation input by the user, a first code corresponding to the device type of a first electronic device is added to the description file; in response to a second operation input by the user, a second code corresponding to the device type of a second electronic device is added to the description file; and an application installation package of the first application is generated according to the description file.
  • the device type of the first electronic device is different from the device type of the second electronic device.
  • the device type of the electronic device may include a mobile phone, a smart TV, a smart watch, a tablet computer, a notebook computer, a netbook, a large screen, a car computer, and the like.
  • In this way, one description file includes codes corresponding to different types of electronic devices. Different types of electronic devices can present different UI layouts by reading the same description file of the same UI; a single set of description files serves various types of electronic devices, reducing the development difficulty for developers.
  • the application installation package of the first application further includes a resource file, and the resource file includes resources used to generate the UI of the first application.
• the description file includes a third code defining that a first control has control properties customized by the operating system of the first electronic device, and a fourth code defining that a second control has control properties of a general-purpose operating system.
• the UI may therefore include controls with control properties customized by the operating system of the first electronic device as well as controls with control properties of a general-purpose operating system, providing controls in more forms.
  • control attributes customized by the operating system of the first electronic device include at least one of visual attributes, layout attributes, interaction attributes, dynamic effect attributes and software and hardware dependency attributes.
• the layout attributes include at least one of stretching, hiding, wrapping, dividing, proportioning, and extending.
• the present application provides a computer-readable storage medium, for example, an application development tool. The application development tool may include computer instructions which, when executed on the above-mentioned electronic device, cause the electronic device to execute the method of any one of the above-mentioned ninth aspect.
• the present application provides a computer-readable storage medium, including computer instructions for performing interface description and interface behavior definition on a first user interface (UI) of a first application; the computer instructions include a first code corresponding to the device type of a first electronic device and a second code corresponding to the device type of a second electronic device, where the device type of the first electronic device is different from the device type of the second electronic device.
  • the device type of the electronic device may include a mobile phone, a smart TV, a smart watch, a tablet computer, a notebook computer, a netbook, a large screen, a car computer, and the like.
  • the computer instructions further include resources used to generate the UI of the first application.
• the computer instructions further include a third code defining that a first control has control attributes customized by the operating system of the first electronic device, and a fourth code defining that a second control has control attributes of a general-purpose operating system.
• the present application provides an electronic device, comprising: a display screen, an input device, one or more processors, one or more memories, and one or more computer programs; the processor is coupled to the input device, the display screen, and the memory, and the one or more computer programs are stored in the memory; the processor can execute the one or more computer programs stored in the memory, so that the electronic device performs the method of any one of the above-mentioned eighth aspect.
• the present application provides an electronic device, comprising: a display screen, one or more processors, one or more memories, and one or more computer programs; the processor is coupled to both the display screen and the memory, and the one or more computer programs are stored in the memory; when the electronic device runs the above-mentioned first application, the processor can execute the one or more computer programs stored in the memory, so that the electronic device performs the method of any one of the above-mentioned ninth aspect.
• Embodiments of the present application provide a method and apparatus for implementing a user interface, which support displaying various layout modes and control types on the UI of an application widget, making the application widget easier to use and improving user experience.
  • the application adopts the following technical solutions:
• the present application provides a method for implementing a user interface, comprising: a first application process of an electronic device reads a component interface description file and generates first widget UI data according to the component interface description file, where the controls in the widget UI data are bound to backend data in the operating system of the electronic device; the component interface description file is used to perform interface description and interface behavior definition on a first UI of an application widget of the first application; the first application process then sends first data to the application widget process; the application widget process receives the first data, obtains the first widget UI data according to the first data, and displays the first UI of the application widget according to the first widget UI data.
  • both the application process and the application widget process generate widget UI data according to the component interface description file.
  • the application process binds the controls in the widget UI data to the background data, and the application widget process displays the widget UI data as the application widget UI.
  • the developer can define various types of controls in the component interface description file, so that the UI of the application widget supports various types of controls.
  • the application process can execute corresponding business logic according to the corresponding relationship between the controls in the UI data of the widget and the background data.
• the first data sent by the first application process to the application widget process may be the component interface description file; the application widget process receives the component interface description file, generates the first widget UI data according to the component interface description file, and displays the first UI of the application widget according to the first widget UI data.
• the first data sent by the first application process to the application widget process may be the first widget UI data; the application widget process receives the first widget UI data and displays the first UI of the application widget according to the first widget UI data.
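The two delivery modes above (sending the description file itself versus sending ready-made widget UI data) can be sketched as follows. The process boundary is simulated with plain function calls, and all names and message shapes are invented for illustration.

```python
import json

def generate_widget_ui_data(description_file):
    """Shared step: turn the component interface description into widget UI data."""
    desc = json.loads(description_file)
    return {"controls": desc["controls"], "layout": desc["layout"]}

def app_process_send(description_file, mode):
    # Mode 1: ship the description file; the widget process generates the UI data.
    if mode == "file":
        return {"kind": "file", "payload": description_file}
    # Mode 2: generate the UI data in the app process and ship it ready-made.
    return {"kind": "ui_data", "payload": generate_widget_ui_data(description_file)}

def widget_process_receive(first_data):
    if first_data["kind"] == "file":
        ui_data = generate_widget_ui_data(first_data["payload"])
    else:
        ui_data = first_data["payload"]
    return ui_data  # the widget process would now display this as the widget UI

desc = json.dumps({"controls": ["title", "playButton"], "layout": "row"})
ui_from_file = widget_process_receive(app_process_send(desc, "file"))
ui_ready_made = widget_process_receive(app_process_send(desc, "ui_data"))
print(ui_from_file == ui_ready_made)  # True: both paths yield the same widget UI
```

Either way the app process keeps the control-to-data bindings, so business logic stays on the application side while the widget process only renders.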
  • the method further includes: the first application process generates a first control in the UI data of the first widget according to the definition of the first code in the component interface description file, the first control It has control properties native to the operating system of the electronic device.
  • the native controls of the operating system include: input box, check box, sliding selector, scroll view, radio button, rating bar, search box, drag bar, or switch, etc.
  • the developer can define various control properties native to the operating system in the component interface description file, so that the UI of the application widget supports various controls native to the operating system.
  • the method further includes: the first application process generates a second control in the UI data of the first widget according to the definition of the second code in the component interface description file, the second control Have control properties customized in the operating system of the electronic device.
  • the custom control properties include at least one of visual properties, layout properties, interaction properties, dynamic effect properties, and software and hardware dependency properties.
• Layout properties include at least one of stretching, hiding, wrapping, dividing, proportioning, and extending.
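As a rough illustration of how such layout attributes might behave, the sketch below applies "hide" and "wrap" rules when controls are laid out at different widths. The attribute names follow the list above; the data structures and thresholds are invented and do not reflect the actual description-file syntax.

```python
def lay_out(controls, width):
    """Apply per-control layout attributes for the available width (in px)."""
    rows, row, used = [], [], 0
    for c in controls:
        if c.get("hide_below", 0) > width:
            continue                      # 'hide': drop the control on narrow screens
        w = c["width"]
        if c.get("wrap") and used + w > width:
            rows.append(row)              # 'wrap': break onto a new line
            row, used = [], 0
        row.append(c["name"])
        used += w
    if row:
        rows.append(row)
    return rows

controls = [
    {"name": "cover", "width": 100},
    {"name": "lyrics", "width": 200, "hide_below": 250},
    {"name": "playBar", "width": 150, "wrap": True},
]
print(lay_out(controls, 200))  # [['cover'], ['playBar']]
print(lay_out(controls, 500))  # [['cover', 'lyrics', 'playBar']]
```

Declaring these attributes once lets the same description adapt to both a narrow widget and a full-width one without per-size layouts.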
  • the developer can define various control properties customized in the operating system in the component interface description file, so that the UI of the application widget supports various customized controls in the operating system.
• the component interface description file includes a third code, which is used to define the correspondence between a control attribute of a third control in the first UI of the application widget and first data in the operating system of the electronic device.
  • the method further includes: the electronic device receives a first input of the user on the third control; and modifies the value of the first data according to the first input.
• the control property of the third control in the first UI of the application widget changes as the first data in the operating system of the electronic device changes.
• the method further includes: the electronic device downloads an application installation package of the first application from the server, where the application installation package includes the component interface description file; the electronic device uses the application installation package to install the first application.
  • the component interface description file is obtained from the application installation package.
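The round trip described above (the user's input on the third control modifies the first data, and the control property then follows the data) amounts to a two-way binding. The sketch below is illustrative only; all class and field names are invented.

```python
class Store:
    """Backend data with change notification."""
    def __init__(self, **values):
        self._values, self._subs = dict(values), []

    def subscribe(self, fn):
        self._subs.append(fn)

    def set(self, field, value):
        self._values[field] = value
        for fn in self._subs:
            fn(field, value)

    def get(self, field):
        return self._values[field]

class Toggle:
    """A stand-in for the third control: 'checked' mirrors the first data."""
    def __init__(self, store, field):
        self.store, self.field = store, field
        self.checked = store.get(field)
        store.subscribe(self._on_change)

    def _on_change(self, field, value):
        if field == self.field:
            self.checked = value          # control property follows the data

    def tap(self):                        # first input: the user taps the control
        self.store.set(self.field, not self.checked)  # modifies the first data

store = Store(wifi_on=False)
toggle = Toggle(store, "wifi_on")
toggle.tap()
print(store.get("wifi_on"), toggle.checked)  # True True
```

Because the control only ever mutates the store, any other control bound to the same field stays consistent automatically.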
• the present application provides a method for implementing a user interface, including: displaying a development interface of a first application; the development interface of the first application includes a component interface description file used to perform interface description and interface behavior definition on the first UI of the application widget of the first application.
• in response to a first operation input by the user, a first code defining a first control in the first widget UI is added to the component interface description file; the first control has control properties native to the operating system, and the native controls of the operating system include: input boxes, check boxes, sliding selectors, scroll views, radio buttons, rating bars, search boxes, drag bars, switches, and the like.
  • An application installation package of the first application is generated according to the component interface description file.
  • the developer can define in the component interface description file that the control has the native control properties of the operating system.
  • the UI of the application widget running on the electronic device includes various controls with the native control properties of the operating system.
  • the method further includes: in response to the second operation input by the user, adding a second code that defines the second control in the UI of the first widget to the component interface description file , the second control has custom control properties in the operating system;
  • the custom control properties include: at least one of visual properties, layout properties, interaction properties, dynamic properties and software and hardware dependency properties.
• the layout properties include at least one of stretching, hiding, wrapping, dividing, proportioning, and extending.
  • the developer can define in the component interface description file that the control has the control properties customized by the operating system.
  • the UI of the application widget running on the electronic device includes various controls with control properties customized by the operating system.
• the present application provides a computer-readable storage medium, for example, an application development tool. The application development tool may include computer instructions which, when executed on the above-mentioned electronic device, cause the electronic device to execute the corresponding method described above.
• the present application provides a computer-readable storage medium, comprising computer instructions for performing interface description and interface behavior definition on a first user interface (UI) of an application widget of a first application; the computer instructions include a first code for generating a first control in the first widget UI, where the first control has control properties native to the operating system; the native controls of the operating system include: input boxes, check boxes, sliding selectors, scroll views, radio buttons, rating bars, search boxes, drag bars, switches, and the like.
  • the computer instruction further includes a second code for generating a second control in the UI of the first widget, and the second control has a custom control attribute in the operating system;
• the custom control properties include at least one of visual properties, layout properties, interaction properties, animation properties, and software and hardware dependency properties.
• the layout properties include at least one of stretching, hiding, wrapping, dividing, proportioning, and extending.
  • the present application provides a computer-readable storage medium.
  • the computer-readable storage medium includes a computer program, which, when executed on the electronic device, causes the electronic device to perform the method described in any one of the fourteenth aspect above.
• the present application provides an electronic device, comprising: a display screen, an input device, one or more processors, one or more memories, and one or more computer programs; the processor is coupled to the input device, the display screen, and the memory, and the one or more computer programs are stored in the memory; when the electronic device runs, the processor can execute the one or more computer programs stored in the memory, so that the electronic device performs the method of any one of the above-mentioned fourteenth aspect or fifteenth aspect.
• Embodiments of the present application provide a method and apparatus for implementing a user interface, which support projecting various UIs on a control device to an IoT device for playback, thereby improving user experience.
  • the application adopts the following technical solutions:
• the present application provides a method for implementing a user interface, comprising: a first electronic device reads a first player interface description file of a first application and generates first player UI data according to the first player interface description file, where the controls in the first player UI data are bound to backend data in the operating system of the first electronic device; the first player interface description file is used to perform interface description and interface behavior definition on the first player UI of the first application played on a second electronic device; the first electronic device sends first data to the second electronic device; the second electronic device receives the first data, obtains the first player UI data according to the first data, and displays the first player UI according to the first player UI data.
  • both the control device and the player generate the player UI data according to the player interface description file.
  • the control device binds the controls in the UI data of the player to the background data, and the player displays the UI data of the player as the UI of the player.
  • developers can define various UIs in the player interface description file, making the player UI richer. It is also possible to define different UI layouts for players of different device types, so that the player UI matches the size and shape of the player screen.
  • the control device may execute corresponding business logic according to the corresponding relationship between the controls in the player UI data and the background data.
  • the first electronic device sends the first player interface description file to the second electronic device, and the second electronic device generates the first player UI data according to the first player interface description file, according to The first player UI data displays the first player UI.
• the first electronic device may instead send the first player UI data to the second electronic device; the second electronic device receives the first player UI data and displays the first player UI according to the first player UI data.
• the method further includes: the second electronic device receives a first operation of the user on the first player UI; in response to the first operation, the second electronic device sends a first instruction to the first electronic device; the first electronic device receives the first instruction, reads a second player interface description file, and generates second player UI data according to the second player interface description file, where the controls in the second player UI data are bound to backend data in the operating system of the first electronic device; the second player interface description file is used to perform interface description and interface behavior definition on the second player UI of the first application played on the second electronic device; the first electronic device sends the second player interface description file to the second electronic device; the second electronic device receives the second player interface description file, generates the second player UI data according to it, and displays the second player UI according to the second player UI data.
• the user can directly operate the player UI on the playback device; the control device executes the business logic corresponding to the operation and sends the updated player interface description file corresponding to the new player UI to the playback device, which generates the updated player UI from the updated player interface description file. In this way, the player UI can be operated directly on the playback device and switched successfully.
• the method further includes: the second electronic device receives a first operation of the user on the first player UI; in response to the first operation, the second electronic device sends a first instruction to the first electronic device; the first electronic device receives the first instruction and reads a second player interface description file, which is used to perform interface description and interface behavior definition on the second player UI of the first application played on the second electronic device; the first electronic device generates second player UI data according to the second player interface description file, where the controls in the second player UI data are bound to backend data in the operating system of the first electronic device; the first electronic device sends the second player UI data to the second electronic device; the second electronic device receives the second player UI data and displays the second player UI according to it.
• the user can directly operate the player UI on the playback device; the control device executes the business logic corresponding to the operation and sends the updated player UI data to the playback device, which displays the updated player UI. In this way, the player UI can be operated directly on the playback device and switched successfully.
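The projection round trip above can be sketched end to end: the playback device forwards the user's operation as an instruction, the control device runs the business logic, regenerates player UI data from the matching interface description file, and sends it back. All message shapes, UI names, and methods below are invented for illustration.

```python
# Player interface description files held on the control device
# (structure invented for illustration).
player_description_files = {
    "nowPlaying": {"controls": ["cover", "pauseButton"]},
    "playlist":   {"controls": ["trackList", "backButton"]},
}

class ControlDevice:
    def generate_ui_data(self, ui_name):
        desc = player_description_files[ui_name]
        return {"ui": ui_name, "controls": desc["controls"]}

    def handle_instruction(self, instruction):
        # Business logic: an 'openPlaylist' operation switches to the playlist UI.
        target = "playlist" if instruction == "openPlaylist" else "nowPlaying"
        return self.generate_ui_data(target)

class PlaybackDevice:
    def __init__(self, control):
        self.control = control
        self.shown = None

    def display(self, ui_data):
        self.shown = ui_data

    def on_user_operation(self, op):
        # Forward the operation as an instruction; display the UI data returned.
        self.display(self.control.handle_instruction(op))

control = ControlDevice()
player = PlaybackDevice(control)
player.display(control.generate_ui_data("nowPlaying"))
player.on_user_operation("openPlaylist")
print(player.shown["ui"])  # playlist
```

Keeping the bindings and business logic on the control device means the playback device can stay a thin renderer, which is what makes operating the UI directly on the playback device feasible.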
• the method further includes: the first electronic device downloads an application installation package of the first application from the server; the application installation package includes the first player interface description file and a resource file, where the resource file includes resources used to generate the player UI of the first application; the first electronic device uses the application installation package to install the first application.
• the method further includes: the first electronic device reads, in the first player interface description file, a first code corresponding to the device type of a third electronic device, and generates third player UI data with the resources of the resource file according to the definition of the first code; the first electronic device reads, in the first player interface description file, a second code corresponding to the device type of a fourth electronic device, and generates fourth player UI data with the resources of the resource file according to the definition of the second code; the device type of the fourth electronic device is different from the device type of the third electronic device.
• the first electronic device sends the first player interface description file and the resource file to the third electronic device and the fourth electronic device respectively; the third electronic device generates third player UI data with the resources of the resource file according to the definition of the first code, corresponding to its device type, in the first player interface description file, and displays the first player UI according to the third player UI data; the fourth electronic device generates fourth player UI data with the resources of the resource file according to the definition of the second code, corresponding to its device type, in the first player interface description file, and displays the first player UI according to the fourth player UI data.
• the method further includes: the first electronic device reads, in the first player interface description file, the first code corresponding to the device type of the third electronic device, and generates the third player UI data with the resources of the resource file according to the definition of the first code; the first electronic device reads, in the first player interface description file, the second code corresponding to the device type of the fourth electronic device, and generates the fourth player UI data with the resources of the resource file according to the definition of the second code; the device type of the fourth electronic device is different from the device type of the third electronic device; the first electronic device sends the third player UI data to the third electronic device, and the third electronic device displays the first player UI according to the third player UI data; the first electronic device sends the fourth player UI data to the fourth electronic device, and the fourth electronic device displays the first player UI according to the fourth player UI data.
• different types of playback devices present different player UI layouts according to the same player interface description file of the same UI. Developers thus develop a single set of player interface description files that suits various types of playback devices, which reduces development difficulty.
• the method further includes: the first electronic device generates a first control in the first player UI according to the definition of a third code in the first player interface description file; the first control has control properties customized by the operating system of the first electronic device, where these customized control properties include at least one of visual properties, layout properties, interaction properties, animation properties, and software and hardware dependency properties.
• Layout properties include at least one of stretching, hiding, wrapping, dividing, proportioning, and extending.
  • the method further includes: displaying, by the first electronic device, the UI of the first player according to the UI data of the first player.
• the control device and the playback device display the player UI synchronously, realizing mirrored projection and cooperative work between the control device and the playback device.
• the present application provides a method for implementing a user interface, including: displaying a development interface of a first application, where the development interface of the first application includes a player interface description file used to perform interface description and interface behavior definition on the player UI of the first application played on a terminal player; in response to a first input of the user, a first code corresponding to the device type of a first electronic device is added to the player interface description file; in response to a second input of the user, a second code corresponding to the device type of a second electronic device is added to the player interface description file, where the device type of the first electronic device is different from the device type of the second electronic device; and an application installation package of the first application is generated according to the player interface description file.
• a player interface description file includes codes corresponding to different types of playback devices, so different types of playback devices can present different player UI layouts by reading the same player interface description file of the same UI. Developers thus develop a single set of player interface description files that suits various types of playback devices, which reduces development difficulty.
  • the application installation package of the first application further includes a resource file, and the resource file includes resources used to generate the UI of the player of the first application.
• the player interface description file includes a third code defining that a first control in the first player UI has control attributes customized by the operating system of the first electronic device; the control attributes customized by the operating system of the first electronic device include at least one of visual properties, layout properties, interaction properties, animation properties, and software and hardware dependency properties.
• Layout properties include at least one of stretching, hiding, wrapping, dividing, proportioning, and extending.
• the present application provides a computer-readable storage medium, for example, an application development tool. The application development tool may include computer instructions which, when executed on the above-mentioned electronic device, cause the electronic device to execute the method of any one of the above-mentioned twenty-first aspect.
• the present application provides a computer-readable storage medium, including computer instructions for performing interface description and interface behavior definition on the first player UI of the first application; the computer instructions include a first code corresponding to the device type of a first electronic device and a second code corresponding to the device type of a second electronic device, where the device type of the first electronic device is different from the device type of the second electronic device.
  • the device type of the electronic device may include a mobile phone, a smart TV, a smart watch, a tablet computer, a notebook computer, a netbook, a large screen, a car computer, and the like.
• the computer instructions further include resources used to generate the player UI of the first application.
• the computer instructions further include a third code defining that a first control in the first player UI has control attributes customized by the operating system of the first electronic device; the control attributes customized by the operating system of the first electronic device include at least one of visual properties, layout properties, interaction properties, animation properties, and software and hardware dependency properties.
• Layout properties include at least one of stretching, hiding, wrapping, dividing, proportioning, and extending.
  • the present application provides a computer-readable storage medium.
• the computer-readable storage medium includes a computer program which, when executed on an electronic device, causes the electronic device to perform the method of any one of the above-mentioned twentieth aspect.
• the present application provides an electronic device, comprising: a display screen, an input device, one or more processors, one or more memories, and one or more computer programs; the processor is coupled to the input device, the display screen, and the memory, and the one or more computer programs are stored in the memory; when the electronic device runs, the processor can execute the one or more computer programs stored in the memory, so that the electronic device performs the method of any one of the above-mentioned twentieth aspect or twenty-first aspect.
• the electronic devices and computer-readable storage media provided in the above aspects are all used to perform the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods, which are not repeated here.
• FIG. 1 is a schematic diagram of a scenario of a method for implementing a user interface provided by an embodiment of the present application;
• FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application;
• FIG. 3 is a schematic diagram of a software architecture of an electronic device provided by an embodiment of the present application;
• FIG. 4 is a schematic diagram of a user interface implementation method;
• FIG. 5 is a schematic diagram of a user interface implementation method;
• FIG. 6 is a schematic diagram of a method for implementing a user interface provided by an embodiment of the present application;
• FIG. 7 is a schematic structural diagram of a method for implementing a user interface provided by an embodiment of the present application;
• FIG. 8 is a schematic diagram of a scenario example of a user interface implementation method provided by an embodiment of the present application;
• FIG. 9 is a schematic flowchart of a method for implementing a user interface provided by an embodiment of the present application;
• FIG. 10 is a schematic flowchart of a method for implementing a user interface provided by an embodiment of the present application;
• FIG. 11 is a schematic diagram of a scenario example of a user interface implementation method provided by an embodiment of the present application;
• FIG. 12 is a schematic diagram of a scenario example of a user interface implementation method provided by an embodiment of the present application;
• FIG. 13 is a schematic diagram of a scenario example of a user interface implementation method provided by an embodiment of the present application;
• FIG. 14 is a schematic diagram of a method for implementing a user interface provided by an embodiment of the present application;
  • 15 is a schematic diagram of a software architecture of an electronic device provided by an embodiment of the present application.
  • 16 is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
  • FIG. 18 is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
  • FIG. 19 is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 20A is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 20B is a schematic flowchart of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 20C is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 21 is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
  • FIG. 22 is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
  • FIG. 23 is a schematic diagram of a scenario of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 24A is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 24B is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 24C is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 24D is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 25 is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 26 is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
  • FIG. 27 is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
  • FIG. 28 is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 29A is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 29B is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 29C is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 29D is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
  • FIG. 30 is a schematic flowchart of a method for implementing a user interface interface provided by an embodiment of the present application.
  • FIG. 31 is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 32 is a schematic flowchart of a method for implementing a user interface interface provided by an embodiment of the present application.
  • FIG. 33 is a schematic diagram of a scenario of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 34 is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 35A is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 35B is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 36 is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 37A is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 37B is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 38A is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 38B is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 39A is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 39B is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 40A is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 40B is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 40C is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 40D is a schematic diagram of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 41A is a schematic flowchart of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 41B is a schematic flowchart of a method for implementing a user interface interface provided by an embodiment of the present application.
• FIG. 42A is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
• FIG. 42B is a schematic diagram of a scenario example of a user interface interface implementation method provided by an embodiment of the present application.
  • FIG. 43 is a schematic structural composition diagram of an electronic device provided by an embodiment of the present application.
  • references in this specification to "one embodiment” or “some embodiments” and the like mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
• appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," "in some other embodiments," etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments," unless specifically emphasized otherwise.
• the terms "including", "having" and their variants mean "including but not limited to" unless specifically emphasized otherwise.
  • the term “connected” includes both direct and indirect connections unless otherwise specified.
• an application development tool (e.g., Android Studio, DevEco Studio, etc.)
  • developers use an interface description language to develop UIs in application development tools to form interface description files.
  • the electronic device 200 may also be referred to as a developer device in this application.
  • Interface description files can also be called description files.
  • Interface description refers to the use of an interface description language to describe the layout of the UI, the controls used, and the visual style of the layout and controls.
  • Interface behavior definition refers to using interface description language to define interface behavior; interface behavior includes dynamic changes of UI and responses of electronic devices to dynamic changes of UI (for example, responses of users to UI operations).
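As a purely hypothetical illustration of the two parts just described (all tag and attribute names below are invented and do not reflect the actual format of any OS platform), a small xml interface description can declare, in one file, the layout, a control's visual attributes, and a behavior binding; the sketch below walks such a description the way a UI engine might, using the JDK's built-in XML parser:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class InterfaceDescriptionDemo {
    // Hypothetical interface description: a layout, a control with a visual
    // attribute (color), and a behavior binding (onClick). All names invented.
    static final String DESCRIPTION =
        "<Layout orientation=\"vertical\">"
        + "<Button id=\"play\" color=\"#FF0000\" onClick=\"startPlayback\"/>"
        + "<Text id=\"title\" size=\"16sp\"/>"
        + "</Layout>";

    // Produces one entry per control: its identity plus any behavior binding.
    static List<String> parse(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            List<String> controls = new ArrayList<>();
            NodeList children = doc.getDocumentElement().getChildNodes();
            for (int i = 0; i < children.getLength(); i++) {
                if (children.item(i) instanceof Element) {
                    Element e = (Element) children.item(i);
                    String entry = e.getTagName() + " id=" + e.getAttribute("id");
                    if (e.hasAttribute("onClick")) {
                        entry += " behavior=" + e.getAttribute("onClick");
                    }
                    controls.add(entry);
                }
            }
            return controls;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        parse(DESCRIPTION).forEach(System.out::println);
    }
}
```

A real UI engine would build a control object from the visual attributes and register an event handler for the behavior binding, rather than merely listing them.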
• Each OS platform has its corresponding interface description language; for example, one platform may use the xml (extensible markup language) format, while another uses the embedded domain-specific language (EDSL) built into Swift for interface description and interface behavior definition.
  • the developer packages the interface description file into the installation package of the App, and publishes the App in the application market provided by the server 300 .
  • the installation package of each App can be provided in the application market for users to download.
• the installation package can be an Android application package (APK) file.
  • a user can use the mobile phone to download an installation package of an App in the application market.
  • the video App can be installed in the mobile phone by running the installation package.
  • the mobile phone also obtains the interface description file in the installation package.
  • the mobile phone can build the UI according to the interface description file.
  • the UI engine provided by the OS platform of the mobile phone interprets and executes the interface description language, and renders the UI for presentation to the user.
  • the constructed UI is presented on the display device (such as the display screen) of the mobile phone.
  • the OS platform of the mobile phone also executes a programming language that implements interface behavior, implements dynamic changes to the UI, and responds to user operations on the UI.
  • the developer uses the interface description language supported by the OS platform to develop the UI of the video App on the electronic device 200, and publish the video App.
  • the user installs the video App on the mobile phone using the installation package of the "Video” App, and the "Video” icon 101 is generated on the mobile phone desktop.
  • the user can click the "video” icon 101 to open the video app.
  • the mobile phone runs the video App.
  • An OS platform is installed on the mobile phone, the OS platform reads the interface description file, parses and executes the interface description language, renders the UI of the video app according to the interface description in the interface description file, and presents the UI 102 of the video app on the display screen.
  • the interface description file may also include the definition of the interface behavior.
  • the mobile phone can execute corresponding interface actions according to the interface behaviors defined in the interface description file to realize the interface behaviors.
• the OS platform also has a corresponding programming language for implementing interface behaviors, implementing dynamic changes of the UI 102 and responding to user operations on the UI 102; for example, interface behavior can be implemented using the Java or Swift programming language.
  • the developer can directly develop the UI of the App on the electronic device 100 and run the App on the electronic device 100; that is, the electronic device 200 and the electronic device 100 can be the same electronic device. This embodiment of the present application does not limit this.
• the above-mentioned electronic device 100 may include a portable computer (such as a mobile phone), a handheld computer, a tablet computer, a notebook computer, a netbook, a personal computer (PC), a smart home device (such as a smart TV, a smart screen, a large screen, a smart speaker, etc.), a personal digital assistant (PDA), a wearable device (such as a smart watch, a smart bracelet, etc.), an augmented reality (AR)/virtual reality (VR) device, a vehicle-mounted computer, and the like; the embodiments of the present application do not impose any restrictions on this.
• Exemplary embodiments of the electronic device 100 include, but are not limited to, portable electronic devices running various operating systems. It can be understood that, in some other embodiments, the above-mentioned electronic device 100 may not be a portable electronic device but a desktop computer.
  • FIG. 2 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, an audio module 130, a speaker 130A, a microphone 130B, a display screen 140, a wireless communication module 150, a power module 160, and the like.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or less components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • Processor 110 may include one or more processing units.
• the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • electronic device 100 may also include one or more processors 110 .
  • the controller is the nerve center and command center of the electronic device 100 .
  • the operation control signal can be generated according to the instruction operation code and the timing signal to complete the control of fetching and executing the instruction.
  • An operating system of the electronic device 100 may be run on the application processor to manage hardware and software resources of the electronic device 100 . For example, managing and configuring memory, prioritizing the supply and demand of system resources, controlling input and output devices, operating networks, managing file systems, managing drivers, etc.
  • the operating system can also be used to provide an operating interface for the user to interact with the system.
  • various types of software can be installed in the operating system, for example, a driver program, an application program (application, App), and the like.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
• the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, and/or a USB interface, etc.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
• the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, files such as music and videos are saved in the external memory card.
  • Internal memory 121 may be used to store one or more computer programs including instructions.
  • the processor 110 may execute the above-mentioned instructions stored in the internal memory 121, thereby causing the electronic device 100 to execute the user interface interface implementation methods provided in some embodiments of the present application, as well as various applications and data processing.
  • the internal memory 121 may include a code storage area and a data storage area.
  • the data storage area may store data and the like created during the use of the electronic device 100 .
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage components, flash memory components, universal flash storage (UFS), and the like.
  • the processor 110 may execute the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor 110 to cause the electronic device 100 to execute the instructions provided in the embodiments of the present application User interface interface implementation methods, and other applications and data processing.
• the electronic device 100 may implement audio functions, such as music playback and recording, through the audio module 130, the speaker 130A, the microphone 130B, the application processor, and the like.
  • the audio module 130 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 130 may also be used to encode and decode audio signals.
  • the audio module 130 may be provided in the processor 110 , or some functional modules of the audio module 130 may be provided in the processor 110 .
• The speaker 130A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
• The microphone 130B, also called a "mic", is used to convert sound signals into electrical signals.
• The user can input a sound signal into the microphone 130B by speaking close to it.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1 , the antenna 2 , the wireless communication module 150 and the like.
  • the wireless communication module 150 may provide wireless communication solutions including Wi-Fi, Bluetooth (BT), and wireless data transmission modules (eg, 433MHz, 868MHz, 915MHz) applied on the electronic device 100 .
  • the wireless communication module 150 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 150 receives electromagnetic waves via the antenna 1 or the antenna 2 , filters and frequency modulates the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 150 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and then convert it into electromagnetic waves and radiate it out through the antenna 1 or the antenna 2 .
  • the electronic device 100 implements a display function through a GPU, a display screen 140, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 140 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 140 is used to display images, videos, and the like.
  • the display screen 140 includes a display panel.
• the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 100 may include 1 or N display screens 140 , where N is a positive integer greater than 1.
  • the display screen 140 may be used to display the UI and receive user operations on the UI.
  • the display screen 140 is provided with a pressure sensor 170A, a touch sensor 170B, and the like.
  • the pressure sensor 170A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 170A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 170A.
• the touch sensor 170B, also called a "touch panel", can form a touch screen together with the display screen 140, which is also called a "touchscreen".
  • the touch sensor 170B is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may also be provided through display screen 140 .
  • the power module 160 can be used to supply power to various components included in the electronic device 100 .
  • the power module 160 may be a battery, such as a rechargeable battery.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
• the embodiments of the present invention exemplarily describe the software structure of the electronic device 100 by taking a system with a layered architecture as an example.
  • FIG. 3 is a block diagram of a software structure of an electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
• the software system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime and system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include camera, gallery, calendar, call, map, negative screen, WLAN, desktop, music, video, short message, and other applications.
  • the application framework layer includes an OS, which provides an application programming interface (API) and a programming framework for applications in the application layer.
• the application framework layer includes some predefined functions. For example, it may acquire the size of the display screen, determine whether there is a status bar, lock the screen, and take screenshots; provide data accessed by applications; and provide various resources for applications, such as localized strings, icons, pictures, interface description files, and video files.
  • the view system of the OS includes visual controls, such as controls for displaying text, controls for displaying pictures, and the like. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
• the OS can also enable an application to display notification information in the status bar. The status bar can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. Notifications can also appear in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications for apps running in the background, or appear on the screen as a dialog window. For example, text information is prompted in the status bar, a prompt tone is issued, the electronic device vibrates, or the indicator light flashes.
  • the Android runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
• the core library consists of two parts: one part comprises the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
• In one approach, the OEM OS released by the manufacturer supports the interface description language of the general-purpose OS platform (such as xml), and can provide both basic UI programming capabilities and manufacturer-defined UI programming capabilities.
• In this approach, the UI development language of the App is the interface description language (such as xml) applicable to the general-purpose OS platform, which is used to declare the basic UI programming capabilities that the platform provides.
  • the manufacturer-defined field is added to the interface description language (such as xml) of the user interface, which is used to declare the UI programming ability customized by the manufacturer.
• The OEM OS platform interprets and executes the interface description language based on the basic UI engine provided by the general-purpose OS platform; the basic UI engine adds the interpretation and execution of the manufacturer-defined fields.
• In this way, the OEM OS supports the basic UI programming capabilities of the general-purpose OS platform and additionally provides the manufacturer's custom UI programming capabilities. This method reuses the platform's native development interface, interface description language, and UI engine, and provides manufacturers with custom UI programming capabilities by adding custom fields to the native interface description language and UI engine.
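The custom-field approach can be sketched roughly as follows; the `oem:` prefix is an invented stand-in for whatever namespace a manufacturer might reserve, not an actual field name:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CustomFieldSplitter {
    // Separates manufacturer-defined fields (here marked with an invented
    // "oem:" prefix) from the native fields of a control node, so that the
    // native fields go to the basic UI engine and the custom fields go to
    // the manufacturer's extended handling.
    static Map<String, String> split(Map<String, String> attrs) {
        Map<String, String> custom = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : attrs.entrySet()) {
            if (e.getKey().startsWith("oem:")) {
                custom.put(e.getKey(), e.getValue());
            }
        }
        return custom;
    }

    public static void main(String[] args) {
        Map<String, String> attrs = new LinkedHashMap<>();
        attrs.put("width", "match_parent");      // native field: basic UI engine
        attrs.put("oem:clickEffect", "rebound"); // custom field: extended handling
        System.out.println(split(attrs));
    }
}
```

The point of the design is that a stock parser can simply ignore unknown prefixed fields, so description files remain loadable on platforms without the extension.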
• In another approach, the OEM OS released by the manufacturer provides manufacturer-defined UI programming capabilities independently of the general-purpose OS platform.
  • the UI development language of the App is a manufacturer-defined interface description language.
  • the OEM OS platform provides a custom UI engine, which parses and executes the custom interface description language, and provides manufacturer-defined UI programming capabilities.
  • the manufacturer customizes a complete set of UI programming framework independent of the general OS platform, which can meet the demands of developers to develop and run apps across platforms.
  • the embodiments of the present application provide a method and device for implementing a user interface interface, which can provide rich UI programming capabilities, can adapt to various OS platforms, and has low technical implementation difficulty and is convenient for developers to use.
  • the OEM OS platform provided by the embodiment of the present application supports a basic interface description language and a custom interface description language.
• the basic interface description language is the interface description language supported by the general OS platform, for example, xml, Swift, etc.
  • the custom interface description language is a domain specific language (domain specific language, DSL), and the custom interface description language is not related to the general OS platform.
  • the custom interface description language is called DSL. Developers can use the basic interface description language and DSL to jointly develop the UI of the app.
  • custom UI programming capabilities can include layout capabilities, visual attribute capabilities, unified interaction capabilities, and dynamic effects capabilities.
  • Layout capabilities are used to describe the layout of controls in the UI; such as the shape, position, size, etc. of controls.
  • the visual property capability is used to describe the visual properties of the control; for example, the color, grayscale and other visual effects of the control.
  • Unified Interaction capabilities are used to provide control responses based on user actions; such as performing searches based on the user's "confirmation" action.
  • the animation capability is used to display animation effects on the control; for example, displaying the click rebound animation effect on the control.
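As a rough sketch (all attribute names invented), the four capability classes above can be viewed as independent passes over a control's declared attributes, each contributing one facet of the final control:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class CapabilityPasses {
    // Each custom UI programming capability inspects the attributes declared
    // on a control and contributes one part of the control's final state:
    // layout, visual properties, unified interaction, and animation effects.
    static List<String> apply(Map<String, String> attrs) {
        List<String> applied = new ArrayList<>();
        if (attrs.containsKey("layout"))    applied.add("layout=" + attrs.get("layout"));
        if (attrs.containsKey("color"))     applied.add("visual=" + attrs.get("color"));
        if (attrs.containsKey("onConfirm")) applied.add("interaction=" + attrs.get("onConfirm"));
        if (attrs.containsKey("animation")) applied.add("animation=" + attrs.get("animation"));
        return applied;
    }

    public static void main(String[] args) {
        Map<String, String> attrs = Map.of(
            "layout", "center", "color", "#00FF00",
            "onConfirm", "search", "animation", "clickRebound");
        System.out.println(apply(attrs));
    }
}
```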
  • the OEM OS provided by the embodiments of the present application can realize not only the basic UI programming capability provided by the general OS platform, but also the custom UI programming capability extended relative to the OS platform.
  • the OEM OS platform includes a basic UI engine and an extended UI engine.
• the basic UI engine is used to interpret and execute the basic interface description language to generate the basic UI (with basic UI programming capabilities); the extended UI engine is used to interpret and execute the DSL, superimposing custom UI programming capabilities on the basic UI.
  • the custom interface description language and the extended UI engine only need to cover the custom UI programming capability, so the release difficulty of the manufacturer is low, and the expansion is easy; and the developer access threshold is low.
• the custom interface description language and extended UI engine are not tied to the general OS platform; the underlying platform may be any of various general-purpose OS platforms.
  • the custom interface description language and extended UI engine can be easily applied to a variety of common OS platforms.
  • the UI engine of the OEM OS on the electronic device parses and executes the interface description language (basic interface description language and custom interface description language) to generate UI.
  • the basic UI engine is used to interpret and execute the basic interface description language
  • the extended UI engine provided by the OEM OS in the embodiment of the present application is used to parse and execute the custom interface description language.
  • the extended UI engine 310 includes modules such as flow control 311, DSL file loading 312, parsing engine 313, and execution engine 314.
  • the flow control 311 is used to control the execution flow of each module in the extended UI engine 310, and the interaction flow between the extended UI engine 310 and other modules in the OEM OS, and the like.
  • DSL file loading 312 is used to read DSL files.
  • the parsing engine 313 includes sub-modules such as DSL syntax verification and DSL parsing.
  • the DSL syntax check submodule is used to perform syntax check on the content in the DSL file.
  • the DSL parsing submodule is used to parse the DSL file and convert the content in the DSL file into a data format that matches the execution engine.
  • the parsing engine 313 may also include sub-modules such as DSL preprocessing.
  • the DSL preprocessing submodule is used to precompile DSL files, etc.
  • the execution engine 314 includes sub-modules such as version management, control construction, event proxy, interpretation execution engine, and semantic support library.
  • the version management sub-module is used to match the version of the extended UI engine 310 and the DSL file in the App.
  • the version of the extended UI engine 310 must be the same as, or newer than, the version of the DSL file for the App to run normally.
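The version rule above can be sketched as a small compatibility check, assuming the three-part x.y.z numbering described later in this document (the helper names and comparison logic are illustrative, not part of the OEM OS API):

```python
def parse_version(v):
    # Split an "x.y.z" version string into a tuple of integers,
    # e.g. "101.1.003" -> (101, 1, 3), so versions compare numerically.
    return tuple(int(part) for part in v.split("."))

def engine_can_run(engine_version, dsl_version):
    # The extended UI engine can run a DSL file only if its version is
    # the same as, or newer than, the version of the DSL file.
    return parse_version(engine_version) >= parse_version(dsl_version)

print(engine_can_run("101.1.003", "101.1.003"))  # True: same version
print(engine_can_run("101.2.001", "101.1.003"))  # True: engine is newer
print(engine_can_run("101.1.002", "101.1.003"))  # False: engine is older
```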
  • the control building submodule is used to build the UI controls based on the content of the DSL file.
  • the event proxy submodule is used to implement the mapping between device events and user behavior. For example, a mouse double-click event and a finger click event on the display screen can all be mapped to the user's "confirmation" behavior on the electronic device.
  • the interpretation execution engine is used to interpret and execute the code in the DSL file, and in response to the user behavior, execute actions corresponding to the user behavior defined in the DSL file.
  • the semantic support library includes a set of syntax and semantic specifications for all fields in the DSL file, such as environment variable interfaces, public fields, layout template attributes, visual attributes, layout attributes, interaction attributes, animation attributes and other field definitions and syntaxes.
  • the OEM OS further includes a custom UI programming capability 320.
  • the custom UI programming capability 320 includes a DSL adaptation layer for providing an adaptation interface for the custom UI programming capability for the extended UI engine 310 .
  • the custom UI programming capability 320 also provides the realization of custom UI programming capabilities such as visual attribute capability, layout capability, unified interaction capability, and dynamic effect capability.
  • for example, if the DSL file declares that a control enables the vertical stretching capability, the realization of that capability is completed by the custom UI programming capability 320; that is, when the display window of the control changes, the custom UI programming capability 320 stretches the control vertically, and the developer does not need to implement the vertical stretching in the DSL.
  • the grammar rules and development tools of the basic interface description language can follow conventional techniques.
  • the embodiments of the present application also provide grammar rules and development tools for the DSL.
  • an embodiment of the present application provides a development tool that supports the grammar rules of the basic interface description language and the DSL, and provides an editing and compiling environment for the basic interface description language and the DSL.
  • the embodiments of the present application provide a development tool, and a development interface of the development tool includes a basic interface description language file and a DSL file.
  • the developer opens the development interface of the development tool, and the development interface includes the initial version of the basic interface description language file and the initial version of the DSL file.
  • the developer can use the basic interface description language to add control descriptions in the initial version of the basic interface description language file, and can also use DSL to add control descriptions in the initial version of the DSL file.
  • the initial version of the DSL file may be preset in the development tool, or may be added by the developer in the development tool.
  • the development tool may further include DSL templates, DSL syntax rules description files, interface description examples, and the like.
  • the basic interface description language file is used to describe native controls, and basic UI programming capabilities are applied to the native controls.
  • DSL files are used to declare custom UI programming capabilities for controls. For example, you can apply custom UI programming capabilities to native controls in a DSL file; for another example, you can also declare custom controls in a DSL file to apply custom UI programming capabilities to custom controls.
  • the basic interface description language file and the DSL file are respectively set in different paths of the development tool folder.
  • the basic interface description language is carried by one or more files in xml format
  • the DSL is carried by one or more files in json format.
  • taking Android as the general-purpose OS platform as an example (Figure 8), the developer creates an App folder app in the development tool.
  • An AndroidManifest.xml file is integrated in the res directory of the folder app. Developers can declare the basic UI programming capabilities used in the xml file.
  • a huawei_dsl.json file is integrated in the assets directory of the folder app. Developers can declare the custom UI programming capabilities used in the DSL file in json format.
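Putting the two declarations above together, the project layout described here might be sketched as follows (directory names per this example; the basic interface description language file in the res directory and the DSL file in the assets directory):

```
app/
├── res/
│   └── AndroidManifest.xml   # basic interface description language (xml), basic UI programming capabilities
└── assets/
    └── huawei_dsl.json       # custom interface description language (json), custom UI programming capabilities
```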
  • the above-mentioned setting of the basic interface description language file and the DSL file in different paths of the development tool folder is only for the UI engine of the OEM OS to distinguish the basic interface description language file and the DSL file.
  • other methods may also be used to distinguish the basic interface description language file and the DSL file.
  • the UI engine of the OEM OS can obtain the basic interface description language file and the DSL file respectively according to the preset tags. This embodiment of the present application does not limit this.
  • the developer completes the App development in the development tool, and generates the App installation package after compiling.
  • the basic interface description language file and the DSL file are integrated into the App installation package, so that the UI engine of the OEM OS can read the basic interface description language file and the DSL file.
  • the storage location of the basic interface description language file and the DSL file in the App installation package is consistent with the location in the App folder in the development tool.
  • the DSL file uses the standard json format; for example, the DSL file includes content such as version, app, and layout.
  • version indicates the version number of the DSL file; exemplarily, the format of version is x.y.z, where x indicates the product, y indicates the subsystem of the product, and z indicates the number of times of development; for example, it may be 101.1.003.
  • the app content block is used to declare custom UI programming capabilities that act on the global controls of the App in the App installation package where the DSL file is located.
  • the app content block format is:
  • feature_name is an attribute of custom UI programming capability.
  • value is the attribute value of the custom UI programming capability.
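The literal format block is not reproduced above; following the feature_name/value description, a minimal sketch of the app content block might be (feature_name stands for a capability attribute, value for its attribute value):

```json
{
  "app": {
    "feature_name": "value"
  }
}
```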
  • the layout content block is used to declare custom UI programming capabilities that apply to controls in a layout.
  • the layout content block format is:
  • the layoutId is used to indicate a layout; for example, the layoutId is an identifier of a layout.
  • widgetId is used to indicate a widget in the layout; for example, widgetId is the widget ID.
  • prop_name is an attribute of custom UI programming capability, representing a feature of custom UI programming capability; for example, custom UI programming capability enablement, custom UI programming capability priority, custom UI programming capability parameters, etc.
  • value is the attribute value of the custom UI programming capability, used to specify the value of the attribute; for example, if the attribute is the enablement of the custom UI programming capability, the attribute value true indicates that the capability is enabled, and the attribute value false indicates that it is not enabled.
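The literal format block is not reproduced above; a minimal sketch of the layout content block, following the layoutId/widgetId/prop_name description (the exact nesting is illustrative):

```json
{
  "layout": [
    {
      "layoutId": "R.layout.mainpage",
      "widgets": [
        {
          "widgetId": "R.id.edit_text",
          "prop_name": "value"
        }
      ]
    }
  ]
}
```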
  • the version number is 101.1.003.
  • the attribute value of zoom (zoom) of the custom UI programming capability is enabled, that is, the global control of the App enables the zoom (zoom) capability.
  • the control named R.id.edit_text in the layout named R.layout.mainpage in the App enables the onSearch (search) capability, and the attribute value of onSearch is com.app.Search$onSearchPrice (that is, the specific execution action of the search function is in com.app.Search$onSearchPrice).
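The example listing itself is not reproduced above; a DSL file consistent with this description might look like the following sketch (version, zoom, and onSearch values are those given in the example; the nesting of the layout block is illustrative):

```json
{
  "version": "101.1.003",
  "app": {
    "zoom": true
  },
  "layout": [
    {
      "layoutId": "R.layout.mainpage",
      "widgets": [
        {
          "widgetId": "R.id.edit_text",
          "onSearch": "com.app.Search$onSearchPrice"
        }
      ]
    }
  ]
}
```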
  • a DSL file can include version and layout content blocks, but not app content blocks.
  • the layout content block may include description fields of multiple controls.
  • multiple custom UI programming capabilities can be enabled for a control. This embodiment of the present application does not limit this.
  • the custom UI programming capability of the embodiment of the present application may include visual attribute capability, layout capability, unified interaction capability, dynamic effect capability, and the like. Developers can declare custom UI programming capabilities in the DSL file to use the custom UI programming capabilities provided by the OEM OS.
  • the visual properties of the UI are embodied as the visual properties of the controls.
  • OEM OS defines a set of visual parameter variables for the control to describe the visual properties of the control; this set of visual parameter variables can be used to switch the visual properties of multiple brands or devices.
  • when developers describe the visual attributes of the UI, they can use visual parameter variables (whose attribute values, matching the brand or the electronic device itself, are obtained dynamically when the electronic device runs), without specifying specific variable values. Declaring a visual parameter variable in the DSL file gives the control the corresponding visual attribute capability.
  • the property value of the visual property textColor of the control R.id.textview is emuiColor1; the property value of the visual property foreground of the control R.id.image is emui_color_bg.
  • emuiColor1 and emui_color_bg are visual parameter variables, which are mapped to different color values on different brands or devices.
  • the mapping relationship between visual parameter variables and color values on different brands or devices is preset in the OEM OS, which avoids the repeated effort of developers to specify textColor and foreground attribute values on different brands or devices.
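For example, the textColor and foreground declarations described above might be sketched as widget entries in the layout content block (the nesting is illustrative):

```json
[
  { "widgetId": "R.id.textview", "textColor": "emuiColor1" },
  { "widgetId": "R.id.image", "foreground": "emui_color_bg" }
]
```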
  • the OEM OS provides adaptive layout capabilities to build a responsive UI, so that the layout of the UI can be adapted to displays of different sizes and shapes; it prevents developers from performing different layout work for different devices.
  • the adaptive layout capabilities include layout capabilities such as automatic stretching, hiding, wrapping, dividing, proportioning, and extending.
  • the adaptive layout capabilities provided by the OEM OS apply to the LinearLayout layout and the controls within the layout.
  • the master switch of adaptive layout capability is used to turn on or off the adaptive layout capability of the control. In an example, only when the master switch of the adaptive layout capability is turned on can any one of the layout capabilities be enabled.
  • “capability” is used to indicate the custom UI programming capability.
  • “Properties” represent characteristic parameters of custom UI programming capabilities.
  • "Attribute belongs to" indicates the functional classification of the attribute; for example, if the attribute belongs to the layout, the attribute is used for the layout of the control; if the attribute belongs to a child element, the attribute is used to describe the control.
  • automatic stretching capability: if the control enables the automatic stretching capability, the control can be automatically stretched in the UI according to the window size to fit the window.
  • the vertical stretch ability of the control is enabled in the R.layout.linearlayout_vertical layout.
  • the controls can be automatically stretched vertically to fit the size of the display window.
  • hiding capability: if the control enables the hiding capability, it can be hidden in the UI.
  • control R.id.container in the R.layout.mainpage layout enables the ability to hide vertically.
  • the priority of vertical hiding of R.id.image1 in R.id.container is 2, and the priority of vertical hiding of R.id.image2 is 1.
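As a sketch, the vertical-hiding declaration above might be written as follows; the field names hideVertical and hidePriority are hypothetical, since the literal field names are not given here:

```json
{
  "layoutId": "R.layout.mainpage",
  "widgets": [
    { "widgetId": "R.id.image1", "hideVertical": true, "hidePriority": 2 },
    { "widgetId": "R.id.image2", "hideVertical": true, "hidePriority": 1 }
  ]
}
```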
  • if a control enables the line wrapping capability, the control can be folded into multiple lines for display in the UI.
  • the wrap width limit value can be used to specify the maximum width of the control displayed on each line.
  • the controls in the R.layout.mainpage layout enable wrapping.
  • the wrap width limit of R.id.image1 is 160dp, and the wrap width limit of R.id.image2 is 160dp; that is, the maximum width at which R.id.image1 is displayed on each line is 160dp, and the maximum width at which R.id.image2 is displayed on each line is 160dp.
  • if a control enables the equal-division capability, it can be displayed equally divided in the UI.
  • the controls in the R.layout.mainpage layout enable split capability.
  • the R.id.image1 equalization type is spread.
  • if a control enables the proportioning capability, the control occupies the specified percentage of the total size of the layout in the specified direction.
  • the controls in the R.layout.mainpage layout enable vertical scaling.
  • the vertical proportion of R.id.image1 is 33.33%.
  • if a control enables the extension capability, it can be extended and displayed on the UI according to the size of the display screen.
  • the exposure value is used to specify the exposed portion, on the UI, of the last control that can be displayed on the UI.
  • the controls in the R.layout.mainpage layout enable extension capability.
  • R.id.image1 enables the ability to expose features, and the exposure value is 40dp.
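A sketch of the extension declaration above, with hypothetical field names extend and exposure (the literal field names are not given here):

```json
{
  "layoutId": "R.layout.mainpage",
  "widgets": [
    { "widgetId": "R.id.image1", "extend": true, "exposure": "40dp" }
  ]
}
```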
  • OEM OS also provides unified interaction capabilities, allowing developers to define the response of controls based on behavior.
  • unified interaction capabilities include search, zoom, and the like. Developers can declare unified interaction capabilities in the DSL file to enable controls to have search, zoom and other capabilities.
  • the developer defines the behavior corresponding to the event. For example, define the mouse double-click event to correspond to the "confirm” behavior, define the finger click event on the display screen to correspond to the "confirm” behavior, and define the corresponding relationship between other events and the "confirm” behavior.
  • the workload of developers is large.
  • the OEM OS provided by the embodiments of this application allows developers to directly define the response to the "confirm" behavior (that is, define the unified interaction capability corresponding to the behavior) without defining the events corresponding to the "confirm" behavior; the mapping between events and behaviors is done by the OEM OS.
  • the OEM OS maps events triggered on different forms of electronic devices to the same behavior (for example, mapping the mouse double-click event on a PC to the "confirm" behavior, and mapping the finger tap event on a mobile phone to the "confirm" behavior), which avoids the duplicated work of developers defining the correspondence between events and behaviors for each form of electronic device.
  • control R.id.sample_text has onSearch capability.
  • when the electronic device receives the user's "confirm" behavior on the control R.id.sample_text (for example, a mouse double-click on R.id.sample_text, or a finger tap on R.id.sample_text), it executes the search function defined in com.sample.SearchImplSample$onSearchSample.
  • control R.id.sample_text has onZoom (zoom) capability.
  • when the electronic device receives the user's "confirm" behavior on the control R.id.sample_text (for example, a mouse double-click on R.id.sample_text, or a finger tap on R.id.sample_text), it executes the zoom function defined in com.sample.ZoomImplSample$onZoomSample.
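Following the onSearch example format shown earlier (the capability field maps to the class and method implementing the action), the two declarations above might be sketched as a single widget entry:

```json
{
  "widgetId": "R.id.sample_text",
  "onSearch": "com.sample.SearchImplSample$onSearchSample",
  "onZoom": "com.sample.ZoomImplSample$onZoomSample"
}
```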
  • OEM OS also provides enhanced animation capabilities to make the animation of controls more expressive.
  • the animation capabilities provided by the OEM OS are applicable to Button and its subclasses, and can be enabled globally for the App or for individual controls.
  • the animation capability includes a click-rebound micro-motion of the Button control (field definition: reboundAnimation).
  • declaring reboundAnimation in the app content block enables the click-rebound micro-motion effect for all controls in the App.
  • declaring reboundAnimation in the layout content block enables the click-rebound micro-motion effect for the target control.
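A sketch of the two ways of enabling the effect (the nesting is illustrative, and R.id.button1 is a hypothetical control):

```json
{
  "app": { "reboundAnimation": true },
  "layout": [
    {
      "layoutId": "R.layout.mainpage",
      "widgets": [
        { "widgetId": "R.id.button1", "reboundAnimation": true }
      ]
    }
  ]
}
```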
  • FIG. 9 shows a schematic flowchart of a UI generated when an App is running.
  • the process control module of the extended UI engine reads the basic interface description language file, and invokes the basic UI engine to parse and execute the basic interface description language to construct a basic UI.
  • the controls of the basic UI use the basic UI programming capabilities.
  • the process control module of the extended UI engine calls the DSL file loading module to read and load the DSL file.
  • the parsing engine performs syntax verification, parsing, and preprocessing on the content in the DSL file, and obtains a data format matching the execution engine.
  • the DSL syntax verification submodule performs syntax verification on the content in the DSL file, and if the verification passes, the DSL parsing submodule parses the fields in the DSL file. Further, the DSL preprocessing submodule preprocesses the DSL file to obtain a data format matching the execution engine.
  • the execution engine builds an enhanced UI in units of controls based on the basic UI constructed in S401 and according to the content of the DSL file.
  • control building sub-module sequentially obtains the semantic processing components corresponding to the fields in the DSL file from the semantic support library. For example, get the semantic processing component SearchHandler of the "onSearch" field from the semantic support library. Further, the control building sub-module applies the custom UI programming capability to the control through the DSL adaptation layer to build an enhanced UI.
  • FIG. 10 shows a schematic flowchart of an electronic device responding to a user's operation in the UI.
  • the execution engine creates an event proxy, and registers the event proxy with the UI through the DSL adaptation layer.
  • the OEM OS monitors user operation events of the UI, and reports the user operation events to the event agent.
  • the event agent implements the mapping between events and behaviors.
  • the interpretation and execution engine interprets and executes the code in the DSL file, implements the response specified in the DSL file according to the behavior, and completes the response to the user's operation on the UI.
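The monitor-map-respond flow above can be sketched as follows; the class, the event names, and the mapping table are illustrative, not the OEM OS implementation:

```python
# Illustrative event proxy: concrete device events are mapped to abstract
# user behaviors, and the behavior's DSL-defined response is then dispatched.
EVENT_TO_BEHAVIOR = {
    "mouse_double_click": "confirm",   # PC
    "finger_tap": "confirm",           # touch-screen device
}

class EventProxy:
    def __init__(self):
        # behavior -> response callback defined in the DSL file
        self.responses = {}

    def register(self, behavior, callback):
        self.responses[behavior] = callback

    def on_event(self, event):
        # Map the reported user operation event to a behavior, then
        # execute the response specified for that behavior.
        behavior = EVENT_TO_BEHAVIOR.get(event)
        if behavior in self.responses:
            return self.responses[behavior](behavior)

proxy = EventProxy()
proxy.register("confirm", lambda behavior: f"search executed on {behavior}")
print(proxy.on_event("mouse_double_click"))  # search executed on confirm
print(proxy.on_event("finger_tap"))          # search executed on confirm
```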
  • the user interface interface implementation method provided by the embodiment of the present application supports that the App includes native controls and custom controls, and also supports the application of custom UI programming capabilities in the native controls.
  • native controls are controls supported by a general-purpose OS (such as Android), and the general-purpose OS provides basic UI programming capabilities for native controls; custom controls are controls that are not supported by the general-purpose OS but are supported by the OEM OS, and the OEM OS provides custom UI programming capabilities for custom controls.
  • FIG. 11 shows a schematic flow chart of the OEM OS constructing native controls.
  • the App development engineering package of the OEM OS 1101 includes a basic interface description language file, and the flow control 1111 of the basic UI engine 1110 instructs the parsing engine 1112 to process the basic interface description language file.
  • the parsing engine 1112 reads and loads the basic interface description language file, and converts the basic interface description language file into a data format matching the execution engine 1113 .
  • the execution engine 1113 constructs a basic UI according to the content of the basic interface description language file, and generates a native control 1130 .
  • FIG. 12 shows a schematic flowchart of the OEM OS applying custom UI programming capability to native controls.
  • the App development project package of OEM OS 1101 includes basic interface description language files and DSL files.
  • the flow control 1121 of the extended UI engine 1120 instructs the parsing engine 1112 in the basic UI engine 1110 to process the basic interface description language file.
  • the parsing engine 1112 reads and loads the basic interface description language file, and converts the basic interface description language file into a data format matching the execution engine 1113 .
  • the execution engine 1113 constructs a basic UI according to the content of the basic interface description language file, and generates a native control 1130 .
  • Flow control 1121 instructs the parsing engine 1122 in the extended UI engine 1120 to process the DSL file.
  • the parsing engine 1122 reads and loads the DSL file, and converts the DSL file into a data format matching the execution engine 1123 .
  • the execution engine 1123 applies custom UI programming capabilities to the native controls 1130 according to the DSL file.
  • FIG. 13 shows a schematic flow chart of the OEM OS building a custom control.
  • the App development project package of OEM OS 1101 includes basic interface description language files and DSL files.
  • the flow control 1121 of the extended UI engine 1120 instructs the parsing engine 1112 in the basic UI engine 1110 to process the basic interface description language file.
  • the parsing engine 1112 reads and loads the basic interface description language file, and converts the basic interface description language file into a data format matching the execution engine 1113 .
  • the execution engine 1113 constructs a basic UI according to the content of the basic interface description language file, and generates a native control 1130 .
  • Flow control 1121 instructs the parsing engine 1122 in the extended UI engine 1120 to process the DSL file.
  • the parsing engine 1122 reads and loads the DSL file, and converts the DSL file into a data format matching the execution engine 1123 .
  • the execution engine 1123 generates custom controls 1140 on the basic UI according to the DSL file.
  • the OEM OS provided by the embodiments of this application includes a basic UI engine and an extended UI engine.
  • the basic UI engine is used to interpret and execute the basic interface description language to generate a basic UI (with basic UI programming capabilities);
  • the extended UI engine is used to interpret and execute the DSL, superimposing custom UI programming capability on the basic UI.
  • the user interface interface implementation method provided by the embodiments of the present application can adapt to various OS platforms, and provide rich UI programming capabilities; the technical implementation difficulty is low, and it is convenient for developers to use.
  • the embodiments of the present application also provide a method for implementing a user interface interface, which is easy to implement and convenient for developers to use.
  • IoT Internet of things
  • the types and quantities of IoT devices are growing rapidly.
  • Different IoT devices have different screen sizes and user interaction methods.
  • the screen size of mobile phones is mostly about 4-6 inches, and the main user interaction method is touching and tapping the display screen; the screen size of a TV can reach 50 inches or more, and the user interaction method is usually remote control operation; other IoT devices have an even wider range of screen forms and user interaction methods.
  • on common OS platforms such as Android, using this method to develop differentiated UIs for the various devices involves a large workload and high development difficulty.
  • the embodiments of the present application provide a user interface interface implementation method and device, which can realize one-time development and multi-device deployment; that is, one set of interface description files is developed and is applicable to various types of electronic devices, reducing the development difficulty for developers.
  • This UI programming framework includes a UI interface description language and a corresponding parsing and execution engine, and provides an independent interface control library, layout engine, and rendering engine. It can run across devices, but has poor compatibility.
  • Embodiments of the present application provide a method and apparatus for implementing a user interface interface.
  • the developer opens a development tool (for example, DevEco Studio) in the electronic device 200 (developer device), and generates an interface description file in the development interface of the development tool.
  • a development tool for example, DevEco Studio
  • the developer opens the development interface of the development tool, and the development interface includes the initial version of the interface description file.
  • the initial version of the interface description file can be a blank file, or it can contain a simple example. It is understandable that the initial version of the interface description file may be preset in the development tool, or may be added by the developer in the development tool.
  • the development tool may further include, for example, an interface description language template, an interface description language grammar rule description file, an interface description example, etc., which will not be repeated in this embodiment of the present application.
  • the developer can use an interface description language to add the interface description and the interface behavior definition in the initial version of the interface description file to form the interface description file for publishing.
  • the developer generates an interface description file for each UI in the App; for example, multiple interface description files may be generated in a folder, and each interface description file corresponds to a UI.
  • the installation package of the App is uploaded to the server, and the App is published in the application market provided by the server.
  • the user can use the user-side electronic device (the above-mentioned electronic device 100 ) to download the installation package of the App in the application market.
  • when the user-side electronic device runs the installation package of the App, it obtains the interface description file in the installation package; when the user-side electronic device runs the App, a UI matching the electronic device is displayed on the display screen according to the interface description file.
  • the interface description file is in json format.
  • the installation package of the App includes a folder “app” 410 .
  • the src ⁇ main ⁇ assets directory of the folder "app” 410 includes the layout folder "layout” 411
  • the layout folder "layout" 411 includes the interface description files layout1.json 412, layout2.json 413, layout3.json 414, and so on.
  • Each interface description file corresponds to a UI of the App.
  • Different types of user-side electronic devices such as mobile phone 420 , car device 430 , TV 440 , and watch 450 all run the same interface description file in “layout” 411 , and display different display effects of the same UI respectively.
  • mobile phone 420, car device 430, TV 440 and watch 450 all parse and execute layout1.json 512, and display different display effects of the UI corresponding to layout1.json 512 respectively.
  • the developer can develop a differentiated UI for different types of electronic devices by developing a set of codes in one interface description file. Different types of electronic devices read the same interface description file of the same UI, and can present different UI display effects. It can realize the development of a set of interface description files, which is suitable for various types of electronic devices, and reduces the development difficulty of developers.
  • the user interface interface implementation method provided by the embodiment of the present application supports using, in the interface description file, the native UI programming capabilities of the general-purpose OS (such as Android) as well as UI programming capabilities customized by the operating system.
  • the native UI programming capabilities enable controls to have native control properties, and the operating-system-customized UI programming capabilities enable controls to have extended visual properties, layout properties, interaction properties, animation properties, and software and hardware dependency properties.
  • the electronic device runs the installation package of the App and obtains the interface description file in the installation package; when the user runs the App on the electronic device, the electronic device can present the corresponding UI on the display screen according to the interface description file, and the controls in the UI can include native control properties and can also include extended control properties.
  • the custom UI engine provided by the embodiments of this application supports parsing and executing the native control properties and all control properties extended in the operating system.
  • FIG. 15 shows a software architecture of the electronic device 100 .
  • the software system of the electronic device 100 may include an application layer, an application framework layer, an Android runtime (Android runtime) and system libraries, and a kernel layer.
  • for the application layer, the Android runtime and system libraries, and the kernel layer, refer to the corresponding description of the software architecture in Figure 3, which is not repeated here.
  • the software system of the electronic device 100 provided by the embodiment of the present application partially reuses the UI programming framework in the conventional technology, which is easy to learn and reduces the development difficulty for developers.
  • the operating system of the application framework layer includes a custom UI engine 11 .
  • the custom UI engine 11 is used to parse and execute the interface description file of the App to generate the UI of the App.
  • the custom UI engine 11 may include a UI parsing engine 11a, a UI execution engine 11b, an MVVM (model-view-viewmodel) framework 11c, a syntax and semantic library 11d and a UI rendering engine 11e.
  • the application framework layer may further include more modules, and reference may be made to conventional technologies, which are not limited in this embodiment of the present application.
  • the syntax and semantics library 11d includes a set of syntax and semantic specifications for all fields in the interface description file, for example, field definitions and syntaxes such as variable interfaces, common fields, visual attributes, layout attributes, interaction attributes, animation attributes, and software and hardware dependency attributes.
  • the layout attribute refers to the layout of each control in the UI, such as the shape, position, and size of the control.
  • Visual attributes refer to the visual effects of controls, such as color and grayscale. Interaction attributes refer to the ability of a control to respond to user behavior; for example, performing a search based on the user's "confirm" behavior.
  • the animation property refers to displaying animation effects on the control; for example, displaying the click rebound animation on the control.
  • the software and hardware dependency properties refer to the software and hardware parameters of the control-dependent device.
  • the developer needs to add code to the interface description file to develop the UI according to the syntax and semantic specification defined in the syntax and semantic library 11d.
  • the syntax and semantic specifications defined in the syntax and semantic library 11d are introduced below from the aspects of layout arrangement, data & interface binding, interactive behavior arrangement, and differentiated description.
  • the interface description file may include the following structure:
  • meta-data includes information such as version number.
  • An example is as follows:
  • version represents the version number of the interface description file; exemplarily, the version format is x.y.z, where x indicates the product, y indicates the subsystem of the product, and z indicates the development iteration.
  • the version of the interface description file needs to match the version of the custom UI engine.
  • The version of the custom UI engine must be the same as, or newer than, the version of the interface description file for the interface description file to be parsed successfully.
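  • A minimal sketch of what the meta-data block might look like (only the version field is named in the text; the surrounding layout is illustrative):

```
meta-data: {
  version: "1.0.2"    // x.y.z: x = product, y = subsystem of the product, z = development iteration
}
```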
  • import is used to import objects; model is used to declare objects.
  • An example is as follows:
  • layout-data-common is used to describe common UI.
  • Various types of electronic devices parse the content in layout-data-common, and lay out a common UI according to the content in layout-data-common.
  • layout-data-uimode is used to describe the UI of the specified device.
  • the difference between the specified device UI and the generic UI is declared in layout-data-uimode.
  • layout-data-uimode declares all conditions applicable to the UI of the specified device.
  • the specified device may be a mobile phone, a watch, a car device, a smart home device (for example, a smart TV, a smart screen, a smart speaker, etc.), a large screen, a laptop computer, a desktop computer, and the like.
  • the specific form of layout-data-uimode may include layout-data-phone (for mobile phones), layout-data-watch (for watches), layout-data-television (for TVs) and so on.
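  • Putting the blocks together, the overall skeleton of an interface description file might look like the following sketch (block contents are elided; the exact arrangement is an assumption based on the description):

```
meta-data: { version: "1.0.0" },
import: {
  // resource packages used by the layouts
},
model: {
  // declared objects (e.g. ViewModel instances)
},
layout-data-common: {
  // generic UI parsed by all types of electronic devices
},
layout-data-watch: {
  // UI (or UI differences) for the watch as a specified device
}
```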
  • All UI in the app is composed of controls.
  • the layout arrangement of the UI is to arrange the properties of the controls in the UI.
  • The custom UI engine 11 supports all native controls and extended controls in the operating system, and also supports controls customized by developers in the App or integrated through static packages.
  • The controls may specifically include text controls, such as TextView controls and EditText controls; button controls, such as Button controls and ImageButton controls; and image controls, such as Image controls. This is not limited in this embodiment of the present application.
  • Native controls can include TextView, EditText, etc.
  • extended controls in the operating system can include HwButton, etc.
  • An example of a declared control is as follows, in this example, the declared Native controls TextView and EditText.
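  • The example itself is not reproduced here; a hedged sketch of declaring the native controls TextView and EditText might look as follows (the property names shown are illustrative):

```
layout-data-common: {
  TextView(): {
    text: @string/title
  },
  EditText(): {
    hint: @string/input_hint
  }
}
```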
  • the full package name of the resource package of the control needs to be introduced in import. This way, it can be called in layout-data-common or layout-data-uimode.
  • An example of declaring a custom control in the App is as follows. In this example, the full package name com.myapp.widget.MyCircleView of the resource package of the control MyCircleView is first introduced in import, and then MyCircleView is directly called in layout-data-common.
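  • A sketch of what that might look like (the import syntax shown is an assumption based on the description):

```
import: {
  MyCircleView: "com.myapp.widget.MyCircleView"
},
layout-data-common: {
  MyCircleView(): {
    // properties of the custom control
  }
}
```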
  • the custom UI engine 11 supports the developer to specify aliases for controls.
  • An example is as follows. In this example, when the full package name com.myapp.widget.MyCircleView of the resource package of MyCircleView is introduced in import, the name of MyCircleView is specified as AliasName.
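  • A sketch of specifying an alias when importing (the syntax is assumed from the description):

```
import: {
  AliasName: "com.myapp.widget.MyCircleView"
},
layout-data-common: {
  AliasName(): {
    // the control is called by its alias
  }
}
```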
  • When calling a control in layout-data-common or layout-data-uimode, the control declaration is made in the form of ComponentName(): {}.
  • For example, TextView(): {} declares a TextView.
  • Control properties supported by Custom UI Engine 11 include Native properties and extended visual properties, layout properties, interaction properties, animation properties, and software and hardware dependencies in the operating system.
  • The attributes and attribute values of the control can be passed in {}, in the format "property 1: property value 1, property 2: property value 2, …".
  • the example is as follows.
  • The control TextView is declared, the property textSize of TextView is passed in {}, and the property value of textSize is @dimen/mySize.
  • the properties and property values of the control can be passed in ( ).
  • the example is as follows.
  • the control TextView is declared, and the property text of TextView is passed in (), and the property value of text is @string/text_name.
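  • A hedged reconstruction of the two forms of passing properties described above (the property names follow the text; everything else is illustrative):

```
// properties and property values passed in {}
TextView(): {
  textSize: @dimen/mySize
}

// properties and property values passed in ()
TextView(text: @string/text_name): {}
```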
  • If control properties and property values are passed in both () and {}, the content in () is ignored.
  • The attribute value of a control attribute can be assigned in any of the following ways: specified directly by a string value; by accessing a resource value defined in the background data; by accessing a classification parameter declared in the background data; or by accessing a value in the control model (ViewModel) object.
  • The custom UI engine 11 supports specifying the namespace of control properties through the namespace.propertyName method. In one implementation, if no namespace is specified, the default namespace is used. In one implementation, the custom UI engine 11 supports using the namespace androidhwext to point to an extended resource package in the operating system, and using the namespace app to point to a customized resource package in the App.
  • the extended resource package in the operating system provides the customized UI programming capabilities in the operating system; the customized resource package in the App provides the customized control properties in the App.
  • the developer may also define other namespaces.
  • the developer-defined namespace is imported through import, and provides the package name of the resource package that defines the properties of the control.
  • the example is as follows.
  • the developer-defined namespace myspace is introduced into import, and the complete package name of the resource package of myspace is com.myapp. After introducing myspace in import, you can call the property borderWidth in myspace in layout-data-common.
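  • A sketch of the developer-defined namespace described above (the import syntax is assumed):

```
import: {
  myspace: "com.myapp"
},
layout-data-common: {
  TextView(): {
    myspace.borderWidth: @dimen/borderWidth   // property defined in the com.myapp resource package
  }
}
```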
  • the custom UI engine 11 supports two-way binding of elements in the UI and background data. You can declare and specify the binding relationship between elements in the UI (such as controls, control groups) and background data in the interface description file (layout-data-common or layout-data-uimode).
  • the MVVM framework 11c in the custom UI engine 11 can refresh the background data according to the UI change, or automatically refresh the corresponding UI according to the background data change.
  • elements in the UI can be bound to a control model (ViewModel) object.
  • the property value of a control property in the UI is bound to the value of the ViewModel object.
  • the example is as follows.
  • The full package name com.myapp.UserInfo of the resource package of UserInfo (UserInfo is a ViewModel) is introduced in import, and an object user of type UserInfo is declared in the model; the data in user is then accessed in layout-data-common.
  • A variable (field) in the ViewModel object (model) is accessed through $model.field; for example, $user.photo above accesses the variable photo in user, and $user.name accesses the variable name in user.
  • The return value of a function in the ViewModel object (model) is accessed through $model::function; for example, $user::hasName above accesses the return value of the function hasName in user.
  • the property imageUri (image) of the control ImageView is bound to the background data user.photo
  • the property text (text) of a control TextView is bound to the background data user.name
  • The property text of another TextView control is bound to the background data user.age, the property checked (checked state) of the CheckBox control is bound to the background data user.agreed, and the visible property of a TextView control is bound to the background data user::hasName.
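  • A hedged reconstruction of the binding example described above (the grouping of controls is illustrative):

```
import: {
  UserInfo: "com.myapp.UserInfo"
},
model: {
  user: UserInfo
},
layout-data-common: {
  ImageView(): { imageUri: $user.photo },
  TextView():  { text: $user.name, visible: $user::hasName },
  TextView():  { text: $user.age },
  CheckBox():  { checked: $user.agreed }
}
```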
  • the visibility of the controls can be obtained according to the background data, and the function of hiding parts of the UI can be realized.
  • When a variable in the background data changes (visible becomes invisible, or invisible becomes visible), the controls in the UI are hidden or displayed accordingly.
  • An example is as follows, in this example, the visibility (visible) of a column of controls (Column) is determined by the value of the variable user.visible.
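  • A sketch of the column-visibility example (the child controls shown inside the column are illustrative):

```
layout-data-common: {
  Column(): {
    visible: $user.visible,    // the whole column is shown or hidden with user.visible
    TextView(): { text: $user.name }
  }
}
```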
  • user input is received on the UI, and the user input is bound to the value of the ViewModel object.
  • the example is as follows.
  • The full package name com.myapp.UserInfo of the resource package of UserInfo (UserInfo is a ViewModel) is introduced in import, and an object user of type UserInfo is declared in the model; a declaration in layout-data-common then assigns the user input value of the property text of the EditText control to the variable name in user.
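  • A sketch of binding user input back to the ViewModel as described (the EditText's text is written to user.name):

```
import: {
  UserInfo: "com.myapp.UserInfo"
},
model: {
  user: UserInfo
},
layout-data-common: {
  EditText(): {
    text: $user.name    // user input in the EditText is assigned to the variable name in user
  }
}
```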
  • the custom UI engine 11 supports declaring the execution action corresponding to the control response event in the interface description file.
  • the range of events supported by the control is determined by the event listeners supported by the control. For example, if a button control (Button) supports setOnClickListener for monitoring click events, the onClick (click) event can be bound to the control in the interface description file.
  • the custom UI engine 11 transparently transmits the parameters of the event and the return value of the response function in the background data between the control and the background data.
  • For example, the control Button is declared in layout-data-common to execute, in response to the onClick (click) event, the action defined in the background data context.buttonClick.
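  • A sketch of binding the onClick event to a background response function (the $context prefix is an assumption; the text only names context.buttonClick):

```
layout-data-common: {
  Button(): {
    text: @string/submit,
    onClick: $context.buttonClick   // executed when the button is clicked
  }
}
```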
  • The custom UI engine 11 supports life cycle events for the UI execution engine to load controls, including onPreMount, onMount, onUnmount, onPreUpdate, onUpdate, and the like. onPreMount is called before the control is mounted to the UI; onMount is called after the control is mounted to the UI; onUnmount is called after the control is removed from the UI; onPreUpdate is called before the UI is refreshed due to data changes; onUpdate is called after the UI is refreshed due to background data changes.
  • event consumption is determined by the response function return value.
  • The custom UI engine 11 follows the native interface definition to transparently transmit the processing results in the background data to the control.
  • the properties of the controls supported by the custom UI engine 11 depend on the configuration parameters of the electronic device.
  • the operating system defines variables for electronic device configuration parameters.
  • the variables in the interface description file can declare the configuration parameters of the electronic device.
  • The configuration parameters of the electronic device are accessed in the interface description file, and the electronic device obtains the values of the configuration parameters according to its own software and hardware conditions. In this way, when different types of electronic devices run the same interface description file, the generated UIs differ because the software and hardware conditions, and hence the configuration parameters, differ.
  • the configuration parameters (config) of the current electronic device are accessed through $env.config.
  • the configuration parameters of the electronic device may include the contents shown in Table 1:
  • the attribute value of the dependOn property of the control can be assigned to the field in the configuration parameter, which is used to declare that the attribute of the control depends on a certain configuration parameter.
  • For example, the visibility of the scan control depends on the camera hardware (camera_sensor) of the electronic device; that is, if the electronic device has a camera, the scan control is displayed, and if the electronic device does not have a camera, the scan control is not displayed.
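  • A sketch of the camera-dependent scan control (how dependOn references the configuration field is an assumption based on the description):

```
layout-data-common: {
  ImageButton(): {
    // the scan control is displayed only on devices that have a camera
    dependOn: $env.config.camera_sensor
  }
}
```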
  • layout-data-uimode is used to describe the UI of the specified device.
  • Developers can declare the UI of the specified device in layout-data-uimode.
  • The display effect of the UI of the specified device differs from that of the general UI.
  • layout-data-uimode declares all conditions applicable to the UI of the specified device.
  • the interface description file 710 includes code blocks such as layout-data-common 711 and layout-data-watch 712.
  • layout-data-common 711 is used to describe a common UI suitable for various types of electronic devices
  • layout-data-watch 712 is used to describe a UI suitable for watches.
  • layout-data-common 711 declares the attributes and attribute values of each control in the common UI.
  • the mobile phone reads the interface description file 710, parses and executes the content in layout-data-common 711, and generates corresponding controls according to the attributes and attribute values of each control declared in layout-data-common 711.
  • The mobile phone generates a picture control 721 corresponding to the content block 7111 in layout-data-common 711, a control group 722 corresponding to the content block 7112, a control group 723 corresponding to the content block 7113, a button control 724 corresponding to the content block 7114, and a control group 725 corresponding to the content block 7115.
  • the UI 720 of the mobile phone is generated according to the content block 7111, the content block 7112, the content block 7113, the content block 7114 and the content block 7115.
  • Layout-data-watch 712 declares the attributes and attribute values of controls in the UI for watches.
  • the watch reads the interface description file 710, determines that there is a layout-data-watch 712 for the watch in the interface description file 710, then parses and executes the content in the layout-data-watch 712, according to the various controls declared in the layout-data-watch 712 properties and property values to generate corresponding controls.
  • The watch generates a picture control 731 corresponding to the content block 7121 in layout-data-watch 712, a control group 732 corresponding to the content block 7122, a control group 733 corresponding to the content block 7123, and a button control 734 corresponding to the content block 7124.
  • the UI 730 of the watch is generated from the content block 7121, the content block 7122, the content block 7123, and the content block 7124.
  • The watch, as a designated device, reads the content in the second code segment (layout-data-watch 712), while electronic devices other than the watch read the content in the first code segment (layout-data-common 711); different types of electronic devices read the same interface description file of the same UI and can present different UI display effects. Developing one set of interface description files thus realizes differentiated UI development for different types of electronic devices, reducing development difficulty for developers.
  • the difference between the specified device UI and the generic UI is declared in layout-data-uimode.
  • the interface description file 810 includes code blocks such as layout-data-common 811 and layout-data-watch 812.
  • layout-data-common 811 is used to describe the common UI suitable for various types of electronic devices
  • layout-data-watch 812 is used to describe the difference between the watch UI and the common UI.
  • layout-data-common 811 declares the attributes and attribute values of each control in the common UI.
  • the mobile phone reads the interface description file 810, parses and executes the content in layout-data-common 811, and generates corresponding controls according to the attributes and attribute values of each control declared in layout-data-common 811.
  • The mobile phone generates a picture control 721 corresponding to the content block 8111 in layout-data-common 811, a control group 722 corresponding to the content block 8112, a control group 723 corresponding to the content block 8113, a button control 724 corresponding to the content block 8114, and a control group 725 corresponding to the content block 8115.
  • the UI 720 of the mobile phone is generated according to the content block 8111, the content block 8112, the content block 8113, the content block 8114 and the content block 8115.
  • Layout-data-watch 812 declares the attributes and attribute values of controls in the watch UI that are different from the general UI.
  • the watch reads the interface description file 810, parses and executes the content in layout-data-common 811; the watch determines that layout-data-watch 812 for the watch exists in the interface description file 810, and parses and executes the content in layout-data-watch 812;
  • the watch generates corresponding controls according to the attributes and attribute values of each control declared in layout-data-common 811 and layout-data-watch 812.
  • The watch generates a picture control 731 corresponding to the content block 8111 in layout-data-common 811, a control group 732 corresponding to the content block 8112, a control group 733 corresponding to the content block 8113, and a button control 734 corresponding to the content block 8114.
  • Layout-data-watch 812 sets the attribute value of the visible attribute of the control group corresponding to content block 8115 in layout-data-common 811 to invisible (gone).
  • the control group corresponding to the content block 8115 in the layout-data-common 811 is not displayed on the watch.
  • the UI 730 of the watch includes controls generated from content block 8111, content block 8112, content block 8113, and content block 8114.
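  • A sketch of how layout-data-watch 812 might declare only the difference, hiding one control group; how the target control group is identified (here, by an id) is an assumption:

```
layout-data-watch: {
  Column(id: group8115): {
    visible: gone    // the control group from content block 8115 is not displayed on the watch
  }
}
```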
  • the custom UI engine 11 supports developers to customize parameters in the style for the current interface description file.
  • the example is as follows, the developer defines myTextStyle in style, and can call the custom parameter in the form of $style.myTextStyle in layout-data-common.
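  • A sketch of the custom style parameter and its use (the properties inside myTextStyle are illustrative):

```
style: {
  myTextStyle: {
    textSize: @dimen/mySize,
    textColor: @color/myColor
  }
},
layout-data-common: {
  TextView(): {
    style: $style.myTextStyle
  }
}
```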
  • the syntax is concise and efficient, and a set of interface description files can be developed to be applicable to different types of electronic devices, which avoids developing UIs for different types of electronic devices separately and reduces development costs.
  • the UI parsing engine 11a is used to parse the interface description file, and convert the content in the interface description file into a data format matching the UI execution engine 11b.
  • The UI parsing engine 11a may also perform syntax verification on the content of the interface description file; if the syntax verification succeeds, the interface description file is parsed, and if the syntax verification fails, the interface description file is not parsed.
  • The UI parsing engine 11a reads the interface description file, parses the data in fields such as the declaration (model), style (style), and layout (layout-data-common, layout-data-uimode) in the interface description file, preprocesses the data, and stores it in a database. The property parser is then used to parse the property fields of each control, and the UI execution engine 11b is called to set properties for each control and complete the UI drawing.
  • the control parser gets the name of the control, and gets a list of the control's properties. If there is an identity of the control (identity, ID), add the control ID; if there is a style of the control (style), add the style of the control; instantiate the control to form a control queue. If a sublayout exists, recursively call controls in the sublayout. After parsing all the controls in the layout, add controls and return the generated controls.
  • the property parser obtains the instantiated control from the control queue, reads the property name and property value corresponding to the control, and stores the property (including property name and property value) in the hash table. If a sublayout exists, recursively call controls in the sublayout. After the properties of all controls are parsed, the property values of each control saved in the hash table are assigned to the corresponding controls.
  • The UI execution engine 11b is used to construct UI controls (control instantiation and property setting) according to the data parsed by the UI parsing engine 11a, arrange the layout of the controls, and generate the interface declared in the interface description file. It can also realize the mapping between device events and user behaviors, and execute, in response to a user behavior, the action corresponding to that user behavior as defined in the interface description file.
  • A Builder class is constructed for each control, and the Builder classes adopt the same inheritance logic as the controls, so that child controls can inherit all properties and setting methods of parent-class controls without repeated definitions.
  • the Builder class contains the entity construction method of the corresponding control and the setting method of the specific visual property of the control to complete the instantiation and property setting of the control.
  • a Builder class of the customized control can be provided in the UI execution engine 11b, and the access cost is low, which is friendly to the developer.
  • the UI execution engine 11b can set properties according to the declarations in the interface description file, complete the instantiation of the controls, and the constructed controls have Native control properties.
  • If it is determined that the operating system includes custom UI programming capabilities, the UI execution engine 11b performs attribute setting according to the declarations in the interface description file, completes the instantiation of the controls, and constructs controls with extended attributes. If it is determined that the operating system does not include custom UI programming capabilities, the UI execution engine 11b maps the extended attributes declared in the interface description file to the corresponding native control properties, performs property setting according to those native control properties, and completes the instantiation of the controls; the constructed controls have native control properties but do not have extended attributes.
  • an installation package of an App is generated on the developer device, which includes an interface description file.
  • the installation package of the App is uploaded to the server, and the App is published in the application market provided by the server.
  • the user can use the user-side electronic device (the above-mentioned electronic device 100 ) to download the installation package of the App in the application market.
  • When the user-side electronic device runs the installation package of the App, it obtains the interface description file in the installation package; when the user-side electronic device runs the App, a UI matching the electronic device is displayed on the display screen according to the interface description file.
  • the interface description file includes the following:
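  • The file contents are not reproduced here; based on the surrounding description (the custom control group HwMagicLayout and the zoomEnable interactive capability on the imageview control), it might look like the following sketch:

```
layout-data-common: {
  HwMagicLayout(): {
    ImageView(id: imageview): {
      zoomEnable: true    // the image can be zoomed in response to user operations
    }
    // other controls and control groups ...
  }
}
```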
  • the tablet 460 runs an App as a user-side electronic device.
  • the operating system of the tablet 460 includes custom UI programming capabilities.
  • a custom control group HwMagicLayout is successfully constructed, and the control group has layout attributes extended in the operating system.
  • the extended layout attributes in the operating system may include layout attributes such as automatic stretching, hiding, evenly dividing, proportioning, extending or wrapping.
  • automatic stretching means that the height or width of the control is automatically enlarged or reduced according to the size of the window to fit the size of the window.
  • Hiding refers to the ability of a control to be visible or invisible in a layout. Even distribution means that the content in the control is evenly distributed in the layout.
  • the proportion means that the control occupies the total size of the layout according to the specified percentage in the specified direction.
  • Extending means that the control is extended and displayed on the UI according to the size of the display screen.
  • Line wrapping means that the content of a control is displayed on one or more lines in the layout.
  • the UI layout of the tablet 460 has layout attributes extended in the system. When the tablet 460 is displayed in a vertical screen, the controls 461, the control group 462, the control group 463, the control 464 and the control group 465 are vertically arranged in a column.
  • When the tablet 460 is displayed in a horizontal screen, the control 461, the control group 462, the control group 463, and the control 464 are vertically arranged in the first column, and the control group 465 is arranged in the second column.
  • the UI layout of the tablet 460 is adaptively adjusted according to the size and shape of the display window.
  • the interactive capability "zoomEnable” takes effect on the control 461 "imageview".
  • the control 461 can be enlarged and displayed in response to the user's zoom-in operation on the control 461 (for example, when the cursor corresponding to the mouse 480 is placed on the control 461, turning the wheel of the mouse 480 upwards).
  • the tablet 470 runs an App as a user-side electronic device.
  • the tablet 470's operating system does not include custom UI programming capabilities.
  • The UI of the tablet 470 does not support the custom control group HwMagicLayout, and the controls in the UI have native linear layout (LinearLayout) properties.
  • the controls 471 , the control group 472 , the control group 473 , the control 474 and the control group 475 are vertically arranged in a column.
  • the UI layout of the tablet 470 is fixed and cannot be adjusted adaptively according to the size and shape of the display window.
  • the interactive capability "zoomEnable” cannot take effect on the control 471 "imageview". That is, when the tablet 470 is connected to the mouse 480, if the cursor corresponding to the mouse 480 is placed on the control 471, and the scroll wheel of the mouse 480 is rotated upward, the size of the control 471 remains unchanged.
  • The UI execution engine 11b dynamically parses the data when the electronic device runs the interface description file, and obtains the relevant parameters of the electronic device at that time; this avoids the developer having to precompile the interface description file in the development tool to generate preset data files. In this way, UI development is made independent of the compilation environment, and cross-platform development and operation can be realized.
  • the MVVM framework 11c is used for bidirectional binding between elements in the UI and background data.
  • the developer declares the control field in the interface description file, binds properties to the control, and binds the background data object.
  • the UI parsing engine 11a parses the binding behavior in the interface description file, and obtains the corresponding relationship between the control attributes and the background data objects.
  • the MVVM framework 11c is used to implement two-way binding between controls and background data. When the background data is changed, the MVVM framework 11c maps the background data with the corresponding control attribute data; the UI execution engine 11b sets the control attribute data and refreshes the UI.
  • When the attribute data of a control in the UI changes (for example, the shape, text, or other data of the control changes in response to user input), the MVVM framework 11c maps the control attribute data to the background data, and the background data is refreshed.
  • the UI rendering engine 11e is used to render and organize the interface generated by the UI execution engine 11b, and output the display content to the display screen.
  • the electronic device 100 software system may include an application layer, an application framework layer, an Android runtime (Android runtime) and system libraries, and a kernel layer.
  • the interface description file of the application layer App1 is in json format; the interface description file of App2 is in xml format.
  • the operating system of the application framework layer includes the control unit.
  • the control unit acquires the interface description file of the App.
  • When the electronic device 100 runs App1, the control unit obtains the interface description file in json format of App1; when the electronic device 100 runs App2, the control unit obtains the interface description file in xml format of App2.
  • the control unit distributes the interface description file to the basic UI engine 10 or the custom UI engine 11 for UI drawing according to the type of the interface description file. For example, the control unit obtains the json format interface description file of App1, and distributes the json format interface description file of App1 to the custom UI engine 11 for processing.
  • the control unit acquires the xml format interface description file of App2, and distributes the xml format interface description file of App2 to the basic UI engine 10 for processing.
  • the specified paths of the interface description file in json format and the interface description file in xml format in the application installation package are different.
  • the control unit obtains the interface description file in json format from the first designated path of the App1 application installation package, and obtains the xml format interface description file from the second designated path of the App2 application installation package.
  • the interface description file in json format and the interface description file in xml format are preset with different tags, and the control unit determines the type of the interface description file according to the preset tags in the interface description file.
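The type detection and distribution described above can be sketched as follows; this is a minimal illustration with hypothetical function names, since the embodiment does not specify an implementation:

```python
# Illustrative sketch (hypothetical names): route an interface description
# file to a UI engine according to its type. The embodiment determines the
# type either from the designated path in the installation package or from
# a preset tag inside the file.

def detect_type(file_name, preset_tag=None):
    """Return 'json' or 'xml' based on the preset tag or the file extension."""
    if preset_tag in ("json", "xml"):
        return preset_tag
    if file_name.endswith(".json"):
        return "json"
    if file_name.endswith(".xml"):
        return "xml"
    raise ValueError("unknown interface description file type: " + file_name)

def dispatch(file_name):
    """json files go to the custom UI engine 11, xml files to the basic UI engine 10."""
    return {"json": "custom UI engine 11", "xml": "basic UI engine 10"}[detect_type(file_name)]
```

For example, `dispatch("app1_layout.json")` would route App1's description file to the custom UI engine 11, and `dispatch("app2_layout.xml")` would route App2's to the basic UI engine 10.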
  • the custom UI engine 11 parses, executes, and renders the interface description file in the json format of App1, and generates the UI of App1.
  • the controls in the UI of App1 can support the native UI programming capabilities of a common operating system, and can also support the custom UI programming capabilities in the operating system of the electronic device 100.
  • the basic UI engine 10 parses, executes, and renders the xml format interface description file of App2 to generate the UI of App2.
  • the controls in the UI of App2 can support the native UI programming capabilities of a common operating system.
  • the electronic device 100 can run either an App developed using the json-format interface description language or an App developed using the xml-format interface description language; thus, the forward compatibility of the operating system is realized.
  • the embodiment of the present application also provides a method for implementing a user interface interface, which is used to implement the UI of an application widget.
  • Custom notification bars, desktop widgets, and negative one-screen cards can more intuitively present the information in the app to users, and support operations on the app without opening the app, making it easier for users to use the app. More and more applications provide widgets for users to use.
  • Embodiments of the present application provide a method and device for implementing a user interface interface, which supports displaying various layout modes and control types on the UI of an application widget, so as to facilitate users to use the application widget and improve user experience.
  • the developer uses an interface description language to develop the UI of the application (Application, App) in the application development tool. Developers also use interface description languages to develop UIs for application widgets in application development tools.
  • application development tools eg, Android Studio, DevEco Studio, etc.
  • the electronic device 200 may also be referred to as a developer device in this application.
  • the developer develops the UI of the App in the application development tool to form the interface description file.
  • the interface description file may also be referred to as a description file.
  • the developer also develops the UI of the application widget in the application development tool to form the component interface description file.
  • the developer packages the interface description file and the component interface description file into the installation package of the App, and publishes the App in the application market provided by the server 300 .
  • the installation package of each App can be provided in the application market for users to download.
  • the installation package can be an application package (Android application package, APK) file.
  • the component interface description file is independent of the interface description file.
  • the component interface description file may be a part of the interface description file (for example, a code segment in the interface description file is used as the component interface description file). This embodiment of the present application does not limit this.
  • the case where the component interface description file is an independent file is taken as an example for illustrative description.
  • a user can use the mobile phone to download an installation package of an App in the application market.
  • the installation package of the App includes interface description files and component interface description files.
  • the music app can be installed in the mobile phone by running the installation package. In this way, the mobile phone also obtains the interface description file and the component interface description file in the installation package.
  • the mobile phone desktop includes a shortcut icon of the music App—the “Music” icon 103 .
  • the mobile phone can receive the user's click operation on the "music" icon 103, and in response to the user's click operation on the "music” icon 103, the mobile phone generates the UI of the music app according to the interface description file, and presents the UI of the music app on the display screen.
  • the mobile phone can also display a small component (called a music widget) of a music app on the mobile phone desktop according to user settings.
  • the mobile phone generates the UI of the music widget according to the component interface description file, and displays the UI 104 of the music widget on the display screen.
  • the developer can directly develop the UI of the App and the UI of the application widget on the electronic device 100, and run the App and the application widget on the electronic device 100; that is, the electronic device 200 and the electronic device 100 may be the same electronic device.
  • This embodiment of the present application does not limit this.
  • System native controls include text controls (TextView), input boxes (EditText), buttons (Button), image buttons (ImageButton), and image controls (ImageView), etc.
  • a UI can contain one or more View or ViewGroup.
  • View is an element displayed in the display interface;
  • ViewGroup is a layout container for storing View (or ViewGroup).
  • a new View or ViewGroup can be added to the ViewGroup, so that each View is arranged according to a certain hierarchy and structural relationship.
  • developers can use linear layout (LinearLayout), table layout (TableLayout), relative layout (RelativeLayout), layer layout (FrameLayout), absolute layout (AbsoluteLayout) or grid layout (GridLayout) to lay out the View or ViewGroup in each UI, so as to generate the layout file of each UI, such as the interface description file or the component interface description file.
  • the system supports only limited layout methods and control types in application widgets, which cannot meet the diverse needs of users.
  • the embodiments of the present application provide a method for implementing a user interface interface, which can support the application of various layout methods and types of controls in an application widget.
  • the user interface interface implementation method provided by the embodiments of the present application not only supports applying the system's native linear layout (LinearLayout), layer layout (FrameLayout), relative layout (RelativeLayout) and grid layout (GridLayout) to application widgets, but also supports applying the system's native table layout (TableLayout), absolute layout (AbsoluteLayout) and other layout methods to application widgets.
  • the user interface interface implementation method not only supports applying the system's native button (Button), image control (ImageView), image button (ImageButton), progress bar (ProgressBar), text control (TextView), list control (ListView), grid control (GridView), stack control (StackView), control dynamic loading (ViewStub), adaptive control (AdapterViewFlipper), switch control (ViewFlipper), clock (AnalogClock), timer (Chronometer) and other controls to application widgets, but also supports applying the system's native input box (EditText), check box (CheckBox), sliding selector (Picker), scroll view (ScrollView), radio button (RadioButton), rating bar (RatingBar), search box (SearchView), drag bar (SeekBar), switch (Switch) and other controls to application widgets.
  • the mobile phone 100 may display a widget of a music App (called a music widget) on the desktop.
  • the cell phone 100 displays the UI 910 of the music widget.
  • the UI 910 includes a picture control 911, which is used to display the picture set by the App; a text control 912, which is used to display the name of the track playing the music; the search box 913, which is used to receive the user's input and perform a search in response to the user's input text;
  • the picture button 914 is used to switch the display style of the music widget; the drag bar 915 is used to adjust the progress of playing music according to the user operation; and other controls.
  • the mobile phone 100 may receive a user's click operation on the search box 913, and in response to the click operation, display the enlarged search box 913 and the soft keyboard 91a on the desktop.
  • the user can enter text for the search box 913 using the soft keyboard 91a.
  • the mobile phone 100 searches according to the text entered in the search box 913 .
  • the mobile phone 100 may receive the user's drag operation on the drag bar 915 to adjust the progress of playing music.
  • the user interface interface implementation method provided by the embodiment of the present application also supports applying the custom UI programming capabilities in the operating system to the application widget, so that the controls in the application widget have the extended visual properties, layout properties, interaction properties, animation properties, and software and hardware dependency properties of the operating system.
  • the layout attribute refers to the layout of each control in the UI, such as the shape, position, and size of the control.
  • Visual properties refer to the visual effects of controls such as color and grayscale.
  • Interaction properties are the ability to provide control responses based on user behavior; such as performing a search based on the user's "confirm" behavior.
  • the animation property refers to displaying animation effects on the control; for example, displaying the click rebound animation on the control.
  • the software and hardware dependency properties refer to the software and hardware parameters of the control-dependent device.
  • the extended layout attributes in the operating system may include layout attributes such as automatic stretching, hiding, evenly dividing, proportioning, extending or wrapping.
  • automatic stretching means that the height or width of the control is automatically enlarged or reduced according to the size of the window to fit the size of the window.
  • Hiding refers to the ability of a control to be visible or invisible in a layout.
  • Even distribution means that the content in the control is evenly distributed in the layout.
  • proportion means that the control occupies a specified percentage of the total size of the layout in the specified direction.
  • Extending means that the control is extended and displayed on the UI according to the size of the display screen.
  • Line wrapping means that the content of a control is displayed on one or more lines in the layout.
  • the UI 910 of the music widget may include a picture button 916 for displaying the lyrics of the music.
  • the mobile phone can receive the user's click operation on the picture button 916, and in response to the user's click operation on the picture button 916, the mobile phone displays the lyrics of the currently playing music.
  • Picture button 916 may or may not be displayed on UI 910. Whether the picture button 916 is displayed depends on whether the currently playing music has lyrics. If the currently playing music has corresponding lyrics, the picture button 916 is displayed on the UI 910; if the currently playing music has no corresponding lyrics, the picture button 916 is not displayed on the UI 910.
  • for example, when the currently playing music is Music 1, the UI 910 includes the picture button 916; when the currently playing music is Music 2, the UI 910 does not include the picture button 916.
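The data-dependent display behavior of the picture button 916 can be sketched as follows; this is a hypothetical illustration, and the function name and data layout are assumptions rather than part of the embodiment:

```python
# Sketch (hypothetical): a control whose display property depends on
# background data -- the picture button 916 is shown only when the
# currently playing music has lyrics.

def button_916_visible(current_music):
    """Return True when the currently playing music carries lyrics."""
    return current_music.get("lyrics") is not None

music1 = {"name": "Music 1", "lyrics": "..."}   # has lyrics -> button shown
music2 = {"name": "Music 2"}                    # no lyrics  -> button hidden
```

Under this sketch, `button_916_visible(music1)` is `True` and `button_916_visible(music2)` is `False`, matching the two cases described above.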
  • the user interface interface implementation method provided by the embodiment of the present application also supports applying the layout method and control type defined by the developer in the App to the application widget.
  • developers can apply the various layout methods and control types native to the system, the layout methods and control types defined in the operating system, and the layout methods and control types defined in the App to application widgets for the convenience of users.
  • the developer opens a development tool (for example, DevEco Studio) in the electronic device 200 (developer device), and uses the interface description language in the development tool to describe the interface and define the interface behavior according to the syntax and semantic specification of the interface description language, so as to develop the UI and form the interface description file and the component interface description file for publishing.
  • UI layout arrangement: that is, declaring and specifying, in the interface description file or the component interface description file, the binding relationship between elements in the UI (such as controls and control groups) and the background data.
  • Interactive behavior orchestration: that is, declaring, in the interface description file or the component interface description file, the execution actions corresponding to the events that the controls respond to.
  • the range of events supported by the control is determined by the event listeners supported by the control.
  • a button (Button) supports setOnClickListener for monitoring click events.
  • the onClick (click) event can be bound to the control in the interface description file.
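For illustration only, a json-format interface description file might declare such a binding as follows; the field names and the "$model" reference below are hypothetical examples, not the actual schema used by the embodiment:

```json
{
  "Button": {
    "id": "searchButton",
    "text": "Search",
    "onClick": "$model.doSearch"
  }
}
```

In a declaration of this shape, the execution action (`doSearch`) corresponding to the control's click event is named in the description file rather than registered in application code.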
  • Differentiated description: including arranging different code segments for different types of electronic devices, so that the UI of the application widget has different display effects on different types of electronic devices; obtaining the values of configuration parameters according to the software and hardware conditions of the electronic device and applying them to controls; defining parameters that apply within the App; and so on.
  • For example, the developer can declare, in the component interface description file of the music App, the picture control 911, text control 912, search box 913, picture button 914, drag bar 915 and picture button 916 shown in Figures 24A-24D, and program their attributes; these controls are included in the UI 910 of the music widget. These controls can be controls native to the system, or controls defined in the operating system or in the music App.
  • Developers can also apply control properties defined in the operating system to these controls in the component interface description file. For example, a software dependency property defined in the operating system is applied to the picture button 916 .
  • for example, it is declared that the display property of the picture button 916 depends on whether the music currently played by the App includes lyrics.
  • Developers can also bind controls to background data in the component interface description file.
  • the drag bar 915 is bound to the background data; the mobile phone receives the user's drag operation on the drag bar 915, and updates the current music playing progress in the background data according to the drag operation; if the current music playing progress in the background data changes, the drag bar 915 is updated.
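The two-way binding between the drag bar 915 and the background data can be sketched as follows; this is a minimal, hypothetical model of an MVVM-style binding, not the patent's actual MVVM framework 11c:

```python
# Minimal two-way binding sketch (hypothetical names): a background-data
# field notifies bound controls when it changes, and a control writes
# user input back into the background data.

class ObservableField:
    def __init__(self, value):
        self._value = value
        self._observers = []

    def bind(self, observer):
        self._observers.append(observer)
        observer(self._value)  # push the current value on binding

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        if new_value != self._value:
            self._value = new_value
            for observer in self._observers:
                observer(new_value)

class DragBar:
    def __init__(self, progress_field):
        self.position = 0
        self._field = progress_field
        progress_field.bind(self._on_data_changed)  # data -> view

    def _on_data_changed(self, value):
        self.position = value

    def on_user_drag(self, new_position):
        self._field.value = new_position            # view -> data

progress = ObservableField(0)   # current music playing progress
drag_bar = DragBar(progress)
drag_bar.on_user_drag(42)       # user drags the bar: background data refreshed
progress.value = 80             # playback advances: the drag bar is updated
```

After the user drag, `progress.value` becomes 42; after the background progress moves to 80, `drag_bar.position` follows it, mirroring the two directions of the binding described above.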
  • Developers can also declare the corresponding execution actions of these controls in response to events in the component interface description file. For example, it is stated that the search box 913 performs a search action in response to a click event.
  • an installation package of the App is generated on the developer device, which contains the interface description file and the component interface description file.
  • the installation package of the App is uploaded to the server, and the App is published in the application market provided by the server.
  • the user can use the user-side electronic device (the above-mentioned electronic device 100 ) to download the installation package of the App in the application market.
  • the user-side electronic device runs the installation package of the App, it obtains the interface description file and the component interface description file in the installation package.
  • the icon 103 of the music app is displayed on the desktop.
  • the mobile phone 100 can receive the user's click operation on the icon 103, run the music app, and display the UI of the music app on the display screen according to the interface description file.
  • the user-side electronic device adds application widgets to the notification bar, desktop or negative screen according to the user's settings.
  • the user-side electronic device generates the UI of the application widget according to the component interface description file, and displays the UI of the application widget on the notification bar, the desktop or the negative screen.
  • the mobile phone 100 displays the UI 910 of the music widget on the desktop.
  • the application widget process in the electronic device 100 runs independently of the application process.
  • the application installed on the electronic device 100 runs by calling the application process, and the application widget runs by calling the application widget process.
  • if the application widget is set on the desktop, the desktop process is the application widget process; if the application widget is set on the negative screen, the display process of the negative screen is the application widget process; if the application widget is set in a specified application, the process of the specified application is the application widget process.
  • the electronic device 100 also includes a custom UI engine 11, a widget framework 12 and other units.
  • the application process obtains the interface description file of the App, calls the custom UI engine 11 to parse and execute the interface description file of the App, and generates the UI of the App.
  • the custom UI engine 11 may include a UI parsing engine 11a, a UI execution engine 11b, an MVVM (model-view-viewmodel) framework 11c, and the like.
  • the UI parsing engine 11a is used to parse the interface description file, and convert the content in the interface description file into a data format matching the UI execution engine 11b.
  • the UI parsing engine 11a may also perform syntax verification on the content in the interface description file; if the syntax verification of the interface description file succeeds, the interface description file is parsed; if the syntax verification fails, the interface description file is not parsed.
  • the UI execution engine 11b is used to construct UI controls (instantiating controls and setting properties) according to the data parsed by the UI parsing engine 11a, arrange the layout of the controls, and generate the interface declared in the interface description file; it can also realize the mapping between device events and user behaviors, and execute, in response to user behaviors, the actions corresponding to the user behaviors defined in the interface description file.
  • the MVVM framework 11c is used for bidirectional binding between elements in the UI and background data.
  • the application process obtains the component interface description file, calls the widget framework 12 to process the component interface description file, and forms widget UI data for displaying the application widget UI.
  • the widget framework 12 includes modules such as virtual control construction 12a, data binding 12b, widget service 12c, and event proxy 12d.
  • the virtual control construction 12a parses the component interface description file by calling the UI parsing engine 11a, instantiates the parsed component interface description file, and calls the UI execution engine 11b to construct the interface, controls, control groups, etc., to form the widget UI data.
  • These widget UI data exist within the application process and are used for binding with background data (eg, the control model (ViewModel)).
  • the data binding 12b is used to bind the properties, interaction events, etc. of the controls with the corresponding background data.
  • the widget service 12c is used to track the currently processed object and the data (model) bound to the object in the process of generating the application widget UI; it is also used to manage the data transmission between the application process and the application widget process ; It is also used to manage cross-process event agents, and to send and receive cross-process events.
  • the event proxy 12d is used to handle the postback and response of events in the application widget process.
  • a dedicated event transmission class (such as HiAction) is defined in the event agent 12d; the event transmission class supports the implementation of the Parcelable interface, so it can be transmitted across processes (for example, by calling the system's native cross-process Binder mechanism).
  • the event transmission class stores a series of events, each event includes layout identifier, control identifier, event type and other information.
  • the application widget process receives the operation and triggers an interaction event, that is, a new event is added to the HiAction.
  • the application widget process transmits the added event to the application process.
  • the application process responds to this event and performs corresponding actions.
  • the application process also calls the MVVM framework for processing; if there is a change in data or control properties, the widget UI data is updated, the cross-process interface data and properties are updated, and the display interface of the application widget process is further updated.
  • the application process also sends the component interface description file to the application widget process.
  • the application widget process calls the widget framework 12 to process the component interface description file, forms the widget UI data, and displays according to the widget UI data, that is, displays the application widget UI.
  • An application is installed on the electronic device, and the user can add an application widget corresponding to the application on the notification bar, the desktop or the negative screen.
  • the user does not open the application, but separately adds the application widget corresponding to the application.
  • the mobile phone 100 receives the user's two-finger pinch operation on the desktop, and in response to the two-finger pinch operation on the desktop, the mobile phone 100 displays a shortcut setting interface 1010; the shortcut setting interface 1010 includes a "widget" option 1011 for adding a desktop widget on the desktop.
  • the mobile phone 100 may display the desktop widget adding interface 1020 in response to the user's click operation on the "widget" option 1011 .
  • the desktop widget adding interface 1020 includes a "music" option 1021 for adding a music widget on the desktop.
  • the mobile phone 100 receives the user's click operation on the "music” option 1021, and in response to the user's click operation on the "music” option 1021, the UI 910 of the "music widget” is displayed on the desktop of the mobile phone 100.
  • the user adds an application widget corresponding to the application in the application.
  • the user opens the “Music Settings” interface 1030 of the “Music” application on the mobile phone 100 , and the “Music Settings” interface 1030 includes the “Desktop Widget” option 1031 .
  • the "Desktop widget” option 1031 is used to add a music widget on the mobile phone desktop.
  • the mobile phone 100 receives the user's click operation on the "desktop widget” option 1031 , and in response to the user's click operation on the "desktop widget” option 1031 , the mobile phone 100 displays the "desktop widget" interface 1040 .
  • the "Desktop Widgets" interface 1040 includes a "Style 1" option 1041 and a "Style 2" option 1042; it also includes an "Add” button 1043 and a “Cancel” button 1044.
  • the user can click the "Add” button 1043 to add the music widget to the desktop with the interface corresponding to the "Style 1" option 1041 or the "Style 2" option 1042; and can also click the "Cancel” button 1044 to exit adding the music widget.
  • the mobile phone 100 receives the user's click operation on the "Add” button 1043, and according to the user's selection, adds the music widget to the desktop with an interface corresponding to the "Style 2" option 1042.
  • the UI 910 of the "music widget” is displayed on the desktop of the mobile phone 100 .
  • the user performs an operation of adding an application widget.
  • the application widget process of the electronic device 100 receives the user's operation of adding an application widget (for example, the user clicks the "Music" option 1021 in FIG. 27), and the application widget process notifies the application process of the operation; if the application process is in an inactive state, the electronic device 100 pulls up the application process, so that the application process runs in the background. Or, the application process of the electronic device 100 receives the user's operation of adding an application widget (for example, the user clicks the "Add" button 1043 in FIG. 28), and the application process notifies the application widget process of the user's operation of adding an application widget.
  • the application process obtains the component interface description file from the application installation package.
  • the application process calls the custom UI engine 11 to parse and execute the component interface description file, and then calls the virtual control construction 12a to construct the controls of the widget UI data, generating controls, control groups, etc. according to the layout arrangement in the component interface description file to form the widget UI data (including information such as controls and their layout).
  • the component interface description file may be a file independent of the interface description file, or may be a piece of code in the interface description file.
  • the electronic device 100 may parse and execute only some code segments therein. For example, if the user selects the music widget interface corresponding to the "Style 1" option 1041 in Fig. 28, the mobile phone 100, after receiving the user's click operation on the "Add" button 1043, parses and executes the code segment corresponding to "Style 1" in the component interface description file. If the user selects the music widget interface corresponding to the "Style 2" option 1042 in Fig. 28, the mobile phone 100, after receiving the user's click operation on the "Add" button 1043, parses and executes the code segment corresponding to "Style 2" in the component interface description file.
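Parsing and executing only the style-specific code segment can be sketched as follows; this is a hypothetical illustration, and the segment layout shown here is an assumption, since the embodiment does not publish the format of the description file:

```python
# Hypothetical sketch: when the component interface description file contains
# one code segment per widget style, only the segment matching the style the
# user selected is retrieved for parsing and execution.

component_description = {
    "style1": {"layout": "compact",  "controls": ["title", "play_button"]},
    "style2": {"layout": "expanded", "controls": ["title", "play_button", "drag_bar"]},
}

def segment_for_style(description, selected_style):
    """Return only the code segment corresponding to the selected style."""
    return description[selected_style]
```

For instance, selecting "Style 2" yields only the `"style2"` segment, so the remaining segments in the file are never parsed or executed.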
  • the data binding 12b calls the MVVM framework 11c to perform data binding between the widget UI data and the background data (eg, the control model).
  • the background data eg, the control model
  • the application process sends the component interface description file to the application widget process.
  • the application widget process calls the custom UI engine 11 to parse and execute the component interface description file, then calls the virtual control construction 12a to construct the controls of the widget UI data, and generates controls, control groups, etc. according to the layout arrangement in the component interface description file, forming the widget UI data (including information such as controls and their layout); it then displays according to the widget UI data, that is, displays the application widget UI. Since the application process, when generating the widget UI data, and the application widget process, when generating the application widget UI, use the same code segment, the controls on the application widget UI are in one-to-one correspondence with the controls in the widget UI data.
  • the application process can also send the widget UI data to the application widget process after generating the widget UI data, and the application widget process displays according to the widget UI data, that is, displays the application widget UI.
  • the controls on the UI of the application widget are also in one-to-one correspondence with the controls in the UI data of the widget.
  • the user can operate the application on the application widget UI. For example, the user can drag the drag bar 915 on the music widget UI 910 in FIG. 24A to adjust the playback progress of the music currently played by the music application.
  • when the user operates on the UI of the application widget, the application widget process receives the user operation and transmits the user operation to the event agent 12d.
  • a dedicated event transport class is defined in the event broker 12d, which is used for cross-process transport.
  • the event transmission class stores multiple events, where each event includes information such as layout identification, control identification, and event type.
  • after receiving the user operation, the event agent 12d generates an event corresponding to the operation in the event transmission class, and sends the event to the application process (if the application process is not started, the application process is pulled up so that the application process runs in the background).
  • after receiving the event, the application process obtains the corresponding control according to the layout identifier and the control identifier, and executes the corresponding business logic according to the event acting on the control. Since there is a one-to-one correspondence between the controls on the UI of the application widget and the controls in the widget UI data, the application process also refreshes the background data according to the received events.
  • the background data change triggers the update of the widget UI data, and the application process can also send the updated widget UI data to the application widget process, and the application widget process displays the updated application widget UI according to the refreshed widget UI data.
  • the application process is started, and the application process acquires the interface description file.
  • the application process calls the custom UI engine 11 to parse and execute the interface description file, generate the UI of the application, and display the UI of the application.
  • the MVVM framework 11c data-binds the UI of the application with the background data (eg, the control model).
  • the application process receives the user's operation of adding an application widget, and the application process obtains the component interface description file from the application installation package.
  • the application process calls the custom UI engine 11 to parse and execute the component interface description file, and then calls the virtual control construction 12a to construct the controls of the widget UI data, generating controls, control groups, etc. according to the layout arrangement in the component interface description file to form the widget UI data (including information such as controls and their layout).
  • the data binding 12b calls the MVVM framework 11c to perform data binding between the widget UI data and the background data (eg, the control model).
  • the application widget process displays the UI of the application widget according to the widget UI data.
  • the UI of the application and the UI of the corresponding application widget are displayed on the display screen of the electronic device 100 .
  • FIG. 30 shows a flow example of the method for implementing a user interface interface provided by an embodiment of the present application.
  • the application widget process of the electronic device 100 receives the user's operation of adding an application widget (for example, the user clicks the "Music" option 1021 in FIG. 27), and the application widget process notifies the application process of the operation; if the application process is in an inactive state, the electronic device 100 pulls up the application process, so that the application process runs in the background. Or, the application process of the electronic device 100 receives the user's operation of adding an application widget (for example, the user clicks the "Add" button 1043 in FIG. 28), and the application process notifies the application widget process of the operation of adding an application widget.
  • Widget framework, MVVM framework, background data and other modules are initialized.
  • the widget framework obtains the component interface description file from the application installation package and sends it to the virtual control building module.
  • the virtual control building module builds the control according to the component interface description file to form the widget UI data.
  • the data binding module calls the MVVM framework to bind the widget UI data and background data.
• the application process calls the widget service to bind the service, and the widget service binds the event proxy to the widget UI data; after that, the widget UI data and the event proxy information of the widget UI data are sent across processes. In this way, after receiving the widget UI data and the event proxy information of the widget UI data, the application widget process can display the application widget UI according to the widget UI data.
• the application widget process receives the user's operation on the application widget UI, and the event proxy adds the event corresponding to the operation and sends the event to the application process; the application process executes the business logic in response to the event, and calls the MVVM framework to update the background data.
• when the business data of the application changes, the resulting change in the background data causes the MVVM framework to update the widget UI data.
  • the application process sends the updated widget UI data across processes, and after receiving the updated widget UI data, the application widget process can display the updated application widget UI according to the updated widget UI data.
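The update loop described above (the event proxy forwards a user operation to the application process, the business logic and MVVM binding refresh the background data, and updated widget UI data is sent back across the process boundary) can be sketched roughly as follows. This is a minimal Python illustration only; the class names and the in-memory queue standing in for the cross-process channel are assumptions, not the patent's actual interfaces.

```python
# Hypothetical sketch of the FIG. 30 update loop. The Queue stands in for the
# real cross-process transport; all names here are illustrative assumptions.
from queue import Queue

class AppProcess:
    def __init__(self, channel):
        self.background_data = {"title": "Song A"}
        self.channel = channel  # stands in for the cross-process channel

    def handle_event(self, event):
        # Business logic runs in the application process; the MVVM-style
        # binding then regenerates widget UI data from the background data.
        if event == "next_track":
            self.background_data["title"] = "Song B"
        widget_ui_data = {"text_control": self.background_data["title"]}
        self.channel.put(widget_ui_data)  # send updated widget UI data

class WidgetProcess:
    def __init__(self, channel, app):
        self.channel = channel
        self.app = app
        self.displayed = None

    def on_user_operation(self, op):
        # The event proxy forwards the event to the application process.
        self.app.handle_event(op)
        # Receive the updated widget UI data and "display" it.
        self.displayed = self.channel.get()

channel = Queue()
app = AppProcess(channel)
widget = WidgetProcess(channel, app)
widget.on_user_operation("next_track")
print(widget.displayed)  # {'text_control': 'Song B'}
```

The sketch shows only the direction of data flow; a real implementation would use the system's IPC mechanism rather than an in-process queue.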
• the application process generates widget UI data according to the component interface description file, and sends the component interface description file or the widget UI data to the application widget process; the application widget process generates the application widget UI according to the component interface description file or the widget UI data.
• developers can declare in the component interface description file the native layout methods and control types of the system, the custom control types and UI programming capabilities in the operating system, and the layout methods and control types defined by the developer in the App.
• the operating system supports the application widget process calling the UI engine to parse and execute the component interface description file to generate the application widget UI. In this way, various layout methods and control types can be displayed on the UI of the application widget, which facilitates the user's use of the application widget and improves the user experience.
• after the electronic device is powered off and then powered on again, the UI of the application widget is displayed. That is, after the application widget has been added to the electronic device, the application widget UI is reloaded. As shown in FIG. 31, the mobile phone 100 is powered on, and the UI 910 of the music widget is displayed on the desktop of the mobile phone 100.
  • FIG. 32 shows a flow example of a method for reloading an application widget UI by an electronic device.
• after the electronic device is powered on, the application widget process starts.
  • the application widget process obtains the component interface description file from the application's installation package.
  • the application widget process invokes the custom UI engine to parse and execute the component interface description file, construct the widget UI data, and display the application widget UI according to the widget UI data.
• the application widget process receives the user's operation on the application widget UI; the event proxy adds the event corresponding to the operation, pulls up the application process so that the application process runs in the background of the system, and sends the event to the application process; the application process executes the corresponding business logic in response to the event and calls the MVVM framework to update the background data.
• for details, refer to the relevant description of FIG. 30, which will not be repeated here.
• the application widget process generates, draws, and loads the application widget UI.
• the application process generates the widget UI data, binds the widget UI data with the background data, and establishes the correspondence between the widget UI data and the application widget UI; the other procedures need not be executed again.
• the application process of the electronic device generates the widget UI data according to the component interface description file, and binds the widget UI data with the background data; the application widget process also obtains the widget UI data according to the component interface description file, and displays the widget UI data as the UI of the application widget. In this way, a correspondence is established between the UI of the application widget and the background data, and various layout methods and control types can be displayed on the UI of the application widget, which is convenient for the user to use the application widget and improves the user experience.
  • the embodiments of the present application also provide a method for implementing a user interface interface, which is used to present a UI when an App on an electronic device is projected to a playback device for playback.
• IoT is short for Internet of Things.
  • Consumers can use mobile phones, tablet computers and other devices as control devices of IoT devices to control IoT devices, so that control devices and IoT devices can work together.
  • the IoT device is called a playback device.
• because the screen shapes and sizes of IoT devices vary greatly, how to perform screen projection on IoT devices whose screens have different shapes and sizes, and obtain a screen projection interface that matches the screen shape and size of the IoT device, is a problem that needs to be solved.
  • Embodiments of the present application provide a method and apparatus for implementing a user interface interface, which supports projecting various UIs on a control device to an IoT device for playback, thereby improving user experience.
  • the control device is the user-side electronic device (electronic device 100 ) in the above embodiments.
  • An embodiment of the present application provides a method for implementing a user interface interface. Please refer to FIG. 33 .
  • An application development tool (for example, Android Studio, DevEco Studio, etc.) is installed on the electronic device 200 .
  • the electronic device 200 may also be referred to as a developer device in this application.
• developers use an interface description language to develop the UI of the App and the UI of the player in the application development tool. It can be understood that, in some embodiments, the developer can directly develop the UI of the App and the UI of the player on the control device 100 and run the App on the control device 100; that is, the electronic device 200 and the control device 100 can be the same electronic device. This embodiment of the present application does not limit this.
  • the developer develops the UI of the App (that is, the UI displayed when the electronic device installs and runs the App) in the application development tool, and forms an interface description file.
  • the interface description file may also be referred to as a description file.
  • the developer also develops the UI of the App for display on the player side (ie, the player side UI) in the application development tool to form the player side interface description file.
  • the developer packages the interface description file and the player interface description file into the installation package of the App, and publishes the App in the application market provided by the server 300 .
  • the installation package of each App can be provided in the application market for users to download.
• the installation package can be an Android application package (APK) file.
  • a user can use the mobile phone to download an installation package of an App in the application market.
  • the video App can be installed in the mobile phone by running the installation package. In this way, the mobile phone also obtains the interface description file and the player interface description file in the installation package.
  • the interface description file may also be used as the player interface description file, that is, the interface description file and the player interface description file are the same file.
  • the mobile phone can present the UI of the corresponding App on the display screen according to the interface description file.
  • a "video" icon 101 is generated on the desktop.
  • the user can click the "video” icon 101 to open the video app.
  • the mobile phone runs the video App.
  • the OS platform is installed on the mobile phone.
  • the custom UI engine of the OS platform reads the interface description file, parses and executes the interface description language, and renders the UI of the video App according to the interface description in the interface description file.
  • the display device (such as a display screen) of the mobile phone presents the UI 105 of the video App.
  • the interface description file may also include the definition of the interface behavior.
  • the mobile phone can execute corresponding interface actions according to the interface behaviors defined in the interface description file to realize the interface behaviors.
• the OS platform also has a corresponding programming language for implementing interface behaviors, implementing dynamic changes of the UI 105, and responding to user operations on the UI 105; for example, interface behaviors can be implemented using the JAVA or Swift programming language.
  • the mobile phone can also project various interfaces of the video App to the playback device 1000 for display.
  • the main interface or playback interface of the video App is projected to the playback device 1000 .
  • the playback device 1000 renders the corresponding player UI according to the interface description matching its device type in the player interface description file.
  • the UI 105 of the video App includes a "screen projection" button 106; the "screen projection” button 106 is used to project the interface of the App running on the mobile phone to the playback device for display.
  • the mobile phone receives the user's click operation on the "screen projection” button 106 , and in response to the user's click operation on the "screen projection” button 106 , the mobile phone displays a device selection interface 107 .
  • the device selection interface 107 includes prompt information 108 for prompting the user to select a playback device for screen projection.
  • the device selection interface 107 also includes a "TV in living room” option 109 and a "My Tablet” option 10a. The user can click the "TV in the living room” option 109 to project the UI of the video App to the smart TV for display.
  • the mobile phone receives the user's click operation on the "TV in the living room” option 109, and in response to the user's click operation on the "TV in the living room” option 109, the mobile phone projects the UI of the video App to the smart TV.
  • the smart TV displays the player UI 1001 corresponding to the UI 105.
  • the user can also click the "My Tablet” option 10a to project the UI of the video app to the tablet for display.
  • the mobile phone receives the user's click operation on the "My Tablet Computer” option 10a, and in response to the user's click operation on the "My Tablet Computer” option 10a, the mobile phone projects the UI of the video App to the tablet computer.
  • the tablet computer displays the player UI 1002 corresponding to the UI 105.
  • the device types of smart TVs and tablet computers are different, and the screen sizes and shapes are also different.
• the interface layout of the player UI 1001 on the smart TV differs from that of the player UI 1002 on the tablet computer; that is, the player UI is displayed differentially on electronic devices of different device types.
• the above-mentioned playback device 1000 may include a portable computer (such as a mobile phone), a smart home device (such as a smart TV, a smart screen, a large screen, or a smart speaker), a handheld computer, a personal digital assistant (PDA), a wearable device (for example, a smart watch or a smart bracelet), a tablet computer, a notebook computer, a netbook, an augmented reality (AR)/virtual reality (VR) device, an in-vehicle computer, and the like. This embodiment of the present application does not impose any restrictions on this.
• the playback device 1000 may include the structure shown in FIG. 2, which will not be repeated here. It can be understood that the structure shown in FIG. 2 does not constitute a specific limitation on the playback device 1000 in this embodiment of the present application. The playback device 1000 may include more or fewer components than those shown in FIG. 2, or combine some components, or split some components, or use a different arrangement of components.
  • the components shown in Figure 2 may be implemented in hardware, software or a combination of software and hardware.
  • the control device 100 includes units such as a custom UI engine 11 , a screen projection framework 13 , and a transmission channel adaptation 14 .
  • the custom UI engine 11 provides the IF1 interface
  • the screen projection framework 13 provides the IF2, IF3, IF4 and IF5 interfaces.
• the descriptions of interfaces IF1 to IF5 are shown in Table 2:
  • the custom UI engine 11 parses and executes the interface description file of the App to generate the UI of the App.
  • the custom UI engine 11 may include a UI parsing engine 11a, a UI execution engine 11b, an MVVM (model-view-viewmodel) framework 11c, and the like.
  • the UI parsing engine 11a is used to parse the interface description file, and convert the content in the interface description file into a data format matching the UI execution engine 11b.
• the UI parsing engine 11a may also perform syntax verification on the content of the interface description file; if the syntax verification of the interface description file succeeds, the interface description file is parsed; if the syntax verification of the interface description file fails, the interface description file is not parsed.
• the UI execution engine 11b is used to construct UI controls (control instantiation and property setting) according to the data parsed by the UI parsing engine 11a, arrange the layout of the controls, and generate the interface declared in the interface description file; it can also implement the mapping between device events and user behaviors, and, in response to the user behaviors, execute the actions corresponding to the user behaviors defined in the interface description file.
• the MVVM framework 11c is used for bidirectional binding between elements in the UI and the background data. The binding relationship between elements in the UI (such as controls and control groups) and the background data is declared and specified in the interface description file.
• the MVVM framework 11c can refresh the background data when the UI changes, and automatically refresh the corresponding UI when the background data changes. This helps developers focus on UI design and arrangement, simplifies the UI development process, and greatly reduces the development time that developers invest in implementing front-end and back-end data interaction.
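The automatic-refresh behavior attributed to the MVVM framework 11c can be illustrated with a minimal observable-value sketch. All names here are hypothetical; a real MVVM framework would generate such bindings from the declarations in the interface description file rather than from explicit calls.

```python
# Minimal two-way-binding sketch in the spirit of the MVVM framework 11c.
# Class and attribute names are illustrative assumptions.
class Observable:
    """Background data item whose changes are pushed to bound listeners."""
    def __init__(self, value):
        self._value = value
        self._listeners = []

    def bind(self, listener):
        self._listeners.append(listener)
        listener(self._value)  # push the current value when binding is set up

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        self._value = new
        for listener in self._listeners:
            listener(new)  # background-data change refreshes the bound UI

class TextControl:
    """Stands in for a UI element declared in the interface description file."""
    def __init__(self):
        self.text = ""

model = Observable("old title")
control = TextControl()
# Declarative binding: control.text mirrors the background data.
model.bind(lambda v: setattr(control, "text", v))

model.value = "new title"  # the backend changes...
print(control.text)        # ...and the bound control refreshes automatically
```

The reverse direction (a UI change refreshing the background data) would be wired symmetrically, with the control notifying the observable on user input.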
  • the transmission channel adaptation 14 is used to adapt the data transmission channel between the control device 100 and the playback device 1000 .
  • the data of the control device 100 is converted into a format suitable for the data transmission channel, so that the control device 100 can send data to the playback device 1000 through the data transmission channel.
  • the screen projection framework 13 is used to process the interface description file of the player to form the UI data of the player for displaying the UI of the player.
  • the screencasting framework 13 includes modules such as virtual control construction 13a, data binding 13b, screencasting service 13c, data sending and receiving 13d, resource transmission 13e, event proxy 13f, and life cycle 13g.
• the virtual control construction 13a constructs controls, control groups, etc. according to the player interface description file by calling the UI parsing engine 11a and the UI execution engine 11b, to form the player UI data.
  • These playback-side UI data exist in the App process and are used to bind with background data.
• the data binding 13b is used to bind the properties, interaction events, etc. of the player UI data with the background data.
  • the screen projection service 13c is used to track and process the currently processed object and the data (model) bound to the object during the projection process; it is also used to manage the data transmission channel between the control device 100 and the playback device 1000 .
  • the data transceiver 13d is used for data transmission and reception between the control device 100 and the playback device 1000 .
• an interface of the transceiver agent can be defined to implement a default transceiver built into the control device 100; a transceiver suitable for the App can also be customized in the App according to the interface specification.
• the resource transmission 13e is used to transmit specific types of data resources (e.g., data, pictures, and videos whose data amount is larger than a set value), and to manage these data resources, including sending, receiving, buffering, identification, and progress control.
  • the event proxy 13f is a channel for delivering events to avoid events being blocked by data transmission.
• the life cycle 13g is used to manage the life cycle of the association between the running entities of the control device 100 and the playback device 1000 during the screen projection process. Exemplarily, the life cycles of the control device 100 and the playback device 1000 are shown in Table 3:
  • the control device 100 is in a stand-alone running state, triggers screen projection, and waits for the user's authorization to project the screen to the playback device. If the user agrees to the authorization, the playback device sends an authorization instruction to the control device, and the control device 100 enters the server running state. If the user rejects the authorization, the playback device sends an authorization rejection instruction to the control device, and the control device 100 stops screen projection.
• when the control device 100 is in the server running state, if the App switches to running in the background, it stops pushing data to the playback device; if the App switches to running in the foreground, it starts pushing data to the playback device.
• when the control device 100 is in the server running state, if the control device closes the App, or the playback device is turned off, the screen projection is stopped.
• when the control device 100 is in the server running state, if the App on the playback device switches to running in the background, the control device 100 enters the server pause state.
• when the control device 100 is in the server pause state, if the App on the playback device switches to foreground playback, the control device 100 enters the server running state.
• when the control device 100 is in the server pause state, if the playback device is turned off, the screen projection is stopped.
  • the playback device 1000 receives a screen projection request initiated by the control device, and waits for the user to confirm the authorization. If the user confirms the authorization, the playback device 1000 enters the screen projection operation state; if the user refuses the authorization, the playback device 1000 stops running.
• when the playback device 1000 is in the screen-casting running state, if the App switches to running in the background, the playback device 1000 enters the background resident state.
• when the playback device 1000 is in the background resident state, if the App switches to the foreground for playback, the playback device 1000 enters the screen-casting running state.
• when the playback device 1000 is in the screen-casting running state or the background resident state, if the playback device is turned off or the control device closes the App, the playback device 1000 stops running.
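The life-cycle transitions for the control device summarized around Table 3 can be modeled as a small state machine. The state and event names below paraphrase the text and are not identifiers from the patent; the transition table is a sketch of one plausible reading.

```python
# Hedged sketch of the control-device life cycle described around Table 3.
# States: standalone -> server_running <-> server_paused -> stopped.
TRANSITIONS = {
    ("standalone", "user_authorized"): "server_running",
    ("standalone", "user_rejected"): "stopped",
    ("server_running", "playback_app_to_background"): "server_paused",
    ("server_paused", "playback_app_to_foreground"): "server_running",
    ("server_running", "playback_device_off"): "stopped",
    ("server_paused", "playback_device_off"): "stopped",
}

def step(state, event):
    # Events with no entry leave the state unchanged (for example, the App on
    # the control device moving to the background only pauses data pushing).
    return TRANSITIONS.get((state, event), state)

state = "standalone"
state = step(state, "user_authorized")             # -> server_running
state = step(state, "playback_app_to_background")  # -> server_paused
state = step(state, "playback_app_to_foreground")  # -> server_running
state = step(state, "playback_device_off")         # -> stopped
print(state)  # stopped
```

The playback-device life cycle (screen-casting running, background resident, stopped) could be modeled the same way with its own transition table.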
• in the App development stage, the developer generates an App installation package on the developer device, which includes an interface description file and a player interface description file.
• the developer uses the interface description language to develop the interface description file and the player interface description file on the developer device according to the syntax and semantic specifications of the interface description language, adding code to the interface description file and the player interface description file for UI development.
  • Developers can perform UI layout arrangement, data & interface binding, interactive behavior arrangement, and differentiated description in the interface description file and the player interface description file respectively.
• all UIs in the App are composed of controls.
• laying out the UI means arranging the controls in the UI and setting their properties.
• the controls in the UI can include all native controls and extended controls in the operating system; controls customized by developers in the App or integrated through static packages are also supported.
• the controls may specifically include text controls, such as TextView controls and EditText controls; button controls, such as Button controls and ImageButton controls; and image controls, such as Image controls. This embodiment of the present application does not impose any restrictions on this.
• control properties include native properties in the operating system as well as extended visual properties, layout properties, interaction properties, animation properties, and software and hardware dependency properties. Visual properties refer to visual effects of a control, such as color and grayscale.
• interaction properties provide the ability of a control to respond to user behavior; for example, performing a search based on the user's "confirm" behavior.
  • the animation property refers to displaying animation effects on the control; for example, displaying the click rebound animation effect on the control.
• the software and hardware dependency properties refer to the software and hardware parameters of the device on which the control depends.
• data & interface binding: the binding relationship between elements in the UI (such as controls and control groups) and the background data is declared and specified in the interface description file or the player interface description file.
• interactive behavior arrangement: in the interface description file or the player interface description file, the execution action corresponding to an event to which a control responds is declared.
• the range of events supported by a control is determined by the event listeners the control supports. For example, if a button control (Button) supports setOnClickListener for listening for click events, the onClick (click) event can be bound to the control in the interface description file.
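The rule that a control's bindable events are limited by the listeners it supports can be sketched as follows. The Button class, its listener table, and the declared-events dictionary are illustrative assumptions standing in for the interface description file and the UI execution engine.

```python
# Illustrative sketch: an onClick action declared in the description file is
# bound only if the control supports a matching listener. Names are assumptions.
class Button:
    # Maps bindable events to the listener that supports them (cf. the text's
    # setOnClickListener example for onClick).
    SUPPORTED_LISTENERS = {"onClick": "setOnClickListener"}

    def __init__(self):
        self._handlers = {}

    def set_listener(self, event, handler):
        if event not in self.SUPPORTED_LISTENERS:
            raise ValueError(f"control does not support event {event!r}")
        self._handlers[event] = handler

    def fire(self, event):
        if event in self._handlers:
            return self._handlers[event]()

# Stands in for the event-to-action declaration in the description file.
declared = {"onClick": lambda: "search executed"}
button = Button()
for event, action in declared.items():
    button.set_listener(event, action)  # binding succeeds for onClick

print(button.fire("onClick"))  # search executed
```

Attempting to bind an event outside `SUPPORTED_LISTENERS` (say, a hypothetical onLongClick) would raise an error here, mirroring the constraint described in the text.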
  • developers can declare variables of electronic device configuration parameters in the interface description file or the player interface description file.
• when the file is run, the declared configuration parameters of the electronic device are accessed, and the electronic device obtains the values of the configuration parameters according to its software and hardware conditions. In this way, when different types of electronic devices run the interface description file or the player interface description file, the generated UIs differ because the software and hardware conditions, and therefore the configuration parameter values, differ.
• the interface description file or the player interface description file may include style, layout-data-common, and other parts. Developers define myTextStyle in style, and can then reference this custom parameter in the form $style.myTextStyle in layout-data-common. An example is as follows:
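Since the concrete example is not reproduced in this excerpt, the following is a hypothetical sketch of how a `$style.myTextStyle` reference might be resolved against the style part. The JSON-like file structure, the property names, and the resolver are guesses at the mechanism the text describes, not the patent's actual format.

```python
# Hypothetical $style reference resolution for a JSON-like description file.
# The structure and property names below are illustrative assumptions.
description = {
    "style": {
        "myTextStyle": {"fontSize": 20, "color": "#FF0000"},
    },
    "layout-data-common": {
        "TextView": {"style": "$style.myTextStyle", "text": "Hello"},
    },
}

def resolve(node, styles):
    """Recursively replace $style.<name> references with the style definition."""
    if isinstance(node, dict):
        return {k: resolve(v, styles) for k, v in node.items()}
    if isinstance(node, str) and node.startswith("$style."):
        return styles[node[len("$style."):]]
    return node

resolved = resolve(description["layout-data-common"], description["style"])
print(resolved["TextView"]["style"]["fontSize"])  # 20
```

After resolution, the TextView carries the concrete fontSize and color defined once in style, which is the reuse the text attributes to custom style parameters.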
  • the developer can arrange the code for the specified device in the interface description file or the player interface description file.
  • the interface description file or the player interface description file may include the following structure:
  • layout-data-uimode is used to describe the player UI of the specified device.
  • the layout-data-uimode declares the difference between the specified device player UI and the universal player UI.
  • the specified device parses and executes the content in layout-data-common and layout-data-uimode to generate the playback UI of the specified device.
  • layout-data-uimode declares all conditions applicable to the player UI of the specified device.
  • the device layouts its player UI according to the content in layout-data-uimode.
  • the designated device may be one of a mobile phone, a watch, a car device, a smart home device (for example, a smart TV, a smart screen, a smart speaker, etc.), a large screen, a tablet computer, a laptop computer, or a desktop computer.
• the specific forms of layout-data-uimode can include layout-data-phone (for mobile phones), layout-data-watch (for watches), layout-data-television (for smart TVs), layout-data-pad (for tablet computers), layout-data-car (for in-vehicle devices), etc.
• in this way, different types of playback devices can parse and execute their corresponding code segments and build the playback-side UI, so that a playback-side UI matching the device type can be displayed on each type of playback device.
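The division between layout-data-common and the device-specific layout-data-* segments can be illustrated with a simple merge, following the embodiment in which the device segment declares only the differences from the universal player UI. The keys, the example properties, and the dictionary-update merge rule are assumptions for illustration.

```python
# Sketch of a playback device combining layout-data-common with its
# device-specific layout-data-* segment. Keys and values are assumptions.
description_file = {
    "layout-data-common": {"columns": 2, "show_title": True},
    "layout-data-television": {"columns": 4},  # only the differences declared
    "layout-data-watch": {"columns": 1, "show_title": False},
}

def build_player_ui(description, device):
    layout = dict(description["layout-data-common"])  # start from the common UI
    # Apply the device-specific differences, if a segment for this device exists.
    layout.update(description.get(f"layout-data-{device}", {}))
    return layout

tv_ui = build_player_ui(description_file, "television")
watch_ui = build_player_ui(description_file, "watch")
print(tv_ui)     # {'columns': 4, 'show_title': True}
print(watch_ui)  # {'columns': 1, 'show_title': False}
```

In the other embodiment mentioned in the text, where layout-data-uimode declares all conditions for the device, the device segment would simply replace the common layout instead of being merged over it.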
  • the developer uploads the App installation package generated on the developer's device to the server, and the App is published in the application market provided by the server.
  • the user can use the user-side electronic device (the above-mentioned control device 100 ) to download the installation package of the App in the application market.
• when the control device runs the installation package of the App, it obtains the interface description file and the player interface description file in the installation package.
  • a UI matching the control device is displayed on the display screen according to the interface description file.
  • the control device determines a playback device that performs screen projection according to the user input, and sends a screen projection instruction to the playback device, where the screen projection instruction includes an identifier of the screen projection interface.
  • the playback device receives the screen projection instruction, obtains the corresponding playback side interface description file according to the identifier of the screen projection interface, and forms a playback side UI matching its device type according to the playback side interface description file.
• the control device 100 receives the user's screen projection operation (for example, the mobile phone receives the user's click operation on the "My Tablet PC" option 10a in FIG. 33), and obtains, from the App installation package, the player interface description file corresponding to the current interface.
• the control device 100 also determines the device type of the playback device 1000 for screen projection according to the user input (for example, if the mobile phone receives the user's click operation on the "TV in the living room" option 109 in FIG. 33, the playback device 1000 is determined to be a smart TV; if the mobile phone receives the user's click operation on the "My Tablet PC" option 10a in FIG. 33, the playback device 1000 is determined to be a tablet computer).
• the virtual control construction 13a in the OS of the control device 100, by calling the custom UI engine 11, parses and executes the code segment corresponding to the device type of the playback device 1000 in the player interface description file, constructs controls according to that code segment, and generates controls, control groups, etc. according to the layout arrangement in the code segment to form the player UI data.
• for example, if the playback device 1000 is a smart TV, the control device 100 parses and executes the code segment corresponding to the smart TV in the player interface description file to form the player UI data projected to the smart TV; if the playback device 1000 is a tablet computer, the control device 100 parses and executes the code segment corresponding to the tablet computer in the player interface description file to form the player UI data projected to the tablet computer.
  • the data binding 13b calls the MVVM framework 11c to perform data binding between the objects in the UI data of the player and the background data (for example, the control model). In this way, if the background data changes, the corresponding player UI data can be refreshed, and if the player UI data changes, the corresponding background data can be refreshed.
• the control device 100 sends the player interface description file and the resource file (including the data resources associated with the player interface description file) to the playback device 1000 through the data transceiver 13d.
• the control device 100 encodes the player interface description file; after data such as layout information, resource values, data, and response event definitions are encoded, they are transmitted to the playback device 1000 through a data transmission channel. Specific types of data resources (for example, data, pictures, and videos whose data amount is greater than a set value) are transmitted to the playback device 1000 through a specific data transmission channel.
  • a specific type of data resource can be transmitted to the playback device 1000 before sending the player interface description file.
  • the control device 100 also initializes the event agent 13f, and establishes an event transmission channel between the control device 100 and the playback device 1000 for transmitting event information.
• the playback device 1000 receives the player interface description file and the data resources through the data transceiver 13d; the virtual control construction 13a in the OS of the playback device 1000, by calling the custom UI engine 11, parses and executes the code segment corresponding to the device type of the playback device 1000 in the player interface description file, constructs controls according to that code segment, and generates controls, control groups, etc. according to the layout arrangement in the code segment to form the player UI data (including information such as controls and their layout); the playback device 1000 then performs display according to the player UI data, that is, displays the player UI.
• since the control device 100 and the playback device 1000 use the same code segment to generate the player UI data, the controls on the player UI generated by the playback device 1000 correspond one-to-one to the controls in the player UI data generated by the control device 100.
• the playback device 1000 displays, on its display screen, a player UI that matches the shape and size of its screen.
  • the smart TV parses and executes the layout-data-television code segment in the playback interface description file to generate a corresponding playback UI for the smart TV;
  • a tablet computer is used as a playback device.
  • the tablet computer parses and executes the layout-data-pad code segment in the player interface description file to generate the player UI corresponding to the tablet computer.
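The per-device-type selection described above (a `layout-data-television` segment for the smart TV, a `layout-data-pad` segment for the tablet) can be sketched as follows; the file format, segment contents, and watch entry are illustrative assumptions, not the actual description-file syntax.

```python
import json

# A toy player interface description file: one layout code segment per
# device type, keyed by "layout-data-<device type>".
DESCRIPTION_FILE = json.dumps({
    "layout-data-television": {"controls": ["video", "play", "progress"],
                               "orientation": "landscape"},
    "layout-data-pad":        {"controls": ["video", "play", "progress", "list"],
                               "orientation": "landscape"},
    "layout-data-watch":      {"controls": ["progress"],
                               "orientation": "round"},
})

def build_player_ui(description_file: str, device_type: str) -> dict:
    """Parse the description file and form player UI data from the code
    segment corresponding to this device's type."""
    segments = json.loads(description_file)
    segment = segments[f"layout-data-{device_type}"]
    return {"controls": segment["controls"],
            "orientation": segment["orientation"]}

tv_ui = build_player_ui(DESCRIPTION_FILE, "television")
pad_ui = build_player_ui(DESCRIPTION_FILE, "pad")
```

Each device reads the same file but only executes its own segment, which is how one description file yields differently laid-out player UIs.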
  • the control device 100 performs control construction according to the code segment in the player interface description file corresponding to the device type of the playback device 1000, and generates the player UI data according to the layout arrangement in the code segment.
  • the generated playback terminal UI data is sent to the playback device 1000 .
  • the playback device 1000 displays the player UI according to the player UI data.
  • the playback device generates a playback end UI corresponding to the playback device according to the code segment corresponding to the device type of the playback device in the playback end interface description file.
  • the playback UI displayed by different types of playback devices matches the shape and size of their screen.
  • developers can easily develop the player UI and define various types of controls (including all native controls and extended controls in the operating system, as well as controls customized by developers in the App or integrated through static packages), so that all types of Apps can support the screen projection function; the control types supported by the player UI are richer and more diverse, which is convenient for users.
  • the virtual control construction 13a in the OS of the control device 100 parses and executes the code segment in the player interface description file that corresponds to the device type of the playback device 1000, and constructs the player UI data according to that code segment.
  • the player UI data is not displayed on the display screen of the control device 100 (that is, the player UI data exists in the App process and is not sent to the display process). What the control device 100 displays is the UI generated according to the interface description file. After the interface of the control device 100 is projected to the playback device 1000, the user can perform other operations on the control device 100 while the playback device 1000 plays the projected content normally.
  • the mobile phone displays an interface 1210 of the “Video” App, and the interface 1210 includes a “Screencast” button 1211 .
  • the mobile phone receives the user's click operation on the "cast screen” button 1211, and determines the playback device according to the user's input.
  • the mobile phone projects the screen to the smart TV according to the user input.
  • the mobile phone generates the player UI data according to the player interface description file, and sends the player interface description file to the smart TV.
  • the smart TV generates the player UI data according to the player interface description file, and displays the player UI according to the player UI data.
  • the smart TV displays the UI 1220 of the playback terminal.
  • the mobile phone receives the user's click operation on the picture 1212, and in response to the user's click operation on the picture 1212, the interface 1230 of the "Video" App is displayed.
  • after the control device projects the screen to the playback device, it can continue to perform other functions and run independently of the playback device, which plays the screen-cast content; the two do not affect each other, achieving better cooperation between the devices.
  • the custom UI engine 11 in the OS of the control device 100 parses and executes the interface description file, generates the UI of the application, and displays the UI of the application.
  • the MVVM framework 11c data-binds the UI of the application with background data (eg, control models).
  • by calling the custom UI engine 11, the virtual control construction 13a in the OS of the control device 100 parses and executes the code segment in the player interface description file that corresponds to the device type of the playback device 1000; controls are constructed according to that code segment, and controls, control groups, etc. are generated according to the layout arrangement in the code segment to form the player UI data. After that, the data binding 13b calls the MVVM framework 11c to perform data binding between the objects in the player UI data and the background data (for example, the control model).
  • the control device 100 sends the player interface description file and the resource file (including the data resources associated in the player interface description file) to the playback device 1000 through the data transceiver 13d; or it sends the player UI data to the playback device 1000, so that the playback device 1000 can display the player UI according to the player UI data.
  • the virtual control construction 13a in the OS of the control device 100 parses and executes the code segment in the player interface description file that corresponds to the device type of the playback device 1000, and constructs the player UI data according to that code segment.
  • the App process sends the player UI data to the display process, and displays the player UI on the display screen of the control device 100 .
  • the control device 100 and the playback device 1000 display the player UI generated according to the same code segment.
  • the mobile phone displays an interface 1210 of the “Video” App, and the interface 1210 includes a “Screencast” button 1211 .
  • the mobile phone receives the user's click operation on the "cast screen” button 1211, and determines the playback device according to the user's input.
  • the mobile phone projects the screen to the smart TV according to the user input.
  • the mobile phone generates the player UI data according to the player interface description file, and displays the player UI according to the player UI data.
  • the mobile phone also sends the player interface description file to the smart TV.
  • the smart TV generates the player UI data according to the player interface description file, and displays the player UI according to the player UI data.
  • after the mobile phone projects the screen to the smart TV, the mobile phone also displays the player UI, so both the mobile phone and the smart TV display the player UI 1220.
  • control device and the playback device can play the player UI synchronously, so that mirror projection can be realized, and the control device and the playback device can work together.
  • the data binding 13b calls the MVVM framework 11c to update the player UI data. Since there is a one-to-one correspondence between the controls of the UI of the player and the controls in the UI data of the player, the update of the UI data of the player triggers the update of the UI of the player. In this way, when the control device 100 receives user operations or changes in service data, the player UI of the playback device 1000 can be updated synchronously.
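The update chain above (user operation or service-data change → background data → bound player UI data → one-to-one mirrored controls) can be illustrated with a minimal observer-style binding; class and field names here are invented for the sketch and do not come from the MVVM framework 11c itself.

```python
class BackgroundData:
    """Background data with simple one-way binding to controls."""
    def __init__(self):
        self._values = {}
        self._bound = {}  # key -> list of controls bound to that key

    def bind(self, key, control):
        self._bound.setdefault(key, []).append(control)

    def set(self, key, value):
        # A user operation or a service-data change updates the value...
        self._values[key] = value
        # ...and the binding propagates it to every bound control.
        for control in self._bound.get(key, []):
            control.on_data_changed(value)

class Control:
    def __init__(self, control_id):
        self.control_id = control_id
        self.text = ""

    def on_data_changed(self, value):
        self.text = value  # a real engine would also refresh the display

data = BackgroundData()
local = Control("title")   # control in the control device's player UI data
remote = Control("title")  # its one-to-one counterpart on the playback side
data.bind("title", local)
data.bind("title", remote)
data.set("title", "Episode 2")  # both bound controls update together
```

Because the playback-side control mirrors the control-device-side control one-to-one, updating the player UI data is enough to update the displayed player UI.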
  • the mobile phone displays the UI 1310 of the "Video” App.
  • the mobile phone projects the UI 1310 of the “Video” App to the smart TV according to the user's input.
  • the smart TV displays the player UI 1320 of the "video” App, and the player UI 1320 includes a "play” button 1321.
  • the user may perform an operation to start playing the video on the smart TV (for example, the user selects the "play” button 1321 through the remote control of the smart TV, and clicks the "play” button 1321).
  • the smart TV receives the user's click operation on the "play” button 1321, and in response to the user's click operation on the "play” button 1321, performs playback of the video, and displays the updated UI 1320.
  • the updated player UI 1320 includes a "pause" button 1322.
  • a dedicated event transport class is defined in the event proxy 13f, and the event transport class is used for cross-device transport.
  • the event transmission class stores multiple events, where each event includes information such as layout identification, control identification, and event type.
  • the playback device 1000 receives the user's operation on the player UI, generates an event corresponding to the operation in the event transmission class, and transmits it to the control device 100 through the event transmission channel. After receiving the event, the control device 100 acquires the corresponding control according to the layout identifier and the control identifier, and executes corresponding business logic according to the event acting on the control.
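A hedged sketch of the dedicated event transport class described above: each event carries a layout identifier, a control identifier, and an event type; the playback device serializes it for cross-device transmission, and the control device locates the control and runs the business logic. The field names, serialization format, and handler table are assumptions for illustration.

```python
import json

def make_event(layout_id: str, control_id: str, event_type: str) -> str:
    """Playback-device side: package an operation as a transportable event."""
    return json.dumps({"layout": layout_id,
                       "control": control_id,
                       "type": event_type})

# Control-device side: business logic keyed by (layout, control, event type).
HANDLERS = {
    ("player_layout", "play_btn", "click"): lambda: "start_playback",
}

def dispatch(raw_event: str) -> str:
    """Control-device side: acquire the control by layout identifier and
    control identifier, then execute the corresponding business logic."""
    e = json.loads(raw_event)
    return HANDLERS[(e["layout"], e["control"], e["type"])]()

result = dispatch(make_event("player_layout", "play_btn", "click"))
```

The event transmission channel between the two devices would carry the serialized string produced by `make_event`.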
  • the control device 100 Because there is a one-to-one correspondence between the controls of the UI of the player and the controls in the UI data of the player, the control device 100 also updates the background data, and the change of the background data triggers the update of the UI data of the player.
  • the control device 100 sends the updated player UI data to the playback device 1000, and the playback device 1000 displays the updated player UI according to the updated player UI data.
  • the control device executes the corresponding business logic, and updates the player UI on the playback device.
  • the UI on the control device can also be updated synchronously, which is convenient for the user to use.
  • the control device executes the relevant business logic and controls the playback device uniformly, which is convenient for management; this also avoids problems where the playback device has low performance and does not support complex business logic processing.
  • the playback device 1000 receives the user's second operation on the UI of the playback terminal.
  • the playback device 1000 acquires the updated player interface description file from the control device 100, and generates an updated player UI.
  • the mobile phone displays the UI 1330 of the "Video” App.
  • the mobile phone projects the UI 1330 of the "Video” App to the smart TV according to the user's input.
  • the smart TV displays the playback UI 1340 of the "Video” App.
  • the smart TV receives the user's second operation on the player UI 1340 (for example, the user moves the focus on the player UI 1340 from “movie” to "variety show” through the remote control).
  • the smart TV displays the updated player UI, that is, the player UI 1350 of the "Video” App.
  • the playback device 1000 receives the user's second operation on the player UI, generates an event corresponding to the second operation in the event transmission class, and transmits it to the control device 100 through the event transmission channel.
  • after receiving the event, the control device 100 acquires the corresponding control according to the layout identifier and the control identifier, and executes the corresponding business logic according to the event acting on the control.
  • the control device 100 determines to update the player UI to the player UI with the focus "variety show", and obtains the player interface description file 2 corresponding to the player UI with the focus "variety show".
  • by calling the custom UI engine 11, the virtual control construction 13a in the OS of the control device 100 parses and executes the code segment in the player interface description file 2 that corresponds to the device type of the playback device 1000; controls are constructed according to that code segment, and controls, control groups, etc. are generated according to the layout arrangement in the code segment to form the player UI data 2.
  • the data binding 13b calls the MVVM framework 11c to perform data binding between the objects in the UI data 2 of the player and the background data (for example, the control model).
  • control device 100 sends the player interface description file 2 and the resource file (including the data resources associated in the player interface description file 2) to the playback device 1000 through the data transceiver 13d.
  • the control device 100 encodes the player interface description file 2, encoding data such as layout information, resource values, data, and response event definitions, and transmits it to the playback device 1000 through a data transmission channel; specific types of data resources (for example, data, pictures, and videos whose data amount is larger than the set value) are transmitted to the playback device 1000 through a specific data transmission channel.
  • the playback device 1000 receives the player interface description file 2 and the data resources through the data transceiver 13d; by calling the custom UI engine 11, the virtual control construction 13a in the OS of the playback device 1000 parses and executes the code segment in the player interface description file 2 that corresponds to the device type of the playback device 1000; controls are constructed according to that code segment, and controls, control groups, etc. are generated according to the layout arrangement in the code segment to form the player UI data 2 (including information such as controls and control layout); the updated player UI is then displayed according to the player UI data 2.
  • the user can directly operate the player UI on the playback device; the control device executes the business logic corresponding to the operation and sends the player interface description file corresponding to the updated player UI to the playback device, which generates the updated player UI from that file. This realizes direct operation of the player UI on the playback device and successful switching of the player UI.
  • FIG. 41A shows an example of the processing flow of the control device in the user interface implementation method provided by the embodiments of the present application.
  • the screen projection framework, MVVM framework, and background data are initialized.
  • the resource transmission module transmits the data resources related to the App, and binds the data resources to the screen projection service.
  • the screencasting framework obtains the player interface description file from the application installation package and sends it to the virtual control building module.
  • the virtual control building module builds controls according to the interface description file of the player to form the UI data of the player; and binds the UI data of the player to the screen projection service.
  • the virtual control building module notifies the data binding module to bind the player UI data and the background data, and the data binding module calls the MVVM framework to bind them.
  • the screencasting service is also bound to an event proxy.
  • the player interface description file is encoded and sent to the playback device. In this way, after receiving the encoded player interface description file, the playback device can generate and display the player UI according to the player interface description file.
  • the screencasting framework receives the event sent by the playback device and sends the event to the MVVM framework; the MVVM framework updates the background data according to the event.
  • the background data changes cause the MVVM framework to update the UI data of the player.
  • the control device sends the updated UI data of the player to the playback device. In this way, after receiving the updated player UI data, the playback device can display the updated player UI according to the updated player UI data.
  • FIG. 41B shows an example of the processing flow of the playback device in the user interface implementation method provided by the embodiments of the present application.
  • the playback device receives the encoded playback end interface description file sent by the control device through the transmission channel.
  • the screencasting framework calls the custom UI engine to parse and execute the player interface description file, form the player UI data, and display the player UI according to the player UI data.
  • the event proxy module receives the user's operation on the UI of the player, generates a corresponding event, and transmits the event to the control device, so that the control device processes the event.
  • An embodiment of the present application provides a user interface implementation method. While an App runs on the control device, if a preset condition is met, the control device pushes preset information to a playback device for playback.
  • the user opens the “Takeaway” app on the mobile phone to order food and pay for the order.
  • the App switches to run in the background.
  • the preset condition is met: for example, the mobile phone determines that the takeaway order is expected to be delivered in 20 minutes; for another example, the user performs a query operation on the smart watch.
  • the mobile phone pushes the preset information to the smart watch for display.
  • the smart watch displays the player UI 1410; the player UI 1410 includes a “delivery order progress” control 1411 and prompt information 1412.
  • the developer defines a player interface description file (or a piece of code in the player interface description file) that pushes information to the player when a preset condition is met during the development phase.
  • the player UI of the smart watch is defined in the player interface description file, including controls 1411 and prompt information 1412 .
  • the mobile phone determines that the preset conditions are met, reads the specified code segment, generates the player UI data according to the specified code segment, and sends the specified code segment (or the generated player UI data) to the smart watch.
  • the smart watch generates the player UI 1410 according to the specified code segment (or the generated player UI data).
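The takeaway example above amounts to a condition check gating a push of the specified code segment; this sketch is illustrative only, and the threshold and segment name are assumptions rather than values defined by the developer's description file.

```python
def should_push(order_eta_minutes: int, threshold: int = 20) -> bool:
    """Preset condition: e.g. the takeaway order is expected to be
    delivered within 20 minutes."""
    return order_eta_minutes <= threshold

PUSHED = []  # stands in for the transmission channel to the smart watch

def maybe_push(order_eta_minutes: int, code_segment: str):
    """While the App runs, push the specified code segment (or the
    generated player UI data) once the condition is met."""
    if should_push(order_eta_minutes):
        PUSHED.append(code_segment)

maybe_push(35, "layout-data-watch")  # condition not met: nothing is sent
maybe_push(18, "layout-data-watch")  # condition met: segment is pushed
```

The smart watch then builds the player UI from whichever segment (or UI data) arrives, as described above.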
  • the user uses the navigation App to navigate on the mobile phone.
  • when a preset condition is met (for example, the forward direction changes), the mobile phone generates the player UI data according to the specified code segment, and sends the specified code segment (or the generated player UI data) to the smart watch.
  • the smart watch generates the player UI 1420 according to the specified code segment (or the generated player UI data).
  • the smart watch receives an operation performed by the user on the smart watch, generates an event corresponding to the operation, and sends the event to the mobile phone for processing.
  • the mobile phone performs business logic processing, executes corresponding actions, and updates the UI data of the player.
  • the mobile phone also sends the updated UI data of the player to the smart watch, and the smart watch updates the UI of the player according to the updated UI data of the player.
  • the control device when a preset condition is satisfied, automatically pushes part of the information of the running App to the playback device for playback.
  • Different types of playback devices can read the code segment corresponding to the device type, which can easily realize the differentiated layout of the playback UI of the device.
  • the user can control the App on the playback device while the control device performs the business logic processing; this improves the user experience and avoids problems where the playback device has low performance and does not support complex business logic processing.
  • the above-mentioned electronic device includes corresponding hardware structures and/or software modules for executing each function.
  • the embodiments of the present application can be implemented in hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementation should not be considered beyond the scope of the embodiments of the present application.
  • the electronic device may be divided into functional modules according to the foregoing method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules. It should be noted that, the division of modules in the embodiments of the present application is schematic, and is only a logical function division, and there may be other division manners in actual implementation.
  • an embodiment of the present application discloses an electronic device 1500, which may be an electronic device running the above-mentioned development tool, an electronic device running an App in the above-mentioned embodiments, or an electronic device running an application component in the above-mentioned embodiments; the electronic device may be the above-mentioned control device or playback device.
  • the electronic device may specifically include: a display screen 1501; an input device 1502 (such as a mouse, a keyboard, or a touch screen); one or more processors 1503; a memory 1504; one or more application programs (not shown); and one or more computer programs 1505. The above devices may be connected by one or more communication buses 1506.
  • the above-mentioned one or more computer programs 1505 are stored in the above-mentioned memory 1504 and configured to be executed by the one or more processors 1503; the one or more computer programs 1505 include instructions that can be used to execute the relevant steps in the above embodiments.
  • the electronic device 1500 may be the electronic device 100 or the electronic device 200 in FIG. 1 .
  • the electronic device 1500 may be the developer device or the user-side electronic device in FIG. 14 .
  • the electronic device 1500 may be the electronic device 100 or the electronic device 200 in FIG. 23 .
  • the electronic device 1500 may be the control device 100 or the electronic device 200 or the playback device 1000 in FIG. 33 .
  • Embodiments of the present application further provide a computer-readable storage medium, where computer program codes are stored in the computer-readable storage medium, and when the processor executes the computer program codes, the electronic device executes the methods in the foregoing embodiments.
  • Embodiments of the present application also provide a computer program product, which when the computer program product runs on a computer, causes the computer to execute the method in the above-mentioned embodiments.
  • the electronic device 1500, the computer-readable storage medium, and the computer program product provided in the embodiments of the present application are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which will not be repeated here.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware, and can also be implemented in the form of software functional units.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence, or the parts contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product stored in a storage medium, including several instructions to cause a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a magnetic disk, an optical disc, or other media that can store program code.


Abstract

Embodiments of this application provide a user interface (UI) implementation method and apparatus, relating to the field of terminal technologies. An application installation package of a first application includes a first description file and a second description file used for interface description and interface behavior definition of the UI of the first application; the first description file and the second description file use different interface description languages: the first description file uses a first interface description language, and the second description file uses a second interface description language. When an electronic device runs the first application, a first UI engine reads, parses, and executes the first description file to generate a first part of a first UI, and a second UI engine reads, parses, and executes the second description file to generate a second part of the first UI. The first UI engine may be a general-purpose operating system engine, while the second UI engine is independent of the operating system platform, can adapt to multiple operating system platforms, and is technically easy to implement.

Description

User interface implementation method and apparatus
This application claims priority to Chinese Patent Application No. 202010862489.9, entitled "A user interface implementation method and apparatus", filed with the China National Intellectual Property Administration on August 25, 2020, which is incorporated herein by reference in its entirety.
This application claims priority to Chinese Patent Application No. 202011064544.6, entitled "A user interface implementation method and apparatus", filed with the China National Intellectual Property Administration on September 30, 2020, which is incorporated herein by reference in its entirety.
This application claims priority to Chinese Patent Application No. 202011141010.9, entitled "A user interface implementation method and apparatus", filed with the China National Intellectual Property Administration on October 22, 2020, which is incorporated herein by reference in its entirety.
This application claims priority to Chinese Patent Application No. 202011142718.6, entitled "A user interface implementation method and apparatus", filed with the China National Intellectual Property Administration on October 22, 2020, which is incorporated herein by reference in its entirety.
This application claims priority to Chinese Patent Application No. 202011381146.7, entitled "A user interface implementation method and apparatus", filed with the China National Intellectual Property Administration on November 30, 2020, which is incorporated herein by reference in its entirety.
This application claims priority to Chinese Patent Application No. 202011384490.1, entitled "A user interface implementation method and apparatus", filed with the China National Intellectual Property Administration on November 30, 2020, which is incorporated herein by reference in its entirety.
This application claims priority to Chinese Patent Application No. 202011475517.8, entitled "User interface implementation method and apparatus", filed with the China National Intellectual Property Administration on December 14, 2020, which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of terminal technologies, and in particular, to a user interface implementation method and apparatus.
Background
Developers usually develop an application (App) based on a particular operating system (OS) platform. An important task in App development is developing the App's user interface (UI). Typically, developers use the software development kit (SDK) provided by the OS platform to develop the App's UI. UI development mainly includes interface description and interface behavior definition. Interface description means using an interface description language to describe the UI's layout, the controls used, and the visual styles of the layout and controls. Interface behavior definition means using the interface description language to define interface behavior, which includes dynamic changes of the UI and the electronic device's responses to those changes (for example, responses to user operations on the UI). Each OS platform has its corresponding interface description language; for example, one OS platform uses the xml (extensible markup language) format, while another uses an embedded domain specific language (EDSL) built with swift for interface description and interface behavior definition. The UI engine provided by the OS platform can interpret and execute the UI's interface description language and render the UI for presentation to the user. In addition, each OS platform has a corresponding programming language for implementing interface behavior, that is, for realizing dynamic changes of the UI and responding to user operations on the UI; for example, one platform uses JAVA and another uses the swift programming language to implement interface behavior.
How to make it convenient for developers to develop UIs that are adapted to the operating system and rich in functionality is a problem to be solved.
Summary of the Invention
Embodiments of this application provide a user interface implementation method and apparatus that offer rich UI programming capabilities, making it convenient for developers to build UIs that are adapted to the operating system and rich in functionality. To achieve the above objective, this application adopts the following technical solutions:
According to a first aspect, this application provides a user interface implementation method, including: obtaining, by an electronic device, an application installation package of a first application, where the application installation package includes a first description file and a second description file; the first description file and the second description file are used for interface description and interface behavior definition of a first user interface (UI) of the first application; the first description file uses a first interface description language, the second description file uses a second interface description language, and the first interface description language is different from the second interface description language. The electronic device runs the first application; a first UI engine of the electronic device reads, parses, and executes the first description file to generate a first part of the first UI; a second UI engine of the electronic device reads, parses, and executes the second description file to generate a second part of the first UI; and the electronic device displays the first UI.
In this method, a UI can be jointly developed using two different interface description languages. The electronic device's operating system includes two UI engines that respectively parse and execute the two different interface description languages. One UI engine may be the UI engine of a general-purpose OS, able to parse a general-purpose interface description language; the other is an extension UI engine that is independent of the OS platform and can parse a DSL. In this way, developers can use the base interface description language to describe the UI layout and the controls it contains, and selectively use the DSL to apply custom UI programming capabilities to some controls, add animation effects to the UI, and so on. The extension UI engine provided in the embodiments of this application is independent of the OS platform, so it can adapt to multiple OS platforms, is technically easy to implement, and is convenient for developers to use.
In a possible implementation, the generating, by the first UI engine of the electronic device, the first part of the first UI includes: generating, by the first UI engine according to the first description file, one or more first controls in the first UI, where the one or more first controls have a first UI programming capability.
The first control is a control generated by the general-purpose OS. The first UI programming capability is a UI programming capability supported by the general-purpose OS, for example, setting a control's length, width, height, spacing, and color; selecting a control; entering text on a control; and the like.
In a possible implementation, the generating, by the second UI engine of the electronic device, the second part of the first UI includes: applying, by the second UI engine according to the second description file, a second UI programming capability to the one or more first controls. In other words, developers can use the custom interface description language in the second description file to apply custom UI programming capabilities to controls generated by the general-purpose OS, extending the capabilities of general-purpose OS controls and enriching their effects.
In a possible implementation, the generating, by the second UI engine of the electronic device, the second part of the first UI includes: generating, by the second UI engine according to the second description file, one or more second controls in the first UI, where the one or more second controls have the second UI programming capability.
The second control is a custom control provided by the OEM OS in the embodiments of this application; it supports the custom second UI programming capability and rich control effects.
In a possible implementation, the second UI programming capability includes at least one of: a visual property capability, a layout capability, a unified interaction capability, and an animation capability.
The layout capability is used to describe the layout of controls in the UI, such as a control's shape, position, and size. The visual property capability is used to describe a control's visual properties, such as color and grayscale. The unified interaction capability is used to provide control responses based on user behavior, for example, performing a search based on the user's "confirm" behavior. The animation capability is used to display animation effects on a control, for example, a click-rebound animation.
In a possible implementation, the layout capability includes at least one of: stretch, hide, wrap, equal division, proportion, and extension. Stretch is the ability to enlarge or shrink a control's width and height at different ratios; hide is the ability to make a control visible or invisible in the display interface; wrap is the ability to display a control's content over one or more lines in the display interface; equal division is the ability to distribute controls evenly in the display interface; proportion is the ability of a control to occupy a specified percentage of the total layout in a specified direction; extension is the ability to arrange and display controls along one direction on the UI.
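Two of the layout capabilities named above, stretch and hide, can be sketched as simple rules; the scaling factor and visibility threshold here are invented for illustration, not values defined by the layout capability itself.

```python
def apply_stretch(base_width: int, window_scale: float) -> int:
    """Stretch: the control's width scales with the display window."""
    return int(base_width * window_scale)

def apply_hide(window_width: int, min_width: int = 320) -> bool:
    """Hide: the control is visible only when enough space is available."""
    return window_width >= min_width

w = apply_stretch(100, 1.5)  # control widened on a larger window
visible = apply_hide(280)    # narrow window: control hidden
```

The remaining capabilities (wrap, equal division, proportion, extension) would be analogous rules computing line breaks, even spacing, percentage sizes, and one-direction arrangement.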
In a possible implementation, if the second UI engine determines that the second description file exists, the first UI engine is triggered to read the first description file and the second UI engine reads the second description file. That is, the second UI engine controls the distribution process and triggers the first and second UI engines to parse and execute the description files.
In a possible implementation, the first description file and the second description file are on different paths in the application installation package. The first UI engine and the second UI engine read the description files from their respective paths according to preset rules.
In a possible implementation, the first description file and the second description file are preset with different tags. The first UI engine and the second UI engine read the corresponding description files according to the preset tags.
In a possible implementation, the second UI engine also performs syntax verification on the second interface description language; if the syntax verification passes, the second UI engine parses and executes the second description file.
In a possible implementation, the second UI engine of the electronic device implements a mapping between device events and user behaviors in the second description file; in response to a device event, a control action corresponding to the user behavior in the second description file is executed. That is, the OEM OS can map events triggered on electronic devices of different forms to the same user behavior (for example, mapping a mouse double-click event on a PC to the "confirm" behavior, and a finger tap event on a mobile phone also to the "confirm" behavior). This spares developers from defining the correspondence between device events and user behaviors for each device form, avoiding duplicated work; the same description file can be applied to electronic devices of multiple forms, reducing development difficulty and bringing convenience to developers.
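The device-event-to-user-behavior mapping described above can be sketched as a lookup table; the TV entry and the event names are assumptions added for illustration, while the PC and phone mappings follow the example in the text.

```python
# Events from devices of different forms map to one user behavior, so a
# single description file only needs to define actions per behavior.
EVENT_MAP = {
    ("pc", "mouse_double_click"): "confirm",
    ("phone", "finger_tap"):      "confirm",
    ("tv", "remote_ok"):          "confirm",  # assumed additional mapping
}

def to_user_behavior(device: str, raw_event: str) -> str:
    """Resolve a raw device event to the unified user behavior."""
    return EVENT_MAP[(device, raw_event)]

a = to_user_behavior("pc", "mouse_double_click")
b = to_user_behavior("phone", "finger_tap")
```

Whatever the device form, the description file's "confirm" handler fires, which is what lets one file serve multiple device forms.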
In a possible implementation, the second UI engine includes a set of syntax and semantic specifications for the fields in the second description file. In this way, developers can develop UIs on the OEM OS platform according to the OEM OS's syntax and semantic specifications.
In a possible implementation, the first interface description language is the extensible markup language (xml), and the second interface description language is a domain specific language (DSL).
According to a second aspect, this application provides a user interface implementation method, including: displaying a development interface of a first application, where the development interface includes a first description file and a second description file; the first description file and the second description file are used for interface description and interface behavior definition of a first user interface (UI) of the first application; the first description file uses a first interface description language, the second description file uses a second interface description language, and the two languages are different; in response to a first operation input by the user, adding a description of a first part of the first UI to the first description file; in response to a second operation input by the user, adding a description of a second part of the first UI to the second description file; and generating an application installation package of the first application according to the first description file and the second description file.
In this method, developers can jointly develop a UI using two different interface description languages. One is the base interface description language supported by a general-purpose OS, and the other is a custom interface description language. Developers can use the base interface description language to describe the UI layout and the controls it contains, and selectively use the DSL to apply custom UI programming capabilities to some controls, add animation effects to the UI, and so on. Because the custom interface description language is independent of the OS platform, it can adapt to multiple OS platforms, is technically easy to implement, and is convenient for developers to use.
In a possible implementation, adding the description of the first part of the first UI to the first description file includes: adding a description of one or more first controls in the first UI to the first description file, and applying a first UI programming capability to the one or more first controls.
The first control is a control supported by the general-purpose OS. The first UI programming capability is a UI programming capability supported by the general-purpose OS, for example, setting a control's length, width, height, spacing, and color; selecting a control; entering text on a control; and the like.
In a possible implementation, adding the description of the second part of the first UI to the second description file includes: adding a description of the one or more first controls to the second description file, and applying a second UI programming capability to the one or more first controls.
That is, developers can use the custom interface description language in the second description file to apply custom UI programming capabilities to controls generated by the general-purpose OS, extending the capabilities of general-purpose OS controls and enriching their effects.
In a possible implementation, adding the description of the second part of the first UI to the second description file includes: adding a description of one or more second controls to the second description file, and applying the second UI programming capability to the one or more second controls. The second control is a custom control provided by the OEM OS in the embodiments of this application; it supports the custom second UI programming capability and rich control effects.
In a possible implementation, the second UI programming capability includes at least one of: a visual property capability, a layout capability, a unified interaction capability, and an animation capability.
In a possible implementation, the layout capability includes at least one of: stretch, hide, wrap, equal division, proportion, and extension.
In a possible implementation, the first description file and the second description file are on different paths in the application installation package. In this way, the OEM OS's first UI engine and second UI engine can read the files from their respective paths according to preset rules to obtain the corresponding description files.
In a possible implementation, the first description file and the second description file are preset with different tags. In this way, the OEM OS's first UI engine and second UI engine can read the corresponding description files according to the preset tags.
第三方面,本申请提供一种计算机可读存储介质,包括计算机指令,该计算机指令用于对第一应用的第一用户接口界面UI进行界面描述和界面行为定义,其中,计算机指令包括存储于第一描述文件中的第一指令,以及存储于第二描述文件中的第二指令;第一描述文件采用第一界面描述语言,第二描述文件采用第二界面描述语言;第一界面描述语言与第二界面描述语言不同;第一指令用于描述第一UI的第一部分,第二指令用于描述第一UI的第二部分。
在该方法中，开发者使用两种不同的界面描述语言共同开发UI。其中一种界面描述语言是通用OS(比如Android)支持的基础界面描述语言，另一种界面描述语言是自定义界面描述语言。开发者使用基础界面描述语言描述UI布局、包括的控件等；并且选择性使用DSL将自定义UI编程能力应用于一些控件，在UI增加一些动效等。由于自定义界面描述语言与OS平台不相关，故能适应多种OS平台，技术实现难度低，方便开发者使用。
在一种可能的实现方式中,第一指令具体用于:描述第一UI中一个或多个第一控件,对一个或多个第一控件应用第一UI编程能力。
该第一控件为通用OS(比如Android)支持的控件。第一UI编程能力是通用OS(比如Android)支持的UI编程能力。比如，包括设置控件的长、宽、高，间距，颜色；选中控件，在控件上输入文字等。
在一种可能的实现方式中,第二指令具体用于:对一个或多个第一控件应用第二UI编程能力。在另一种可能的实现方式中,第二指令具体用于:描述第一UI中一个或多个第二控件,对一个或多个第二控件应用第二UI编程能力。
也就是说,开发者可以使用自定义界面描述语言在第二描述文件中对通用OS生成的控件应用自定义的UI编程能力,扩展通用OS控件的能力;也可以增加具备丰富控件效果的自定义控件。
在一种可能的实现方式中,第二UI编程能力包括:视觉属性能力,布局能力,统一交互能力和动效能力中至少一种。布局能力包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
在一种可能的实现方式中,第一描述文件与第二描述文件在计算机可读存储介质中不同路径。这样,OEM OS的第一UI引擎和第二UI引擎可以按照预设规则分别在不同路径下读取文件,获取对应的描述文件。
在一种可能的实现方式中,第一描述文件与第二描述文件被预置不同的标签。这样,OEM OS的第一UI引擎和第二UI引擎可以按照预置标签分别读取相应的描述文件。
第四方面,本申请提供一种计算机可读存储介质,例如,应用开发工具,该应用开发工具具体可包括计算机指令,当计算机指令在上述电子设备上运行时,可使电子设备执行上述第一方面中任意一项所述的方法。
第五方面,本申请提供一种电子设备,包括:显示屏、输入设备、一个或多个处理器、一个或多个存储器、以及一个或多个计算机程序;其中,处理器与输入设备、显示屏以及存储器均耦合,上述一个或多个计算机程序被存储在存储器中,当电子设备运行时,处理器可执行存储器存储的一个或多个计算机程序,以使电子设备执行上述第一方面中任意一项所述的方法。
第六方面,本申请提供一种电子设备,包括:显示屏、一个或多个处理器、一个或多个存储器、以及一个或多个计算机程序;其中,处理器与显示屏以及存储器均耦合,上述一个或多个计算机程序被存储在存储器中,当电子设备运行上述第一应用时,处理器可执行存储器存储的一个或多个计算机程序,以使电子设备执行上述第二方面中任意一项所述的方法。
本申请实施例提供一种用户接口界面实现方法及装置,可以实现一次开发,多设备部署;即开发一套界面描述文件,适用于各种不同类型的电子设备;降低开发者的开发难度。为达到上述目的,本申请采用如下技术方案:
第七方面,本申请提供一种用户接口界面实现方法,包括:第一电子设备和第二电子设备分别从服务器下载第一应用的应用安装包;并分别安装所述应用安装包。该应用安装包包括描述文件和资源文件;其中,描述文件用于对第一应用的第一UI进行界面描述和 界面行为定义;资源文件包括生成第一应用的UI使用的资源。第一电子设备读取描述文件中与第一电子设备的设备类型对应的第一代码,按照第一代码的定义使用资源文件的资源生成第一电子设备的第一UI;第二电子设备读取描述文件中与第二电子设备的设备类型对应的第二代码,按照第二代码的定义使用资源文件的资源生成第二电子设备的第一UI;第一电子设备的设备类型与第二电子设备的设备类型不同。其中,电子设备的设备类型可以包括手机、智能电视、智能手表、平板电脑、笔记本电脑、上网本、大屏、车载电脑等。
在该方法中,不同类型的电子设备读取同一UI的同一个描述文件,呈现不同的UI布局。可以实现开发一套描述文件,适用于各种不同类型的电子设备,降低了开发者的开发难度。
结合第七方面,在一种可能的实现方式中,该方法还包括:第一电子设备按照描述文件中第三代码的定义生成第一电子设备的第一UI中第一控件,第一电子设备的第一UI中第一控件具备第一电子设备的操作系统自定义的控件属性;第三电子设备按照描述文件中第三代码的定义生成第三电子设备的第一UI中第一控件,第三电子设备的第一UI中第一控件具备通用操作系统的控件属性;其中,第三代码是第一代码的部分或全部。
在该方法中，描述文件中定义第一控件支持操作系统自定义的控件属性。第一电子设备的操作系统提供自定义的控件属性，第一电子设备的第一UI中第一控件具备第一电子设备的操作系统自定义的控件属性；第三电子设备支持通用操作系统(比如Android)的控件属性，第三电子设备的第一UI中第一控件具备通用操作系统的控件属性。这样，同一描述文件可以在不同的操作系统中成功运行，实现了跨操作系统平台运行，降低开发者开发难度。
第八方面,本申请提供一种用户接口界面实现方法,包括:第一电子设备下载第一应用的应用安装包;并安装该应用安装包;其中,应用安装包包括描述文件和资源文件;描述文件用于对第一应用的第一用户接口界面UI进行界面描述和界面行为定义;资源文件包括生成第一应用的UI使用的资源。第一电子设备读取描述文件中与第一电子设备的设备类型对应的第一代码,按照第一代码的定义使用资源文件的资源生成第一电子设备的第一UI。
在该方法中,电子设备读取描述文件中与该电子设备的设备类型相对应的代码。这样,不同的电子设备读取同一描述文件可以呈现不同的UI布局。可以实现开发一套描述文件,适用于各种不同类型的电子设备,降低了开发者的开发难度。
结合第八方面,在一种可能的实现方式中,第一电子设备按照描述文件中第三代码的定义生成第一电子设备的第一UI中第一控件,第一电子设备的第一UI中第一控件具备第一电子设备的操作系统自定义的控件属性;其中,第三代码是第一代码的部分或全部。
在该方法中,描述文件中定义第一控件支持操作系统自定义的控件属性。第一电子设备的操作系统提供自定义的控件属性,第一电子设备的第一UI中第一控件具备第一电子设备的操作系统自定义的控件属性。
结合第七方面或第八方面,在一种可能的实现方式中,第一电子设备的操作系统包括自定义UI编程能力,自定义UI编程能力用于提供第一电子设备的操作系统自定义的 控件属性。
结合第七方面或第八方面,在一种可能的实现方式中,自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
结合第七方面或第八方面,在一种可能的实现方式中,布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
结合第七方面或第八方面,在一种可能的实现方式中,第一电子设备的第一UI包括第二控件,第二控件具备通用操作系统的控件属性。也就是说,第一电子设备根据描述文件生成的第一UI中既可以包括具备第一电子设备的操作系统自定义的控件属性的控件,也可以包括具备通用操作系统的控件属性的控件。提供了形态更丰富的控件。
结合第七方面或第八方面,在一种可能的实现方式中,第一电子设备的第一UI包括第三控件,第三控件具备第一应用中自定义的控件属性。在该方法中,开发者可以在安装包的文件中自定义属于该第一应用的控件属性,使得UI更丰富。
结合第七方面或第八方面,在一种可能的实现方式中,描述文件包括第四代码,用于定义第一电子设备的第一UI中第四控件的控件属性与第一电子设备的操作系统中第一数据的对应关系。该方法还包括:第一电子设备接收用户在第四控件上的第一输入;根据第一输入修改第一数据的值。
在该方法中,开发者在描述文件中定义控件的控件属性与操作系统中后台数据的对应关系;电子设备的UI引擎实现根据用户输入修改后台数据的功能。避免了开发者在描述文件中描述根据用户输入修改后台数据的实现,降低了开发者的开发难度。
结合第七方面或第八方面,在一种可能的实现方式中,该方法还包括:第一电子设备的第一UI中第四控件的控件属性随着第一电子设备的操作系统中第一数据的改变而改变。
在该方法中,开发者在描述文件中定义控件的控件属性与操作系统中后台数据的对应关系;电子设备的UI引擎实现控件的控件属性随着电子设备的操作系统中后台数据的改变而改变。UI中的控件可以随着电子设备参数的改变而改变,并且避免了开发者在描述文件中描述控件的控件属性随着电子设备参数改变而改变的实现,降低了开发者的开发难度。
第九方面,本申请提供一种用户接口界面实现方法,包括:显示第一应用的开发界面;该第一应用的开发界面包括描述文件,用于对第一应用的第一用户接口界面UI进行界面描述和界面行为定义;响应于用户输入的第一操作,在描述文件中增加与第一电子设备的设备类型对应的第一代码;响应于用户输入的第二操作,在描述文件中增加与第二电子设备的设备类型对应的第二代码;根据所述描述文件生成所述第一应用的应用安装包。其中,第一电子设备的设备类型与第二电子设备的设备类型不同。其中,电子设备的设备类型可以包括手机、智能电视、智能手表、平板电脑、笔记本电脑、上网本、大屏、车载电脑等。
在该方法中,一个描述文件中包括对应不同类型的电子设备的代码。不同类型的电子设备读取同一UI的同一个描述文件,即可呈现不同的UI布局。可以实现开发一套描述文件,适用于各种不同类型的电子设备,降低了开发者的开发难度。
结合第九方面,在一种可能的实现方式中,第一应用的应用安装包还包括资源文件, 资源文件包括生成第一应用的UI使用的资源。
结合第九方面,在一种可能的实现方式中,描述文件中包括定义第一控件具备第一电子设备的操作系统自定义的控件属性的第三代码,以及定义第二控件具备通用操作系统的控件属性的第四代码。也就是说,电子设备根据描述文件生成的第一UI中既可以包括具备第一电子设备的操作系统自定义的控件属性的控件,也可以包括具备通用操作系统的控件属性的控件。提供了形态更丰富的控件。
结合第九方面,在一种可能的实现方式中,第一电子设备的操作系统自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
结合第九方面,在一种可能的实现方式中,布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
第十方面,本申请提供一种计算机可读存储介质,例如,应用开发工具,该应用开发工具具体可包括计算机指令,当计算机指令在上述电子设备上运行时,可使电子设备执行上述第九方面中任意一项所述的方法。
第十一方面,本申请提供一种计算机可读存储介质,包括计算机指令,该计算机指令用于对第一应用的第一用户接口界面UI进行界面描述和界面行为定义;其中,计算机指令包括与第一电子设备的设备类型对应的第一代码以及与第二电子设备的设备类型对应的第二代码;其中第一电子设备的设备类型与第二电子设备的设备类型不同。其中,电子设备的设备类型可以包括手机、智能电视、智能手表、平板电脑、笔记本电脑、上网本、大屏、车载电脑等。
结合第十一方面,在一种可能的实现方式中,计算机指令还包括生成第一应用的UI使用的资源。
结合第十一方面,在一种可能的实现方式中,计算机指令还包括定义第一控件具备第一电子设备的操作系统自定义的控件属性的第三代码,以及定义第二控件具备通用操作系统的控件属性的第四代码。
第十二方面,本申请提供一种电子设备,包括:显示屏、输入设备、一个或多个处理器、一个或多个存储器、以及一个或多个计算机程序;其中,处理器与输入设备、显示屏以及存储器均耦合,上述一个或多个计算机程序被存储在存储器中,当电子设备运行时,处理器可执行存储器存储的一个或多个计算机程序,以使电子设备执行上述第八方面中任意一项所述的方法。
第十三方面,本申请提供一种电子设备,包括:显示屏、一个或多个处理器、一个或多个存储器、以及一个或多个计算机程序;其中,处理器与显示屏以及存储器均耦合,上述一个或多个计算机程序被存储在存储器中,当电子设备运行上述第一应用时,处理器可执行存储器存储的一个或多个计算机程序,以使电子设备执行上述第九方面中任意一项所述的方法。
本申请实施例提供一种用户接口界面实现方法及装置,支持应用小组件的UI上显示各种布局方式和控件类型,方便用户使用应用小组件,提高用户体验。为达到上述目的,本申请采用如下技术方案:
第十四方面,本申请提供一种用户接口界面实现方法,包括:电子设备的第一应用进程读取组件界面描述文件,并根据组件界面描述文件生成第一小组件UI数据,将第一小 组件UI数据中的控件与电子设备操作系统中的后台数据进行绑定;其中,组件界面描述文件用于对第一应用的应用小组件的第一UI进行界面描述和界面行为定义;然后,第一应用进程向应用小组件进程发送第一数据;应用小组件进程接收第一数据,根据第一数据获取第一小组件UI数据,并按照第一小组件UI数据显示应用小组件的第一UI。
在该方法中,应用进程和应用小组件进程都根据组件界面描述文件生成小组件UI数据。应用进程将小组件UI数据中的控件绑定后台数据,应用小组件进程将小组件UI数据显示为应用小组件UI。这样,开发者可以在组件界面描述文件中定义各种类型的控件,使得应用小组件的UI支持各种类型的控件。当用户在应用小组件的UI上进行操作时,应用进程可以根据小组件UI数据中的控件与后台数据的对应关系执行相应的业务逻辑。
在一种可能的实现方式中,第一应用进程向应用小组件进程发送的是组件界面描述文件;应用小组件进程接收到组件界面描述文件,根据组件界面描述文件生成第一小组件UI数据,并按照第一小组件UI数据显示应用小组件的第一UI。
在一种可能的实现方式中,第一应用进程向应用小组件进程发送的是第一小组件UI数据;应用小组件进程接收到第一小组件UI数据,按照第一小组件UI数据显示应用小组件的第一UI。
结合第十四方面,在一种可能的实现方式中,该方法还包括:第一应用进程按照组件界面描述文件中第一代码的定义生成第一小组件UI数据中第一控件,第一控件具备电子设备的操作系统原生的控件属性。其中,操作系统原生的控件包括:输入框,复选框,滑动选择器,滚动视图,单选按钮,评分条,搜索框,拖动条,或开关等。
也就是说,开发者可以在组件界面描述文件中定义操作系统原生的各种控件属性,使得应用小组件的UI支持操作系统原生的各种控件。
结合第十四方面,在一种可能的实现方式中,该方法还包括:第一应用进程按照组件界面描述文件中第二代码的定义生成第一小组件UI数据中第二控件,第二控件具备电子设备的操作系统中自定义的控件属性。其中,自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
也就是说,开发者可以在组件界面描述文件中定义操作系统中自定义的各种控件属性,使得应用小组件的UI支持操作系统中自定义的各种控件。
结合第十四方面,在一种可能的实现方式中,组件界面描述文件包括第三代码,用于定义应用小组件的第一UI中第三控件的控件属性与电子设备的操作系统中第一数据的对应关系,该方法还包括:电子设备接收用户在第三控件上的第一输入;根据第一输入修改第一数据的值。
结合第十四方面,在一种可能的实现方式中,应用小组件的第一UI中第三控件的控件属性随着电子设备的操作系统中第一数据的改变而改变。
结合第十四方面,在一种可能的实现方式中,该方法还包括:电子设备从服务器下载第一应用的应用安装包;该应用安装包包括组件界面描述文件;电子设备使用应用安装包安装第一应用。
在该方法中,组件界面描述文件是从应用安装包获取的。
第十五方面,本申请提供一种用户接口界面实现方法,包括:显示第一应用的开发界 面;第一应用的开发界面包括组件界面描述文件;组件界面描述文件用于对第一应用的应用小组件的第一UI进行界面描述和界面行为定义。响应于用户输入的第一操作,在组件界面描述文件中增加定义第一小组件UI中第一控件的第一代码;其中,第一控件具备操作系统原生的控件属性;操作系统原生的控件包括:输入框,复选框,滑动选择器,滚动视图,单选按钮,评分条,搜索框,拖动条,或开关等。根据组件界面描述文件生成第一应用的应用安装包。
在该方法中,开发者可以在组件界面描述文件中定义控件具备操作系统原生的控件属性。使得电子设备上运行的应用小组件的UI包括各种具备操作系统原生的控件属性的控件。
结合第十五方面,在一种可能的实现方式中,该方法还包括:响应于用户输入的第二操作,在组件界面描述文件中增加定义第一小组件UI中第二控件的第二代码,第二控件具备操作系统中自定义的控件属性;自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。其中,布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
在该方法中,开发者可以在组件界面描述文件中定义控件具备操作系统自定义的控件属性。使得电子设备上运行的应用小组件的UI包括各种具备操作系统自定义的控件属性的控件。
第十六方面,本申请提供一种计算机可读存储介质,例如,应用开发工具,该应用开发工具具体可包括计算机指令,当计算机指令在上述电子设备上运行时,可使电子设备执行上述第十五方面中任意一项所述的方法。
第十七方面,本申请提供一种计算机可读存储介质,包括计算机指令,该计算机指令用于对第一应用的应用小组件的第一用户接口界面UI进行界面描述和界面行为定义;其中,计算机指令包括生成第一小组件UI中第一控件的第一代码,第一控件具备操作系统原生的控件属性;操作系统原生的控件包括:输入框,复选框,滑动选择器,滚动视图,单选按钮,评分条,搜索框,拖动条,或开关等。
结合第十七方面,在一种可能的实现方式中,计算机指令还包括生成第一小组件UI中第二控件的第二代码,第二控件具备操作系统中自定义的控件属性;自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。其中,布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
第十八方面,本申请提供一种计算机可读存储介质。该计算机可读存储介质包括计算机程序,当计算机程序在电子设备上运行时,使得电子设备执行上述第十四方面中任意一项所述的方法。
第十九方面,本申请提供一种电子设备,包括:显示屏、输入设备、一个或多个处理器、一个或多个存储器、以及一个或多个计算机程序;其中,处理器与输入设备、显示屏以及存储器均耦合,上述一个或多个计算机程序被存储在存储器中,当电子设备运行时,处理器可执行存储器存储的一个或多个计算机程序,以使电子设备执行上述第十四方面或第十五方面中任意一项所述的方法。
本申请实施例提供一种用户接口界面实现方法及装置,支持将控制设备上各种UI投屏至IoT设备进行播放,提高用户体验。为达到上述目的,本申请采用如下技术方案:
第二十方面，本申请提供一种用户接口界面实现方法，包括：第一电子设备读取第一应用的第一播放端界面描述文件，并根据第一播放端界面描述文件生成第一播放端UI数据，将第一播放端UI数据中的控件与第一电子设备操作系统中的后台数据进行绑定；其中，第一播放端界面描述文件用于对在第二电子设备上播放的第一应用的第一播放端UI进行界面描述和界面行为定义；第一电子设备向第二电子设备发送第一数据；第二电子设备接收第一数据，根据第一数据获取第一播放端UI数据，按照第一播放端UI数据显示第一播放端UI。
在该方法中,控制设备和播放端都根据播放端界面描述文件生成播放端UI数据。控制设备将播放端UI数据中的控件绑定后台数据,播放端将播放端UI数据显示为播放端UI。这样,开发者可以在播放端界面描述文件中定义各种UI,使得播放端UI更丰富。还可以为不同设备类型的播放端定义不同的UI布局,使得播放端UI与播放端屏幕的尺寸和形态匹配。当用户在播放端UI上进行操作时,控制设备可以根据播放端UI数据中的控件与后台数据的对应关系执行对应的业务逻辑。
在一种可能的实现方式中,第一电子设备向第二电子设备发送的是第一播放端界面描述文件,第二电子设备根据第一播放端界面描述文件生成第一播放端UI数据,按照第一播放端UI数据显示第一播放端UI。
在一种可能的实现方式中,第一电子设备向第二电子设备发送的是第一播放端UI数据,第二电子设备接收到第一播放端UI数据,按照第一播放端UI数据显示第一播放端UI。
结合第二十方面,在一种可能的实现方式中,该方法还包括:第二电子设备接收用户在第一播放端UI上的第一操作;响应于用户在第一播放端UI上的第一操作,第二电子设备向第一电子设备发送第一指令;第一电子设备接收到第一指令,读取第二播放端界面描述文件,根据第二播放端界面描述文件生成第二播放端UI数据,将第二播放端UI数据中的控件与第一电子设备操作系统中的后台数据进行绑定;其中,第二播放端界面描述文件用于对在第二电子设备上播放第一应用的第二播放端UI进行界面描述和界面行为定义;第一电子设备向第二电子设备发送第二播放端界面描述文件;第二电子设备接收第二播放端界面描述文件,根据第二播放端界面描述文件生成第二播放端UI数据,按照第二播放端UI数据显示第二播放端UI。
在该方法中,用户可以直接在播放设备上对播放端UI进行操作,控制设备执行该操作对应的业务逻辑,并向播放设备发送更新的播放端UI对应的播放端界面描述文件,播放设备根据更新的播放端界面描述文件生成更新的播放端UI。实现在播放设备上直接对播放端UI进行操作,成功切换播放端UI。
结合第二十方面,在一种可能的实现方式中,该方法还包括:第二电子设备接收用户在第一播放端UI上的第一操作;响应于用户在第一播放端UI上的第一操作,第二电子设备向第一电子设备发送第一指令;第一电子设备接收到第一指令,读取第二播放端界面描述文件;第二播放端界面描述文件用于对在第二电子设备上播放第一应用的第二播放端UI进行界面描述和界面行为定义;第一电子设备根据第二播放端界面描述文件生成第二播放端UI数据,将第二播放端UI数据中的控件与第一电子设备操作系统中的后台数据进行绑定;第一电子设备向第二电子设备发送第二播放端UI数据;第二电子设备接收 第二播放端UI数据,按照第二播放端UI数据显示第二播放端UI。
在该方法中,用户可以直接在播放设备上对播放端UI进行操作,控制设备执行该操作对应的业务逻辑,并向播放设备发送更新的播放端UI数据,播放设备根据更新的播放端UI数据更新播放端UI。实现在播放设备上直接对播放端UI进行操作,成功切换播放端UI。
结合第二十方面,在一种可能的实现方式中,该方法还包括:第一电子设备从服务器下载第一应用的应用安装包;应用安装包包括第一播放端界面描述文件和资源文件;资源文件包括生成第一应用的播放端UI使用的资源;第一电子设备使用应用安装包安装第一应用。
结合第二十方面,在一种可能的实现方式中,该方法还包括:第一电子设备读取第一播放端界面描述文件中与第三电子设备的设备类型对应的第一代码,按照第一代码的定义使用资源文件的资源生成第三播放端UI数据;第一电子设备读取第一播放端界面描述文件中与第四电子设备的设备类型对应的第二代码,按照第二代码的定义使用资源文件的资源生成第四播放端UI数据;其中,第四电子设备的设备类型与第三电子设备的设备类型不同。第一电子设备分别向第三电子设备和第四电子设备发送第一播放端界面描述文件和资源文件;第三电子设备根据第一播放端界面描述文件中与第三电子设备的设备类型对应的第一代码的定义使用资源文件的资源生成第三播放端UI数据;按照第三播放端UI数据显示第一播放端UI;第四电子设备根据第一播放端界面描述文件中与第四电子设备的设备类型对应的第二代码的定义使用资源文件的资源生成第四播放端UI数据;按照第四播放端UI数据显示第一播放端UI。
在该方法中,不同类型的播放设备读取同一UI的同一个播放端界面描述文件,呈现不同的播放端UI布局。可以实现开发一套播放端界面描述文件,适用于各种不同类型的播放设备,降低了开发者的开发难度。
结合第二十方面,在一种可能的实现方式中,该方法还包括:第一电子设备读取第一播放端界面描述文件中与第三电子设备的设备类型对应的第一代码,按照第一代码的定义使用资源文件的资源生成第三播放端UI数据;第一电子设备读取第一播放端界面描述文件中与第四电子设备的设备类型对应的第二代码,按照第二代码的定义使用资源文件的资源生成第四播放端UI数据;其中,第四电子设备的设备类型与第三电子设备的设备类型不同。第一电子设备向第三电子设备发送第三播放端UI数据;第三电子设备按照第三播放端UI数据显示第一播放端UI;第一电子设备向第四电子设备发送第四播放端UI数据;第四电子设备按照第四播放端UI数据显示第一播放端UI。
在该方法中,不同类型的播放设备根据同一UI的同一个播放端界面描述文件,呈现不同的播放端UI布局。可以实现开发一套播放端界面描述文件,适用于各种不同类型的播放设备,降低了开发者的开发难度。
结合第二十方面,在一种可能的实现方式中,该方法还包括:第一电子设备按照第一播放端界面描述文件中第三代码的定义生成第一播放端UI中第一控件,第一控件具备第一电子设备的操作系统自定义的控件属性;其中,第一电子设备的操作系统自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
结合第二十方面,在一种可能的实现方式中,该方法还包括:第一电子设备按照第一播放端UI数据显示第一播放端UI。在该方法中,控制设备与播放设备同步播放播放端UI,可以实现镜像投屏,控制设备与播放设备协同工作。
第二十一方面，本申请提供一种用户接口界面实现方法，包括：显示第一应用的开发界面；第一应用的开发界面包括播放端界面描述文件；播放端界面描述文件用于对在播放端播放第一应用的播放端UI进行界面描述和界面行为定义；响应于用户的第一输入，在播放端界面描述文件中增加与第一电子设备的设备类型对应的第一代码；响应于用户的第二输入，在播放端界面描述文件中增加与第二电子设备的设备类型对应的第二代码；第一电子设备的设备类型与第二电子设备的设备类型不同；根据播放端界面描述文件生成第一应用的应用安装包。
在该方法中,一个播放端界面描述文件中包括对应不同类型的播放设备的代码。不同类型的播放设备读取同一UI的同一个播放端界面描述文件,即可呈现不同的播放端UI布局。可以实现开发一套播放端界面描述文件,适用于各种不同类型的播放设备,降低了开发者的开发难度。
结合第二十一方面,在一种可能的实现方式中,第一应用的应用安装包还包括资源文件,资源文件包括生成第一应用的播放端UI使用的资源。
结合第二十一方面,在一种可能的实现方式中,播放端界面描述文件中包括定义第一播放端UI中第一控件具备第一电子设备的操作系统自定义的控件属性的第三代码;第一电子设备的操作系统自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
第二十二方面,本申请提供一种计算机可读存储介质,例如,应用开发工具,该应用开发工具具体可包括计算机指令,当计算机指令在上述电子设备上运行时,可使电子设备执行上述第二十一方面中任意一项所述的方法。
第二十三方面,本申请提供一种计算机可读存储介质,包括计算机指令,该计算机指令用于对第一应用的第一播放端UI进行界面描述和界面行为定义;其中,计算机指令包括与第一电子设备的设备类型对应的第一代码以及与第二电子设备的设备类型对应的第二代码;其中第一电子设备的设备类型与第二电子设备的设备类型不同。其中,电子设备的设备类型可以包括手机、智能电视、智能手表、平板电脑、笔记本电脑、上网本、大屏、车载电脑等。
结合第二十三方面,在一种可能的实现方式中,计算机指令还包括生成第一应用的播放端UI使用的资源。
结合第二十三方面,在一种可能的实现方式中,计算机指令还包括定义第一播放端UI中第一控件具备第一电子设备的操作系统自定义的控件属性的第三代码;第一电子设备的操作系统自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
第二十四方面,本申请提供一种计算机可读存储介质。该计算机可读存储介质包括计算机程序,当计算机程序在电子设备上运行时,使得电子设备执行上述第二十方面中任意 一项所述的方法。
第二十五方面,本申请提供一种电子设备,包括:显示屏、输入设备、一个或多个处理器、一个或多个存储器、以及一个或多个计算机程序;其中,处理器与输入设备、显示屏以及存储器均耦合,上述一个或多个计算机程序被存储在存储器中,当电子设备运行时,处理器可执行存储器存储的一个或多个计算机程序,以使电子设备执行上述第二十方面或第二十一方面中任意一项所述的方法。
可以理解地,上述各个方面所提供的电子设备以及计算机可读存储介质均应用于上文所提供的对应方法,因此,其所能达到的有益效果可参考上文所提供的对应方法中的有益效果,此处不再赘述。
附图说明
图1为本申请实施例提供的用户接口界面实现方法的场景示意图;
图2为本申请实施例提供的电子设备的硬件结构示意图;
图3为本申请实施例提供的电子设备的软件架构示意图;
图4为一种用户接口界面实现方法的示意图;
图5为一种用户接口界面实现方法的示意图;
图6为本申请实施例提供的用户接口界面实现方法的示意图;
图7为本申请实施例提供的用户接口界面实现方法的架构示意图;
图8为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图9为本申请实施例提供的用户接口界面实现方法的流程示意图;
图10为本申请实施例提供的用户接口界面实现方法的流程示意图;
图11为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图12为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图13为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图14为本申请实施例提供的用户接口界面实现方法的示意图;
图15为本申请实施例提供的电子设备的软件架构示意图;
图16为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图17为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图18为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图19为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图20A为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图20B为本申请实施例提供的用户接口界面实现方法的流程示意图;
图20C为本申请实施例提供的用户接口界面实现方法的示意图;
图21为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图22为本申请实施例提供的用户接口界面实现方法的示意图;
图23为本申请实施例提供的用户接口界面实现方法的场景示意图;
图24A为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图24B为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图24C为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图24D为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图25为本申请实施例提供的用户接口界面实现方法的示意图;
图26为本申请实施例提供的用户接口界面实现方法的示意图;
图27为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图28为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图29A为本申请实施例提供的用户接口界面实现方法的示意图;
图29B为本申请实施例提供的用户接口界面实现方法的示意图;
图29C为本申请实施例提供的用户接口界面实现方法的示意图;
图29D为本申请实施例提供的用户接口界面实现方法的示意图;
图30为本申请实施例提供的用户接口界面实现方法的流程示意图;
图31为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图32为本申请实施例提供的用户接口界面实现方法的流程示意图;
图33为本申请实施例提供的用户接口界面实现方法的场景示意图;
图34为本申请实施例提供的用户接口界面实现方法的示意图;
图35A为本申请实施例提供的用户接口界面实现方法的示意图;
图35B为本申请实施例提供的用户接口界面实现方法的示意图;
图36为本申请实施例提供的用户接口界面实现方法的示意图;
图37A为本申请实施例提供的用户接口界面实现方法的示意图;
图37B为本申请实施例提供的用户接口界面实现方法的示意图;
图38A为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图38B为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图39A为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图39B为本申请实施例提供的用户接口界面实现方法的示意图;
图40A为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图40B为本申请实施例提供的用户接口界面实现方法的示意图;
图40C为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图40D为本申请实施例提供的用户接口界面实现方法的示意图;
图41A为本申请实施例提供的用户接口界面实现方法的流程示意图;
图41B为本申请实施例提供的用户接口界面实现方法的流程示意图;
图42A为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图42B为本申请实施例提供的用户接口界面实现方法的场景实例示意图;
图43为本申请实施例提供的一种电子设备的结构组成示意图。
具体实施方式
以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个、两个或两个以上。术语“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系;例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关 联对象是一种“或”的关系。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。术语“连接”包括直接连接和间接连接,除非另外说明。
请参考图1,电子设备200上安装有应用开发工具(比如,Android Studio,DevEco Studio等)。通常,开发者使用一种界面描述语言,在应用开发工具中开发UI,形成界面描述文件。本申请中电子设备200也可称为开发者设备。界面描述文件也可称为描述文件。
UI开发主要包括界面描述和界面行为定义。界面描述是指使用界面描述语言描述UI的布局(layout)，使用的控件，以及布局和控件的视觉风格等。界面行为定义是指使用界面描述语言定义界面行为；界面行为包括UI的动态变化，以及电子设备对UI动态变化的响应(比如对用户对UI操作的响应)。各个OS平台有其对应的界面描述语言；例如Android使用xml(extensible markup language,可扩展标记语言)格式，iOS使用swift构建的嵌入式领域专用语言(embedded domain specific language,EDSL)进行界面描述和界面行为定义。
开发者将界面描述文件打包到App的安装包，在服务器300提供的应用市场中发布App。应用市场中可以提供各个App的安装包供用户下载。例如，安装包可以为Android应用程序包(Android application package,APK)文件。
以手机为电子设备100举例,用户可使用手机在应用市场中下载某一App的安装包。以视频App举例,手机下载视频App的安装包后,通过运行该安装包可将视频App安装在手机中。这样,手机也获取了安装包中的界面描述文件。手机可以按照界面描述文件构建UI。手机的OS平台提供的UI引擎解释、执行界面描述语言,渲染出UI呈现给用户。手机的显示装置(比如显示屏)上呈现出构建的UI。手机的OS平台还执行实现界面行为的编程语言,实现UI的动态变化以及响应用户对UI的操作。
示例性的，开发者在电子设备200上使用OS平台支持的界面描述语言开发视频App的UI，并发布视频App。用户使用“视频”App的安装包将视频App安装于手机，手机桌面生成“视频”图标101。用户可以点击“视频”图标101，以打开视频App。响应于用户对“视频”图标101的点击操作，手机运行视频App。手机上安装有OS平台，OS平台读取界面描述文件，解析、执行界面描述语言，按照界面描述文件中的界面描述渲染出视频App的UI，并在显示屏呈现视频App的UI 102。进一步的，界面描述文件中还可以包括对界面行为的定义。手机可以响应于用户对UI 102的操作，按照界面描述文件中定义的界面行为，执行相应的界面动作，实现界面行为。通常，OS平台还有对应的程序语言用于实现界面行为，实现UI 102的动态变化以及响应用户对UI 102的操作；例如Android使用JAVA，iOS使用swift编程语言实现界面行为。
可以理解的,在一些实施例中,开发者可以直接在电子设备100上开发App的UI,并在电子设备100上运行该App;即电子设备200和电子设备100可以是同一个电子设备。 本申请实施例对此并不进行限定。
上述电子设备100可以包括便携式计算机(如手机等)、手持计算机、平板电脑、笔记本电脑、上网本、个人电脑(personal computer,PC)、智能家居设备(比如，智能电视、智慧屏、大屏、智能音箱等)、个人数字助理(personal digital assistant,PDA)、可穿戴设备(比如，智能手表、智能手环等)、增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备、车载电脑等，本申请实施例对此不做任何限制。电子设备100的示例性实施例包括但不限于搭载Android、iOS或者其它操作系统的便携式电子设备。可以理解的是，在其他一些实施例中，上述电子设备100也可以不是便携式电子设备，而是台式计算机。
示例性的,请参考图2,其示出了一种电子设备100的结构示意图。电子设备100可包括处理器110,外部存储器接口120,内部存储器121,音频模块130,扬声器130A,麦克风130B,显示屏140,无线通信模块150,电源模块160等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元。例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的部件,也可以集成在一个或多个处理器中。在一些实施例中,电子设备100也可以包括一个或多个处理器110。
其中,控制器是电子设备100的神经中枢和指挥中心。可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
应用处理器上可以运行电子设备100的操作系统,用于管理电子设备100的硬件与软件资源。比如,管理与配置内存、决定系统资源供需的优先次序、控制输入与输出设备、操作网络、管理文件系统、管理驱动程序等。操作系统也可以用于提供一个让用户与系统交互的操作界面。其中,操作系统内可以安装各类软件,比如,驱动程序,应用程序(application,App)等。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路间(inter-integrated circuit,I2C)接口,集成电路间音频(integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,SIM卡接口,和/或USB接口等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储一个或多个计算机程序,该一个或多个计算机程序包括指令。处理器110可以通过运行存储在内部存储器121的上述指令,从而使得电子设备100执行本申请一些实施例中所提供的用户接口界面实现方法,以及各种应用以及数据处理等。内部存储器121可以包括代码存储区和数据存储区。其中,数据存储区可存储电子设备100使用过程中所创建的数据等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如一个或多个磁盘存储部件,闪存部件,通用闪存存储器(universal flash storage,UFS)等。在一些实施例中,处理器110可以通过运行存储在内部存储器121的指令,和/或存储在设置于处理器110中的存储器的指令,来使得电子设备100执行本申请实施例中所提供的用户接口界面实现方法,以及其他应用及数据处理。
电子设备100可以通过音频模块130,扬声器130A,麦克风130B,以及应用处理器等实现音频功能。例如音乐播放,录音等。音频模块130用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块130还可以用于对音频信号编码和解码。在一些实施例中,音频模块130可以设置于处理器110中,或将音频模块130的部分功能模块设置于处理器110中。
扬声器130A,也称“喇叭”,用于将音频电信号转换为声音信号。
麦克风130B,也称“话筒”,“传声器”,用于将声音信号转换为电信号。用户可以通过人嘴靠近麦克风130B发声,将声音信号输入到麦克风130B。
电子设备100的无线通信功能可以通过天线1,天线2以及无线通信模块150等实现。
无线通信模块150可以提供应用在电子设备100上的包括Wi-Fi,蓝牙(bluetooth,BT),无线数传模块(例如,433MHz,868MHz,915MHz)等无线通信的解决方案。无线通信模块150可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块150经由天线1或者天线2接收电磁波,将电磁波信号滤波以及调频处理,将处理后的信号发送到处理器110。无线通信模块150还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线1或者天线2转为电磁波辐射出去。
电子设备100通过GPU,显示屏140,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏140和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏140用于显示图像,视频等。显示屏140包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏140,N为大于1的正整数。本申请实施 例中,显示屏140可以用于显示UI,以及接收用户对UI的操作。
在一些实施例中,显示屏140上设置有压力传感器170A、触摸传感器170B等。压力传感器170A用于感受压力信号,可以将压力信号转换成电信号。当有触摸操作作用于显示屏140,电子设备100根据压力传感器170A检测所述触摸操作强度。电子设备100也可以根据压力传感器170A的检测信号计算触摸的位置。触摸传感器170B,也称“触控面板”,可以与显示屏140组成触摸屏,也称“触控屏”。触摸传感器170B用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。还可以通过显示屏140提供与触摸操作相关的视觉输出。
电源模块160,可以用于向电子设备100包含的各个部件供电。在一些实施例中,该电源模块160可以是电池,如可充电电池。
电子设备100的软件系统可以采用分层架构，事件驱动架构，微核架构，微服务架构，或云架构。本发明实施例以分层架构的Android系统为例，示例性说明电子设备100的软件结构。
图3是本发明实施例的电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将软件系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图3所示,应用程序包可以包括相机,图库,日历,通话,地图,负一屏,WLAN,桌面,音乐,视频,短信息,等应用程序。
应用程序框架层包括OS,为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数,实现预定义的功能。比如,获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等;提供应用程序访问的数据;为应用程序提供各种资源,比如本地化字符串,图标,图片,界面描述文件,视频文件等。OS的视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。OS还可以使应用程序在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互;还可以以图表或者滚动条文本形式在系统顶部状态栏出现通知,例如后台运行的应用程序的通知;还可以以对话窗口形式在屏幕上出现通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
Android作为开源OS，在便携式电子设备上被广泛使用。同时，很多厂商也推出了自己的增强系统(OEM OS)；例如，华为的EMUI，是基于Android构建的。OEM OS可以提供比基础OS(比如Android)更优化增强的SDK，提供厂商自定义的UI编程能力。
一种实现方法中，厂商发布的OEM OS支持Android的界面描述语言(比如xml)，可以提供Android的基础UI编程能力和厂商自定义的UI编程能力。请参考图4，App的UI开发语言为Android适用的界面描述语言(比如xml)，用于声明Android提供的基础UI编程能力。在Android的界面描述语言(比如xml)中增加厂商自定义字段，用于声明厂商自定义的UI编程能力。OEM OS平台基于Android提供的基础UI引擎解释和执行界面描述语言；基础UI引擎中增加对厂商自定义字段的解释和执行。OEM OS支持Android提供的基础UI编程能力，并提供厂商自定义UI编程能力。这种方法中，使用Android原生开发接口、界面描述语言和UI引擎，采用在Android原生界面描述语言和UI引擎中增加自定义字段的方式提供厂商自定义UI编程能力。
另一种实现方法中，厂商发布的OEM OS独立于通用OS平台(比如Android)，提供厂商自定义的UI编程能力。请参考图5，App的UI开发语言为厂商自定义界面描述语言。OEM OS平台提供自定义UI引擎，对自定义界面描述语言进行解析和执行；提供厂商自定义的UI编程能力。这种方法中，厂商自定义一套完整的、独立于通用OS平台的UI编程框架，可以满足开发者跨平台开发和运行App的诉求。
本申请实施例提供一种用户接口界面实现方法及装置,可以提供丰富的UI编程能力,且能够适应多种OS平台,技术实现难度低,方便开发者使用。
请参考图6，本申请实施例提供的OEM OS平台支持基础界面描述语言，以及自定义界面描述语言。基础界面描述语言是通用的OS平台支持的界面描述语言；比如，Android的xml，iOS的swift等。本申请实施例中，自定义界面描述语言为一种领域特定语言(domain specific language,DSL)，自定义界面描述语言与通用OS平台不相关。本申请下述实施例中，自定义界面描述语言称为DSL。开发者可以使用基础界面描述语言和DSL共同开发App的UI。比如，开发者使用基础界面描述语言描述UI布局、包括的控件等；并且选择性使用DSL将自定义UI编程能力应用于一些控件，在UI增加一些动效等。比如，自定义UI编程能力可以包括布局能力，视觉属性能力，统一交互能力，动效能力等。布局能力用于描述UI中控件的布局；比如控件的形状、位置、大小等。视觉属性能力用于描述控件的视觉属性；比如，控件的颜色、灰度等视觉效果。统一交互能力用于基于用户行为提供控件响应；比如基于用户的“确认”行为执行搜索。动效能力用于在控件上显示动画效果；比如在控件上显示点击回弹动效等。
本申请实施例提供的OEM OS既可以实现通用OS平台提供的基础UI编程能力,还可以实现相对OS平台扩展的自定义UI编程能力。OEM OS平台包括基础UI引擎和扩展UI引擎。电子设备构建UI时,基础UI引擎用于解释和执行基础界面描述语言,生成基础的UI(具备基础UI编程能力);扩展UI引擎用于解释和执行DSL,在基础的UI上叠加自定义UI编程能力。
本申请实施例提供的用户接口界面实现方法，自定义界面描述语言和扩展UI引擎仅需要覆盖自定义UI编程能力，因此厂商的发布难度低，易于扩展；并且开发者接入门槛低。自定义界面描述语言和扩展UI引擎与通用OS平台不相关，通用OS平台可以是Android，也可以是其他通用OS平台。自定义界面描述语言和扩展UI引擎可以方便的适用于多种通用OS平台。
开发者采用基础界面描述语言和自定义界面描述语言开发App。App在电子设备上运行时,电子设备上OEM OS的UI引擎解析和执行界面描述语言(基础界面描述语言和自定义界面描述语言),生成UI。其中,基础UI引擎用于解释和执行基础界面描述语言,本申请实施例中OEM OS提供的扩展UI引擎,用于解析和执行自定义界面描述语言。请参考图7,在一种示例中,扩展UI引擎310包括流程控制311、DSL文件加载312、解析引擎313、执行引擎314等模块。其中,流程控制311用于控制扩展UI引擎310内各个模块的执行流程,以及扩展UI引擎310与OEM OS中其他模块的交互流程等。DSL文件加载312用于读取DSL文件。解析引擎313包括DSL语法校验、DSL解析等子模块。其中,DSL语法校验子模块用于对DSL文件中内容进行语法校验。DSL解析子模块用于解析DSL文件,将DSL文件中内容转换成与执行引擎匹配的数据格式。在一种实现方式中,如果DSL语法校验子模块对DSL文件语法校验成功,DSL解析子模块对DSL文件进行解析;如果DSL语法校验子模块对DSL文件语法校验不成功,DSL解析子模块不执行解析DSL文件。解析引擎313还可以包括DSL预处理等子模块。比如DSL预处理子模块用于对DSL文件进行预编译等。执行引擎314包括版本管理、控件构建、事件代理、解释执行引擎、语义支持库等子模块。其中,版本管理子模块用于匹配扩展UI引擎310与App中DSL文件的版本。扩展UI引擎310的版本需要与DSL文件的版本一致或者比DSL文件的版本更新,才能正常运行。控件构建子模块用于根据DSL文件内容构建UI的控件。事件代理子模块用于实现器件事件与用户行为之间的映射。比如,鼠标双击事件、显示屏上的手指单击事件都可以映射为用户在电子设备上的“确认”行为。解释执行引擎用于解释和执行DSL文件中的代码,响应于用户行为,执行DSL文件中定义的用户行为对应的动作。语义支持库包括DSL文件中所有字段的语法语义规范集合,比如,环境变量接口、公共字段、布局模板属性、视觉属性、布局属性、交互属性、动效属性等字段定义和语法。
请继续参考图7,本申请实施例中OEM OS还包括自定义UI编程能力320。自定义UI编程能力320包括DSL适配层,用于为扩展UI引擎310提供自定义UI编程能力的适配接口。自定义UI编程能力320还提供视觉属性能力,布局能力,统一交互能力,动效能力等自定义UI编程能力的实现。比如,DSL文件中声明了一个控件使能垂直拉伸能力,该自定义UI编程能力(垂直拉伸能力)的实现由自定义UI编程能力320完成;也就是说,该控件的显示窗口发生变化时,自定义UI编程能力320实现控件的垂直拉伸,不需要开发者在DSL中进行控件垂直拉伸的实现。
开发者可以使用基础界面描述语言和DSL共同开发App。基础界面描述语言的语法规则和开发工具可以沿用常规技术。本申请实施例还提供DSL的语法规则和开发工具。在一种示例中,本申请实施例提供一种开发工具,支持基础界面描述语言和DSL的语法规则,提供基础界面描述语言和DSL的编辑及编译环境。
在一些实施例中,本申请实施例提供一种开发工具,在该开发工具的开发界面,包括基础界面描述语言文件和DSL文件。示例性的,开发者打开开发工具的开发界面,在开发界面包括基础界面描述语言文件初始版和DSL文件初始版。进一步的,开发者可以在基础界面描述语言文件初始版中采用基础界面描述语言增加控件描述,还可以在DSL文件初始版中采用DSL增加控件描述。可以理解的,DSL文件初始版可以是开发工具中预置的,也可以是开发者在开发工具中添加的。在一些实例中,开发工具中还可以包括诸如DSL模板,DSL的语法规则描述文件,界面描述示例等。
在一些示例中,基础界面描述语言文件用于描述原生控件,对原生控件应用基础UI编程能力。DSL文件用于声明控件的自定义UI编程能力。比如,可以在DSL文件中对原生控件应用自定义UI编程能力;再比如,还可以在DSL文件中声明自定义控件,对自定义控件应用自定义UI编程能力。
在一种实现方式中，基础界面描述语言文件和DSL文件分别设置于开发工具文件夹的不同路径。示例性的，基础界面描述语言用一个或多个xml格式的文件承载，DSL用一个或多个json格式的文件承载。例如，如图8所示，以Android平台作为通用OS平台为例，开发者在开发工具中创建一个App的文件夹app。文件夹app的res目录下集成了一个AndroidManifest.xml文件。开发者可以在该xml格式的文件中声明使用的基础UI编程能力。文件夹app的assets目录下集成了一个huawei_dsl.json文件。开发者可以在该json格式的DSL文件中声明使用的自定义UI编程能力。
可以理解的,上述将基础界面描述语言文件和DSL文件分别设置于开发工具文件夹的不同路径,仅是为了OEM OS的UI引擎可以区分基础界面描述语言文件和DSL文件。在实际应用中,也可以采用其他方式区分基础界面描述语言文件和DSL文件。比如,对基础界面描述语言文件和DSL文件预置不同的标签,OEM OS的UI引擎可以按照预置标签分别获取到基础界面描述语言文件和DSL文件。本申请实施例对此并不进行限定。
开发者在开发工具中完成App开发,进行编译后生成App的安装包。该基础界面描述语言文件和DSL文件集成到App安装包中,以使得OEM OS的UI引擎可以读取基础界面描述语言文件和DSL文件。在一种实现方式中,基础界面描述语言文件和DSL文件在App安装包中的存储位置与其在开发工具中App的文件夹中的位置一致。
在一种示例中,DSL文件使用标准的json格式;示例性的,DSL文件包括version、app和layout等内容。
其中,version表示DSL文件的版本号;示例性的,version格式为x.y.z,x指示产品,y表示产品的子系统,z表示开发次数;例如,可以是101.1.003。
app内容块用于声明作用于DSL文件所在的App安装包内App全局控件的自定义UI编程能力。示例性的,app内容块格式为:
(app内容块格式的代码段，原文为图片)
其中,feature_name为自定义UI编程能力的属性。value为自定义UI编程能力的属性值。
layout内容块用于声明作用于一个布局(layout)中控件的自定义UI编程能力。示例性的,layout内容块格式为:
(layout内容块格式的代码段，原文为图片)
其中,layoutId用于指示一个layout;比如,layoutId为一个layout的标识。widgetId用于指示layout中的一个控件;比如,widgetId是控件标识。prop_name为自定义UI编程能力的属性,表示自定义UI编程能力的一个特征;比如,自定义UI编程能力使能,自定义UI编程能力的优先级,自定义UI编程能力的参数等。value为自定义UI编程能力的属性值,属性值用于指定属性的值;比如,属性为自定义UI编程能力使能,相应的,属性值为true表示使能自定义UI编程能力,属性值为false表示不使能自定义UI编程能力。
示例性的,DSL文件中包括以下代码段:
(DSL代码段，原文为图片)
其中,版本号为101.1.003。自定义UI编程能力zoom(缩放)的属性值为使能,即App全局的控件使能zoom(缩放)能力。App中名称为R.layout.mainpage的layout中名称为R.id.edit_text的控件使能onSearch(搜索)能力,onSearch的属性值为com.app.Search$onSearchPrice(即搜索功能的具体执行动作在com.app.Search$onSearchPrice中定义)。
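结合上述字段说明，该代码段可示意性地还原如下(原文代码以图片形式给出，此处仅为依据上下文的示意重构，zoom的具体属性值以原文为准)：

```json
{
  "version": "101.1.003",
  "app": {
    "zoom": true
  },
  "layout": {
    "R.layout.mainpage": {
      "R.id.edit_text": {
        "onSearch": "com.app.Search$onSearchPrice"
      }
    }
  }
}
```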
可以理解的,DSL文件中可以包括更少或更多的字段。比如,DSL文件中可以包括 version和layout内容块,不包括app内容块。再比如,layout内容块中可以包括多个控件的描述字段。再比如,对一个控件可以使能多个自定义UI编程能力。本申请实施例对此并不进行限定。
本申请实施例的自定义UI编程能力可以包括视觉属性能力,布局能力,统一交互能力,动效能力等。开发者可以在DSL文件中声明自定义UI编程能力,以使用OEM OS提供的自定义UI编程能力。
示例性的,UI的视觉属性体现为控件的视觉属性。OEM OS为控件定义了一套视觉参数变量,用于描述控件的视觉属性;这一套视觉参数变量可以用于多品牌或多设备视觉属性的切换。开发者描述UI的视觉属性时,使用视觉参数变量(在电子设备运行时动态获取与品牌或者电子设备本身匹配的属性值)即可,不需要开发者指定具体的变量值。在DSL文件中声明视觉参数变量即使得控件具备了相应的视觉属性能力。
DSL文件中使用视觉属性能力的示例如下:
(视觉属性能力示例代码段，原文为图片)
在该示例中,控件R.id.textview的视觉属性textColor的属性值为emuiColor1;控件R.id.image的视觉属性foreground的属性值为emui_color_bg。其中,emuiColor1和emui_color_bg为视觉参数变量,在不同的品牌或设备上映射为不同的颜色值。OEM OS中预置不同的品牌或设备上视觉参数变量与颜色值的映射关系,避免了开发者在不同的品牌或设备上指定textColor以及foreground属性值的重复劳动。
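按照上述说明，视觉属性能力的声明可示意如下(原文代码以图片形式给出；其中layout标识R.layout.mainpage为示意性假设)：

```json
{
  "layout": {
    "R.layout.mainpage": {
      "R.id.textview": { "textColor": "emuiColor1" },
      "R.id.image": { "foreground": "emui_color_bg" }
    }
  }
}
```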
OEM OS提供自适应布局能力,以构建响应式UI,使得UI的布局可以适配不同大小、形态的显示屏;避免开发者为不同的设备进行不同的布局工作。示例性的,自适应布局能力包括自动拉伸、隐藏、折行、均分、占比、延伸等布局能力。在一种示例中,OEM OS提供的自适应布局能力适用于LinearLayout布局以及布局中的控件。
下面示例性示出几种布局能力。
一、自适应布局能力总开关,用于打开或者关闭控件的自适应布局能力。在一种示例中,自适应布局能力总开关打开,才能使能布局能力中任意一种。
1、字段定义
能力 属性 字段定义 属性所属
总开关 打开自适应布局能力 use_HwConstraintLayout 布局
其中,“能力”用于指示自定义UI编程能力。“属性”表示自定义UI编程能力的特征参数。“属性所属”表示对属性作用的分类;比如属性所属为布局,表示该属性用于对控件布局;比如属性所属为子元素,表示该属性用于描述控件。
二、自动拉伸能力。控件使能自动拉伸能力,则可以实现控件在UI根据窗口大小自动拉伸,以适应窗口大小。
1、字段定义
(自动拉伸能力字段定义表，原文为图片)
2、DSL文件中使用自动拉伸能力的示例
(自动拉伸能力示例代码段，原文为图片)
在该示例中,R.layout.linearlayout_vertical布局中使能控件的垂直拉伸能力。该布局中控件在显示窗口发生变化时,可以自动垂直拉伸,以适应显示窗口大小。
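该示例可示意性地还原如下(原文代码以图片形式给出；字段名stretch_vertical为示意性假设，实际字段名以原文字段定义表为准)：

```json
{
  "layout": {
    "R.layout.linearlayout_vertical": {
      "use_HwConstraintLayout": true,
      "stretch_vertical": true
    }
  }
}
```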
三、隐藏能力。控件使能隐藏能力,则可以在UI隐藏。
1、字段定义
(隐藏能力字段定义表，原文为图片)
2、DSL文件中使用隐藏能力的示例
(隐藏能力示例代码段，原文为图片)
在该示例中,R.layout.mainpage布局中控件R.id.container使能垂直隐藏能力。R.id.container中R.id.image1的垂直隐藏的优先级为2,R.id.image2的垂直隐藏的优先级为1。
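该示例可示意性地还原如下(原文代码以图片形式给出；字段名hide_vertical、hide_priority为示意性假设，实际字段名以原文字段定义表为准)：

```json
{
  "layout": {
    "R.layout.mainpage": {
      "R.id.container": { "hide_vertical": true },
      "R.id.image1": { "hide_priority": 2 },
      "R.id.image2": { "hide_priority": 1 }
    }
  }
}
```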
四、折行能力。控件使能折行能力,则可以实现在UI内,控件折为多行显示。在一种示例中,折行宽度限定值可以用于指定控件在每一行显示的最大宽度。
1、字段定义
(折行能力字段定义表，原文为图片)
2、DSL文件中使用折行能力的示例
(折行能力示例代码段，原文为图片)
在该示例中,R.layout.mainpage布局中控件使能折行能力。其中,R.id.image1的折行宽度限定值160dp,R.id.image2的折行宽度限定值160dp;表示R.id.image1在每一行显示的最大宽度值为160dp,R.id.image2在每一行显示的最大宽度值为160dp。
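该示例可示意性地还原如下(原文代码以图片形式给出；字段名wrap_enable、wrap_width为示意性假设，实际字段名以原文字段定义表为准)：

```json
{
  "layout": {
    "R.layout.mainpage": {
      "wrap_enable": true,
      "R.id.image1": { "wrap_width": "160dp" },
      "R.id.image2": { "wrap_width": "160dp" }
    }
  }
}
```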
五、均分能力。控件使能均分能力,则可以实现控件在UI内均分显示。
1、字段定义
(均分能力字段定义表，原文为图片)
2、DSL文件中使用均分能力的示例
(均分能力示例代码段，原文为图片)
在该示例中,R.layout.mainpage布局中控件使能均分能力。其中,R.id.image1均分类型为spread。
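该示例可示意性地还原如下(原文代码以图片形式给出；字段名average_enable、average_type为示意性假设，实际字段名以原文字段定义表为准)：

```json
{
  "layout": {
    "R.layout.mainpage": {
      "average_enable": true,
      "R.id.image1": { "average_type": "spread" }
    }
  }
}
```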
六、占比能力。控件使能占比能力,表示支持控件在指定方向上按照指定的百分比占据布局总大小。
1、字段定义
(占比能力字段定义表，原文为图片)
2、DSL文件中使用占比能力的示例
(占比能力示例代码段，原文为图片)
在该示例中,R.layout.mainpage布局中控件使能垂直占比能力。其中,R.id.image1垂直占比为33.33%。
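该示例可示意性地还原如下(原文代码以图片形式给出；字段名ratio_vertical为示意性假设，实际字段名以原文字段定义表为准)：

```json
{
  "layout": {
    "R.layout.mainpage": {
      "R.id.image1": { "ratio_vertical": "33.33%" }
    }
  }
}
```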
七、延伸能力。控件使能延伸能力,则可以实现控件根据显示屏尺寸在UI上延伸显示。在一种示例中,露出值用于指定UI上可显示的最后一个控件在UI上的露出特征。
1、字段定义
(延伸能力字段定义表，原文为图片)
2、DSL文件中使用延伸能力的示例
version:101.0.100
(延伸能力示例代码段，原文为图片)
在该示例中,R.layout.mainpage布局中控件使能延伸能力。其中,R.id.image1使能露出特征能力,露出值为40dp。
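该示例可示意性地还原如下(原文代码以图片形式给出；字段名extend_enable、expose_value为示意性假设，实际字段名以原文字段定义表为准)：

```json
{
  "version": "101.0.100",
  "layout": {
    "R.layout.mainpage": {
      "extend_enable": true,
      "R.id.image1": { "expose_value": "40dp" }
    }
  }
}
```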
OEM OS还提供统一交互能力,支持开发者基于行为定义控件的响应。在一种示例中,统一交互能力包括搜索、缩放等。开发者可以在DSL文件中声明统一交互能力,使得控件具备搜索、缩放等能力。
开发者基于通用OS平台开发UI时,开发者定义事件对应的行为。比如,定义鼠标双击事件对应“确认”行为,定义在显示屏上的手指单击事件对应“确认”行为,以及定义其他事件与“确认”行为的对应关系。开发者的工作量较大。本申请实施例提供的OEM OS支持开发者直接定义对“确认”行为的响应(即定义行为对应的统一交互能力),而不需定义与“确认”行为对应的事件;事件与行为的映射关系由OEM OS完成。OEM OS将不同形态的电子设备触发的事件映射为同一行为(比如,将PC上的鼠标双击事件映射为“确认”行为,将手机上的手指单击事件映射为“确认”行为),避免开发者针对不同形态的电子设备定义事件和行为的对应关系,带来重复劳动。
DSL文件中使用搜索能力的示例如下:
(搜索能力示例代码段，原文为图片)
在该示例中,控件R.id.sample_text具备onSearch(搜索)能力。电子设备接收到用户在控件R.id.sample_text的“确认”行为(比如,接收到鼠标双击R.id.sample_text事件,接收到手指单击R.id.sample_text事件等),执行在com.sample.SearchImplSample$onSearchSample中定义的搜索功能。
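该示例可示意性地还原如下(原文代码以图片形式给出；其中layout标识R.layout.mainpage为示意性假设)：

```json
{
  "layout": {
    "R.layout.mainpage": {
      "R.id.sample_text": {
        "onSearch": "com.sample.SearchImplSample$onSearchSample"
      }
    }
  }
}
```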
DSL文件中使用缩放能力的示例如下:
(缩放能力示例代码段，原文为图片)
在该示例中,控件R.id.sample_text具备onZoom(缩放)能力。电子设备接收到用户在控件R.id.sample_text的“确认”行为(比如,接收到鼠标双击R.id.sample_text事件,接收到手指单击R.id.sample_text事件等),执行在com.sample.ZoomImplSample$onZoomSample中定义的缩放功能。
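该示例可示意性地还原如下(原文代码以图片形式给出；其中layout标识R.layout.mainpage为示意性假设)：

```json
{
  "layout": {
    "R.layout.mainpage": {
      "R.id.sample_text": {
        "onZoom": "com.sample.ZoomImplSample$onZoomSample"
      }
    }
  }
}
```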
OEM OS还提供增强的动效能力,使得控件的动效更具表现力。OEM OS提供的动效能力适用于Button及其子类;支持App全局使能,或者针对控件使能。
在一种示例中,动效能力包括Button控件的点击回弹微动效(字段定义:reboundAnimation)。
DSL文件中使用动效能力的示例如下:
(app内容块动效能力示例代码段，原文为图片)
在app内容块中声明reboundAnimation,App内所有适用范围内的控件使能点击回弹微动效。
(layout内容块动效能力示例代码段，原文为图片)
在layout内容块中声明reboundAnimation,目标控件使能点击回弹微动效。
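结合上述两种声明方式，动效能力的使用可示意如下(原文代码以图片形式给出；控件标识R.id.button1为示意性假设)：

```json
{
  "app": {
    "reboundAnimation": true
  },
  "layout": {
    "R.layout.mainpage": {
      "R.id.button1": { "reboundAnimation": true }
    }
  }
}
```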
示例性的,图9示出了App运行时生成UI的一种流程示意图。
S401、扩展UI引擎的流程控制模块读取基础界面描述语言文件,并调用基础UI引擎解析和执行基础界面描述语言,构建出基础UI。
其中,基础UI的控件使用了基础UI编程能力。
S402、扩展UI引擎的流程控制模块调用DSL文件加载模块读取和加载DSL文件。
S403、解析引擎对DSL文件中内容进行语法校验、解析和预处理,获取与执行引擎匹配的数据格式。
在一种示例中,DSL语法校验子模块对DSL文件中内容进行语法校验,如果校验通过,DSL解析子模块对DSL文件中字段进行解析。进一步的,DSL预处理子模块对DSL文件进行预处理获取与执行引擎匹配的数据格式。
S404、执行引擎在S401构建的基础UI上,根据DSL文件内容,以控件为单位构建 增强的UI。
在一种实现方式中,控件构建子模块从语义支持库中依次获取DSL文件中字段对应的语义处理组件。比如,从语义支持库中获取“onSearch”字段的语义处理组件SearchHandler。进一步的,控件构建子模块通过DSL适配层将自定义UI编程能力应用于控件,构建增强的UI。
示例性的,图10示出了电子设备响应用户在UI的操作的一种流程示意图。
S501、执行引擎创建一个事件代理,将事件代理通过DSL适配层注册给UI。
S502、OEM OS监听UI的用户操作事件,并上报用户操作事件至事件代理。
S503、事件代理实现事件与行为之间的映射。
S504、解释执行引擎解释和执行DSL文件中的代码,根据行为实现DSL文件中指定的响应,完成响应用户在UI的操作。
本申请实施例提供的用户接口界面实现方法，支持App中包括原生控件、自定义控件，还支持在原生控件中应用自定义UI编程能力。其中，原生控件为通用OS(比如Android)支持的控件，通用OS为原生控件提供基础UI编程能力；自定义控件为通用OS不支持，OEM OS支持的控件，OEM OS为自定义控件提供自定义UI编程能力。
请参考图11,其示出了OEM OS构建原生控件的一种流程示意图。OEM OS 1101的App开发工程包中包括基础界面描述语言文件,基础UI引擎1110的流程控制1111指示解析引擎1112对基础界面描述语言文件进行处理。解析引擎1112读取、加载基础界面描述语言文件,将基础界面描述语言文件转化为与执行引擎1113匹配的数据格式。执行引擎1113根据基础界面描述语言文件内容构建基础UI,生成原生控件1130。
请参考图12,其示出了OEM OS在原生控件应用自定义UI编程能力的一种流程示意图。OEM OS 1101的App开发工程包中包括基础界面描述语言文件以及DSL文件。扩展UI引擎1120的流程控制1121指示基础UI引擎1110中的解析引擎1112对基础界面描述语言文件进行处理。解析引擎1112读取、加载基础界面描述语言文件,将基础界面描述语言文件转化为与执行引擎1113匹配的数据格式。执行引擎1113根据基础界面描述语言文件内容构建基础UI,生成原生控件1130。流程控制1121指示扩展UI引擎1120中的解析引擎1122对DSL文件进行处理。解析引擎1122读取、加载DSL文件,将DSL文件转化为与执行引擎1123匹配的数据格式。执行引擎1123根据DSL文件在原生控件1130应用自定义UI编程能力。
请参考图13,其示出了OEM OS构建自定义控件的一种流程示意图。OEM OS 1101的App开发工程包中包括基础界面描述语言文件以及DSL文件。扩展UI引擎1120的流程控制1121指示基础UI引擎1110中的解析引擎1112对基础界面描述语言文件进行处理。解析引擎1112读取、加载基础界面描述语言文件,将基础界面描述语言文件转化为与执行引擎1113匹配的数据格式。执行引擎1113根据基础界面描述语言文件内容构建基础UI,生成原生控件1130。流程控制1121指示扩展UI引擎1120中的解析引擎1122对DSL文件进行处理。解析引擎1122读取、加载DSL文件,将DSL文件转化为与执行引擎1123匹配的数据格式。执行引擎1123根据DSL文件在基础UI上生成自定义控件1140。
本申请实施例提供的OEM OS包括基础UI引擎和扩展UI引擎。电子设备构建UI时,基础UI引擎用于解释和执行基础界面描述语言,生成基础的UI(具备基础UI编程能力); 扩展UI引擎用于解释和执行DSL,在基础的UI上叠加自定义UI编程能力。本申请实施例提供的用户接口界面实现方法能够适应多种OS平台,提供丰富的UI编程能力;技术实现难度低,方便开发者使用。
本申请实施例还提供一种用户接口界面实现方法,实现难度低,方便开发者使用。
随着物联网(Internet of things,IoT)快速发展,IoT设备类型与数量均快速增长。不同的IoT设备拥有差异化的屏幕尺寸以及用户交互方式。例如,手机的屏幕尺寸大多在4-6寸左右,用户交互方式以触摸、点击显示屏为主;电视的屏幕尺寸可达50寸或更大,用户交互方式通常为遥控器操作;车机等设备具有更为广泛的屏幕形态以及用户交互方式。目前通用的OS平台(比如
Figure PCTCN2021108273-appb-000064
)支持的开发方式为,针对同一App中的同一UI,开发者对每一种类型的电子设备设计不同的界面描述文件。显然,使用这种方法为各类设备开发差异化UI的工作量大,开发难度高。本申请实施例提供一种用户接口界面实现方法及装置,可以实现一次开发,多设备部署;即开发一套界面描述文件,适用于各种不同类型的电子设备;降低开发者的开发难度。
比如,
Figure PCTCN2021108273-appb-000065
作为开源OS,在便携式电子设备上被广泛使用。基于
Figure PCTCN2021108273-appb-000066
开发App的UI时,对同一App中的同一UI,开发者针对每一种类型的电子设备设计不同的界面描述文件,以实现为不同类型的电子设备开发差异化UI。
Figure PCTCN2021108273-appb-000067
支持为每种类型的电子设备分别设立布局文件夹的方式实现多类设备UI独立开发。比如,在布局文件夹的名称中增加后缀来区分不同的布局文件夹。这样,针对同一UI,不同类型的电子设备读取不同布局文件夹中的界面描述文件,呈现不同的UI显示效果。如果需要将该App安装在其他类型的电子设备上,则需要单独为该类型的电子设备开发对应的界面描述文件(单独增加一个布局文件夹)。这样的开发方式中,开发者需要为不同类型的电子设备单独开发对应的界面描述文件,开发工作量大,难度高。
还有一些厂商提供了一些完整的、独立于
Figure PCTCN2021108273-appb-000068
)的UI编程框架。例如Flutter、React Native、Weex等框架。这种UI编程框架包含UI界面描述语言及对应的解析执行引擎，提供独立的界面控件库、布局引擎和渲染引擎等，可以跨设备运行，但是兼容性差。
本申请实施例提供一种用户接口界面实现方法及装置。请参考图14,开发者在电子设备200(开发者设备)中打开一种开发工具(比如,DevEco Studio),在开发工具的开发界面生成界面描述文件。示例性的,开发者打开开发工具的开发界面,开发界面包括界面描述文件初始版。界面描述文件初始版可以是空白文件,也可以包含简单示例。可以理解的,界面描述文件初始版可以是开发工具中预置的,也可以是开发者在开发工具中添加的。在一些示例中,开发工具中还可以包括诸如界面描述语言模板,界面描述语言语法规则描述文件,界面描述示例等,本申请实施例中不再赘述。
进一步的,开发者可以在界面描述文件初始版中采用一种界面描述语言添加界面描述和界面行为定义,形成用于发布的界面描述文件。在一种实现方式中,开发者针对App中每个UI生成一个界面描述文件;比如,可以在一个文件夹中生成多个界面描述文件,每个界面描述文件对应一个UI。
在开发者设备上生成App的安装包，其中包含界面描述文件。App的安装包上传至服务器，App在服务器提供的应用市场中发布。用户可使用用户侧电子设备（上述电子设备100）在应用市场中下载App的安装包。用户侧电子设备运行App的安装包之后，获取安装包中的界面描述文件；当用户侧电子设备运行App时，按照界面描述文件在显示屏显示与该电子设备相匹配的UI。
在一种示例中，界面描述文件采用json格式。示例性的，如图14所示，App的安装包中包括文件夹“app”410。文件夹“app”410的src\main\assets目录下包括布局文件夹“layout”411，布局文件夹“layout”411中包括界面描述文件layout1.json 412、layout2.json 413和layout3.json 414等，每个界面描述文件分别对应App的一个UI。手机420、车机430、电视440、手表450等不同类型的用户侧电子设备都运行“layout”411中同一个界面描述文件，并分别显示同一UI的不同显示效果。比如，手机420、车机430、电视440和手表450都解析执行layout1.json 412，并分别显示layout1.json 412对应UI的不同显示效果。
本申请实施例中,对同一App中的同一UI,开发者在一个界面描述文件中开发一套代码即可实现为不同类型的电子设备开发差异化UI。不同类型的电子设备读取同一UI的同一个界面描述文件,可以呈现不同的UI显示效果。可以实现开发一套界面描述文件,适用于各种不同类型的电子设备,降低了开发者的开发难度。
本申请实施例提供的用户接口界面实现方法,支持在界面描述文件中使用
Figure PCTCN2021108273-appb-000069
原生的UI编程能力,以及操作系统自定义的UI编程能力。
Figure PCTCN2021108273-appb-000070
原生的UI编程能力使得控件具备
Figure PCTCN2021108273-appb-000071
原生的控件属性,操作系统自定义的UI编程能力使得控件具备扩展的视觉属性、布局属性、交互属性、动效属性和软硬件依赖属性等。电子设备运行App的安装包,获取安装包中的界面描述文件后;当用户在电子设备上运行App时,电子设备可以按照界面描述文件在显示屏呈现相应的UI,该UI中的控件可以包括
Figure PCTCN2021108273-appb-000072
原生的控件属性,还可以包括扩展的控件属性。本申请实施例提供的自定义UI引擎支持对
Figure PCTCN2021108273-appb-000073
原生控件属性以及操作系统中扩展的所有控件属性进行解析执行。
在一种示例中,图15示出了电子设备100的一种软件架构。如图15所示,电子设备100软件系统可以包括应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。在一种示例中,应用程序层,安卓运行时(Android runtime)和系统库,以及内核层可以参考图3中
Figure PCTCN2021108273-appb-000074
软件架构中相应的描述,此处不再赘述。本申请实施例提供的电子设备100的软件系统部分复用常规技术中UI编程框架,易于学习,降低开发者的开发难度。
应用程序框架层的操作系统包括自定义UI引擎11。自定义UI引擎11用于解析和执行App的界面描述文件,生成App的UI。自定义UI引擎11可以包括UI解析引擎11a,UI执行引擎11b,MVVM(model-view-viewmodel)框架11c,语法语义库11d和UI渲染引擎11e。可以理解的,应用程序框架层还可以包括更多模块,可以参考常规技术,本申请实施例对此并不进行限定。
下面结合附图对上述各模块进行详细介绍。
语法语义库11d包括界面描述文件中所有字段的语法语义规范集合,比如,变量接口、公共字段、视觉属性、布局属性、交互属性、动效属性、软硬件依赖属性等字段定义和语法。其中,布局属性是指UI中各个控件的布局;比如控件的形状、位置、大小等。视觉属性是指控件的颜色、灰度等视觉效果。交互属性是基于用户行为提供控件响应的能力;比如基于用户的“确认”行为执行搜索。动效属性是指在控件上显示动画效果;比如在控件上显示点击回弹动效等。软硬件依赖属性是指控件依赖设备的软硬件参数。
开发者需要按照语法语义库11d中定义的语法语义规范在界面描述文件中添加代码,开发UI。下面从布局编排,数据&界面绑定、交互行为编排以及差异化描述等方面对语法语义库11d中定义的语法语义规范进行介绍。
以界面描述文件采用json格式为例,示例性的,界面描述文件可以包括如下结构:
Figure PCTCN2021108273-appb-000075
其中,meta-data包括版本号等信息。示例如下:
"meta-data":{
   "version":"10.0.1008"
}
version表示界面描述文件的版本号;示例性的,version格式为x.y.z,x指示产品,y表示产品的子系统,z表示开发次数。界面描述文件的版本需要与自定义UI引擎的版本匹配,比如,自定义UI引擎的版本与界面描述文件的版本一致,或者比界面描述文件的版本更新,才能成功解析界面描述文件。
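上述版本匹配规则可以用如下示意代码表达（假设性示例：类名、方法名均为示意，且假设“更新”按x、y、z逐级数值比较判断，并非本申请的实际实现）：

```java
// 示意：界面描述文件版本与自定义UI引擎版本的匹配检查
// 假设版本号格式为 x.y.z；引擎版本与描述文件版本一致或更新时视为匹配
class VersionChecker {
    // 将 "x.y.z" 解析为三个整数
    static int[] parse(String version) {
        String[] parts = version.split("\\.");
        int[] v = new int[3];
        for (int i = 0; i < 3; i++) {
            v[i] = Integer.parseInt(parts[i]);
        }
        return v;
    }

    // 引擎版本不低于描述文件版本时返回 true，表示可以成功解析
    static boolean matches(String engineVersion, String fileVersion) {
        int[] e = parse(engineVersion);
        int[] f = parse(fileVersion);
        for (int i = 0; i < 3; i++) {
            if (e[i] != f[i]) {
                return e[i] > f[i];
            }
        }
        return true; // 两个版本完全一致
    }
}
```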
import用于导入对象,model用于声明对象。示例如下:
Figure PCTCN2021108273-appb-000076
import中导入UserInfo保存的完整路径com.myapp.UserInfo,以及Context保存的完整路径com.myapp.TestActivity;在model中声明UserInfo类型的对象user,以及Context类型的对象context;这样就可以在界面描述文件(layout-data-common和layout-data-uimode)中直接调用user和context。本申请中,UserInfo、TestActivity等文件称为资源文件;资源文件包括生成该应用的UI使用的资源,该资源可以包括开发者定义的数据结构、控件、控件属性等。
import和model的具体用法在后面结合layout-data-common、layout-data-uimode和styles中的语法语义规则详细介绍。
layout-data-common用于描述通用UI。各种类型的电子设备都解析layout-data-common中内容，按照layout-data-common中的内容布局通用UI。layout-data-uimode用于描述指定设备的UI。在一种实现方式中，layout-data-uimode中声明指定设备UI与通用UI的区别。在另一种实现方式中，layout-data-uimode中声明适用于指定设备的UI的全部条件。指定设备可以为手机、手表、车机、智能家居设备（比如，智能电视、智慧屏、智能音箱等）、大屏、笔记本电脑、台式电脑等。比如，layout-data-uimode的具体形式可以包括layout-data-phone（用于手机），layout-data-watch（用于手表），layout-data-television（用于电视）等。
styles用于定义App中自定义参数。开发者可以在styles中自定义参数。
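结合上述各字段的说明，一个采用json格式的界面描述文件骨架大致如下（示意性示例：其中的类名、对象名与属性取值均为假设，仅用于说明各内容块的相对位置，并非本申请给出的实际文件）：

```json
{
  "meta-data": {
    "version": "10.0.1008"
  },
  "import": {
    "UserInfo": "com.myapp.UserInfo",
    "Context": "com.myapp.TestActivity"
  },
  "model": {
    "user": "UserInfo",
    "context": "Context"
  },
  "layout-data-common": {
    "TextView(text:$user.name)": {}
  },
  "layout-data-watch": {
    "TextView(text:$user.name)": {
      "textSize": "@dimen/mySize"
    }
  },
  "styles": {
    "myTextStyle": "#FF0000"
  }
}
```

各种类型的电子设备解析layout-data-common，而手表这类指定设备还会解析layout-data-watch等内容块。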
一、布局编排
App中所有UI是由控件构成的。UI的布局编排即编排UI中控件的属性。
1、控件
自定义UI引擎11支持所有
Figure PCTCN2021108273-appb-000077
原生控件以及操作系统中拓展的控件,还支持开发者在App中自定义的或者通过静态包集成的控件。其中,控件具体可以包括文本控件,例如TextView控件、EditText控件等,也可以包括按钮控件,例如Button控件、ImageButton控件等,还可以包括图片控件,例如Image控件等,本申请实施例对此不做任何限制。
对于
Figure PCTCN2021108273-appb-000078
原生控件以及操作系统中拓展的控件,可以在layout-data-common或layout-data-uimode中直接调用控件名称。例如,
Figure PCTCN2021108273-appb-000079
原生控件可以包括TextView,EditText等;操作系统中拓展的控件可以包括HwButton等。一种声明控件的示例如下,在该示例中,声明了
Figure PCTCN2021108273-appb-000080
原生控件TextView和EditText。
Figure PCTCN2021108273-appb-000081
对于开发者在App中自定义的或者通过静态包集成的控件,需要在import中引入控件的资源包的完整包名。这样,就可以在layout-data-common或layout-data-uimode中调用。一种声明App中自定义控件的示例如下,在该示例中,先在import中引入控件MyCircleView的资源包的完整包名com.myapp.widget.MyCircleView,然后在layout-data-common中直接调用MyCircleView。
Figure PCTCN2021108273-appb-000082
在一种实现方式中,自定义UI引擎11支持开发者对控件进行别名指定。一种示例如下,在该示例中,在import中引入MyCircleView的资源包的完整包名com.myapp.widget.MyCircleView时,将MyCircleView的名称指定为AliasName。
Figure PCTCN2021108273-appb-000083
在一种实现方式中,在layout-data-common或layout-data-uimode中调用控件时,以ComponentName():{}的形式进行控件声明。例如,TextView():{},表示声明了一个TextView。
2、控件的属性
自定义UI引擎11支持的控件属性包括
Figure PCTCN2021108273-appb-000084
原生属性以及操作系统中扩展的视觉属性、布局属性、交互属性、动效属性和软硬件依赖属性等。
使用ComponentName():{}方式描述一个控件时,在一种实现方式中,可以在{}中传入控件的属性及属性值,格式为“属性1:属性值1,属性2:属性值2”。示例如下,该示例中,声明了控件TextView,并在{}中传入TextView的属性textSize,textSize的属性值为@dimen/mySize。
Figure PCTCN2021108273-appb-000085
在另一种实现方式中,可以在()中传入控件的属性及属性值。示例如下,该示例中,声明了控件TextView,并在()中传入TextView的属性text,text的属性值为@string/text_name。
{
  "TextView(text:@string/text_name)":{}
}
在一种实现方式中,如果对于同一个控件,在()和{}中都传入控件属性及属性值,则忽略()中内容。
其中，控件属性的属性值可以通过以下任意一种方式赋值：直接通过字符串值指定；访问后台数据中定义的资源值；访问后台数据中声明的分类参数；或者，访问控件模型（ViewModel）对象中的值。
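上述几种赋值方式可以示意如下（假设性示例：属性名与取值仅用于说明格式，具体语法以语法语义库的定义为准）：

```json
{
  "TextView()": {
    "text": "Hello",
    "textSize": "@dimen/mySize",
    "textColor": "$style.myTextStyle",
    "visible": "$user.visible"
  }
}
```

四个属性值分别对应直接字符串值、资源值、分类参数以及ViewModel对象中的值。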
自定义UI引擎11支持通过namespace.propertyName方式来指定控件属性的命名空间(namespace)。在一种实现方式中,不指定命名空间表示默认为
Figure PCTCN2021108273-appb-000086
的命名空间。在一种实现方式中,自定义UI引擎11支持使用命名空间androidhwext指向操作系统中扩展资源包,使用命名空间app指向App中自定义的资源包。操作系统中扩展资源包提供操作系统中自定义的UI编程能力;App中自定义的资源包提供App中自定义的控件属性。
在一种实现方式中，开发者还可以定义其他命名空间。开发者定义的命名空间通过import引入，并提供定义控件属性的资源包的包名。示例如下，在该示例中，import中引入开发者定义的命名空间myspace，myspace的资源包的完整包名为com.myapp。在import中引入myspace后，即可在layout-data-common中调用myspace中属性borderWidth。
Figure PCTCN2021108273-appb-000087
二、数据&界面绑定
自定义UI引擎11支持将UI中的元素与后台数据进行双向绑定。可以在界面描述文件(layout-data-common或layout-data-uimode)中声明并指定UI中的元素(比如控件、控件组)与后台数据之间的绑定关系。自定义UI引擎11中MVVM框架11c即可实现根据UI改变刷新后台数据,或者根据后台数据改变自动刷新对应UI。
示例性的,可以将UI中的元素与一个控件模型(ViewModel)对象绑定起来。先在import中引入ViewModel,在model中声明一个ViewModel类型的对象。然后在layout-data-common或layout-data-uimode中调用该ViewModel对象。
在一种示例中,将UI中控件属性的属性值绑定为ViewModel对象的值。示例如下,在该示例中,在import中引入UserInfo(UserInfo为一个ViewModel)的资源包的完整包名com.myapp.UserInfo,并在model中声明一个UserInfo类型的对象user;然后在layout-data-common中访问user中的数据。
Figure PCTCN2021108273-appb-000088
Figure PCTCN2021108273-appb-000089
在一种实现方式中,通过$model.field方式访问ViewModel对象(model)中的变量值(field);例如上述$user.photo为访问user中变量photo,$user.name为访问user中变量name。在一种实现方式中,通过$model::function方式访问ViewModel对象(model)中的函数(function)返回值。例如上述$user::hasName为访问user中函数hasName的返回值。
上述示例中,将控件ImageView的属性imageUri(图像)与后台数据user.photo绑定起来,将一个控件TextView的属性text(文本)与后台数据user.name绑定起来,将一个控件TextView的属性text与后台数据user.age绑定起来,将控件CheckBox的属性checked(确认)与后台数据user.agreed绑定起来,将一个控件TextView的属性visible(可见)与后台数据user::hasName绑定起来。当后台数据发生变化时,控件属性的属性值发生改变,UI中控件的显示效果改变。
在一种实现方式中,可以根据后台数据获取控件的可见性,实现UI的隐藏局部的功能。当后台数据中变量变化(可见变为不可见,或不可见变为可见),UI中控件可以随之隐藏或显示。示例如下,在该示例中,一列控件(Column)的可见性(visible)由变量user.visible的值决定。
Figure PCTCN2021108273-appb-000090
在另一种示例中,在UI上接收用户输入,将用户输入绑定至ViewModel对象的值。示例如下,在该示例中,在import中引入UserInfo(UserInfo为一个ViewModel)的资源包的完整包名com.myapp.UserInfo,并在model中声明一个UserInfo类型的对象user;然后在layout-data-common中声明,将控件EditText的属性text(文本)的用户输入值赋值给user中的变量name。其中通过“=”为后台数据赋值。
Figure PCTCN2021108273-appb-000091
Figure PCTCN2021108273-appb-000092
三、交互行为编排
自定义UI引擎11支持在界面描述文件中声明控件响应事件对应的执行动作。控件支持的事件范围由控件支持的事件监听决定。例如,按钮控件(Button)支持对点击事件的监听setOnClickListener,则可以在界面描述文件中对控件绑定onClick(点击)事件。自定义UI引擎11将事件的参数和后台数据中的响应函数返回值在控件和后台数据之间进行双向透传。示例如下,layout-data-common中声明控件Button响应于事件onClick执行后台数据context.buttonClick中定义的动作(响应函数返回值)。
Figure PCTCN2021108273-appb-000093
自定义UI引擎11支持UI执行引擎加载控件的生命周期事件包括onPreMount、onMount、onUnmount、onPreUpdate和onUpdate等;其中,onPreMount表示控件挂载到UI之前调用;onMount表示控件挂载到UI后调用;onUnmount表示控件在UI上移除后调用;onPreUpdate表示数据变化引起UI刷新前调用;onUpdate表示当后台数据变化引起UI刷新后调用。
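在界面描述文件中绑定交互事件与生命周期回调的一种可能写法如下（假设性示例：具体字段语法以语法语义库11d的定义为准，context中的函数名均为示意）：

```json
{
  "Button(text:@string/ok)": {
    "onClick": "$context.buttonClick",
    "onMount": "$context.onButtonMount"
  }
}
```

即控件Button在被点击时执行后台数据context中定义的buttonClick动作，在挂载到UI后执行onButtonMount。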
在一种实现方式中,事件消费与否由响应函数返回值决定。可选的,自定义UI引擎11遵循
Figure PCTCN2021108273-appb-000094
原生的接口定义,将后台数据中的处理结果透传给控件。
四、差异化描述
1、自定义UI引擎11支持控件的属性依赖于电子设备的配置参数。
操作系统中定义了电子设备配置参数的变量。界面描述文件中可以声明电子设备配置参数的变量。当电子设备运行界面描述文件时,访问电子设备的配置参数,电子设备根据其软硬件条件获取配置参数的值。这样,不同类型的电子设备运行同一界面描述文件时,由于其软硬件条件不同,配置参数不同,生成的UI不同。
在一种实现方式中,通过$env.config方式访问当前电子设备的配置参数(config)。
示例性的,电子设备的配置参数可以包括表1所示内容:
表1
Figure PCTCN2021108273-appb-000095
示例如下,控件的dependOn属性的属性值可以赋值为配置参数中字段,用来声明控件的属性依赖某个配置参数。在该示例中,扫一扫控件(TextView)的可见性依赖于电子设备的摄像头硬件(camera_sensor);表示若电子设备存在摄像头,则该扫一扫控件显示;若电子设备不存在摄像头,该扫一扫控件不显示。
Figure PCTCN2021108273-appb-000096
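该示例（扫一扫控件可见性依赖摄像头硬件）的一种可能写法如下（示意性重构：字段组织方式为假设，具体语法以语法语义库11d的定义为准）：

```json
{
  "TextView(text:@string/scan)": {
    "visible": "true",
    "dependOn": "camera_sensor"
  }
}
```

即扫一扫控件的visible属性依赖camera_sensor配置参数：设备存在摄像头时该控件显示，不存在摄像头时不显示。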
2、layout-data-uimode用于描述指定设备的UI。
开发者可以在layout-data-uimode中声明指定设备的UI,指定设备的UI与通用UI的显示效果是有区别的。
在一种实现方式中,layout-data-uimode中声明适用于指定设备的UI的全部条件。示例性的,请参考图16,界面描述文件710包括layout-data-common 711和layout-data-watch 712等代码块。layout-data-common 711用于描述适用于各种类型电子设备的通用UI,layout-data-watch 712用于描述适用于手表的UI。如图16所示,layout-data-common 711中声明了通用UI中各个控件的属性及属性值。手机读取界面描述文件710,解析执行layout-data-common 711中内容,按照layout-data-common 711中声明的各个控件的属性及属性值生成相应的控件。示例性的,手机根据layout-data-common 711中内容块7111对应生成图片控件721,layout-data-common 711中内容块7112对应生成控件组722, layout-data-common 711中内容块7113对应生成控件组723,layout-data-common 711中内容块7114对应生成按钮控件724,layout-data-common 711中内容块7115对应生成控件组725。这样,根据内容块7111、内容块7112、内容块7113、内容块7114和内容块7115生成了手机的UI 720。
layout-data-watch 712中声明了适用于手表的UI中控件的属性及属性值。手表读取界面描述文件710,确定界面描述文件710中存在用于手表的layout-data-watch 712,则解析执行layout-data-watch 712中内容,按照layout-data-watch 712中声明的各个控件的属性及属性值生成相应的控件。示例性的,手表根据layout-data-watch 712中内容块7121对应生成图片控件731,layout-data-watch 712中内容块7122对应生成控件组732,layout-data-watch 712中内容块7123对应生成控件组733,layout-data-watch 712中内容块7124对应生成按钮控件734。这样,根据内容块7121、内容块7122、内容块7123和内容块7124生成了手表的UI 730。
这样,手表作为一种指定设备读取第二代码段(layout-data-watch 712)中内容,除手表外的电子设备读取第一代码段(layout-data-common 711)中内容;不同类型的电子设备读取同一UI的同一个界面描述文件,即可呈现不同的UI显示效果;开发一套界面描述文件,即可实现为不同类型的电子设备开发差异化UI,降低了开发者的开发难度。
在另一种实现方式中,layout-data-uimode中声明指定设备UI与通用UI的区别。示例性的,请参考图17,界面描述文件810包括layout-data-common 811和layout-data-watch 812等代码块。layout-data-common 811用于描述适用于各种类型电子设备的通用UI,layout-data-watch 812用于描述手表的UI与通用UI的差别。如图17所示,layout-data-common 811中声明了通用UI中各个控件的属性及属性值。手机读取界面描述文件810,解析执行layout-data-common 811中内容,按照layout-data-common 811中声明的各个控件的属性及属性值生成相应的控件。手机根据layout-data-common 811中内容块8111对应生成图片控件721,layout-data-common 811中内容块8112对应生成控件组722,layout-data-common 811中内容块8113对应生成控件组723,layout-data-common 811中内容块8114对应生成按钮控件724,layout-data-common 811中内容块8115对应生成控件组725。这样,根据内容块8111、内容块8112、内容块8113、内容块8114和内容块8115生成了手机的UI 720。
layout-data-watch 812中声明了手表的UI中与通用UI有区别的控件的属性及属性值。手表读取界面描述文件810，解析执行layout-data-common 811中内容；手表确定界面描述文件810中存在用于手表的layout-data-watch 812，并解析执行layout-data-watch 812中内容；手表按照layout-data-common 811和layout-data-watch 812中声明的各个控件的属性及属性值生成相应的控件。如图17所示，手表根据layout-data-common 811中内容块8111对应生成图片控件731，layout-data-common 811中内容块8112对应生成控件组732，layout-data-common 811中内容块8113对应生成控件组733，layout-data-common 811中内容块8114对应生成按钮控件734。根据layout-data-watch 812的描述，将layout-data-common 811中内容块8115对应生成的控件组的可视属性（visible）的属性值设置为不可见（gone）。也就是说，在手表上不显示layout-data-common 811中内容块8115对应的控件组。如图17所示，手表的UI 730包括根据内容块8111、内容块8112、内容块8113和内容块8114生成的控件。
这样，所有类型的电子设备都读取第一代码段（layout-data-common 811）中内容，手表作为一种指定设备还读取第二代码段（layout-data-watch 812）中内容；不同类型的电子设备读取同一UI的同一个界面描述文件，即可呈现不同的UI显示效果；开发一套界面描述文件，即可实现为不同类型的电子设备开发差异化UI，降低了开发者的开发难度。
3、style
自定义UI引擎11支持开发者在style中自定义参数,用于当前的界面描述文件。示例如下,开发者在style中定义myTextStyle,并可以在layout-data-common中以$style.myTextStyle的方式调用该自定义参数。
Figure PCTCN2021108273-appb-000097
采用本申请实施例提供的语法语义规则开发UI,语法简洁高效,开发一套界面描述文件即可适用于不同类型的电子设备,避免了为不同类型的电子设备单独开发UI,降低了开发成本。
UI解析引擎11a用于解析界面描述文件,将界面描述文件中内容转换成与UI执行引擎11b匹配的数据格式。在一些示例中,UI解析引擎11a还可以对界面描述文件中内容进行语法校验,如果对界面描述文件语法校验成功,则对界面描述文件进行解析;如果对界面描述文件语法校验不成功,则不执行解析界面描述文件。
在一种示例中,请参考图18,UI解析引擎11a读取界面描述文件,解析界面描述文件中声明(model)、风格(style)、布局(layout-data-common、layout-data-uimode)等字段内数据,预处理后保存至数据库中。使用控件解析器解析布局字段内的数据,根据布局描述的逻辑结构递归调用UI执行引擎11b实例化控件,形成UI的控件树。使用属性解析器解析各个控件的属性字段,调用UI执行引擎11b为各个控件设置属性,完成UI绘制。
其中，控件解析器和属性解析器的工作流程如图19所示。控件解析器获取控件的名称，并获取控件的属性列表。如果存在控件的标识（identity，ID），添加控件ID；如果存在控件的风格（style），添加控件style；实例化控件形成控件队列。如果存在子布局，递归调用子布局中的控件。解析完所有布局中的控件后，增加控件，返回生成的控件。属性解析器从控件队列中获取实例化控件，读取控件对应的属性名与属性值，并将属性（包括属性名与属性值）存入哈希表。如果存在子布局，递归调用子布局中的控件。所有控件的属性都解析完成后，将哈希表中保存的各个控件的属性值赋值给对应控件。
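图19所示的递归解析流程可以用如下极简代码示意（假设性示例：Node等数据结构为示意，省略了控件ID、style处理与真正的控件实例化细节）：

```java
import java.util.*;

// 示意：控件解析器与属性解析器的递归解析流程
class LayoutParser {
    // 简化的布局节点：控件名称、属性表、子布局
    static class Node {
        String name;
        Map<String, String> attrs = new LinkedHashMap<>();
        List<Node> children = new ArrayList<>();
        Node(String name) { this.name = name; }
    }

    // 控件解析：递归实例化控件，形成控件队列（先父后子）
    static void buildQueue(Node node, List<Node> queue) {
        queue.add(node);                 // “实例化”当前控件
        for (Node child : node.children) {
            buildQueue(child, queue);    // 递归调用子布局中的控件
        }
    }

    // 属性解析：将各控件的属性名与属性值存入哈希表，最后统一赋值给对应控件
    static Map<String, Map<String, String>> parseAttrs(List<Node> queue) {
        Map<String, Map<String, String>> table = new LinkedHashMap<>();
        for (Node n : queue) {
            table.put(n.name, n.attrs);
        }
        return table;
    }
}
```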
UI执行引擎11b用于根据UI解析引擎11a解析的数据构建UI的控件(实例化控件和属性设置),对控件进行布局编排,生成界面描述文件中声明的界面;还可以实现器件事件与用户行为之间的映射,响应于用户行为执行界面描述文件中定义的用户行为对应的动作等。
示例性的,请参考图20A,在UI执行引擎11b中,为每一个控件构造一个构建(Builder)类,Builder类采用与
Figure PCTCN2021108273-appb-000098
相同的继承逻辑,以实现子控件可继承父类控件的所有属性与设置方法,无需重复定义。Builder类内包含了对应控件的实体构造方法与该控件特有的视觉属性的设置方法,以完成控件实例化与属性设置。对于开发者自定义的控件,可以在UI执行引擎11b中提供一个该自定义控件的Builder类即可,接入成本低,对开发者比较友好。
在一些实施例中,对于界面描述文件中声明的
Figure PCTCN2021108273-appb-000099
原生控件属性,UI执行引擎11b可以根据界面描述文件中的声明进行属性设置,完成控件实例化,构造出的控件具备
Figure PCTCN2021108273-appb-000100
原生控件属性。
请参考图20B,对于界面描述文件中声明的控件的扩展属性,如果确定操作系统中包括自定义UI编程能力,UI执行引擎11b根据界面描述文件中的声明进行属性设置,完成控件实例化,构造出具备扩展属性的控件;如果确定操作系统中不包括自定义UI编程能力,UI执行引擎11b根据界面描述文件中声明的扩展属性映射对应的
Figure PCTCN2021108273-appb-000101
原生控件属性,根据对应的
Figure PCTCN2021108273-appb-000102
原生控件属性进行属性设置,完成控件实例化,构造出具备
Figure PCTCN2021108273-appb-000103
原生控件属性的控件,该控件不具备扩展属性。
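上述降级逻辑可以示意如下（假设性示例：映射表内容与方法名均为假设，仅演示“操作系统支持自定义UI编程能力则按声明构造、不支持则映射为原生属性”的分支）：

```java
import java.util.*;

// 示意：扩展属性的降级处理
class AttributeMapper {
    // 假设的扩展属性 -> 原生属性映射表
    static final Map<String, String> EXT_TO_NATIVE = new HashMap<>();
    static {
        EXT_TO_NATIVE.put("HwMagicLayout", "LinearLayout");
    }

    static String resolve(String attr, boolean osHasCustomEngine) {
        if (osHasCustomEngine) {
            return attr; // 按界面描述文件中的声明直接构造，控件具备扩展属性
        }
        // 降级：映射为对应的原生属性；没有映射关系时原样返回
        return EXT_TO_NATIVE.getOrDefault(attr, attr);
    }
}
```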
示例性的,请参考图20C,在开发者设备上生成App的安装包,其中包含界面描述文件。App的安装包上传至服务器,App在服务器提供的应用市场中发布。用户可使用用户侧电子设备(上述电子设备100)在应用市场中下载App的安装包。用户侧电子设备运行App的安装包之后,获取安装包中的界面描述文件;当用户侧电子设备运行App时,按照界面描述文件在显示屏显示与该电子设备相匹配的UI。
比如,界面描述文件中包括以下内容:
Figure PCTCN2021108273-appb-000104
在一种示例中，平板460作为用户侧电子设备运行App。平板460的操作系统中包括自定义UI编程能力。平板460的UI上，自定义控件组HwMagicLayout被成功构造，该控件组具备操作系统中扩展的布局属性。示例性的，操作系统中扩展的布局属性可以包括自动拉伸、隐藏、均分、占比、延伸或折行等布局属性。其中，自动拉伸是指控件的高度或宽度根据窗口大小自动放大或缩小，以适应窗口大小。隐藏是指控件在布局中可见或不可见的能力。均分是指控件中的内容在布局中均匀分布。占比是指控件在指定方向上按照指定的百分比占据布局总大小。延伸是指控件根据显示屏尺寸在UI上延伸显示。折行是指控件中的内容在布局中通过一行或多行显示。如图20C所示，平板460的UI布局具备系统中扩展的布局属性。平板460竖屏显示时，控件461、控件组462、控件组463、控件464和控件组465竖向排列于一列。平板460横屏显示时，控件461、控件组462、控件组463和控件464竖向排列于第一列，控件组465排列于第二列。平板460的UI布局根据显示窗口大小、形状适应性调整。
平板460的UI上,交互能力"zoomEnable"在控件461“imageview”上生效。当平板460连接鼠标480时,可以响应于用户对控件461的放大操作(比如,鼠标480对应的光标置于控件461时,向上转动鼠标480的滚轮),将控件461放大显示。
在另一种示例中,平板470作为用户侧电子设备运行App。平板470的操作系统中不包括自定义UI编程能力。平板470的UI不支持自定义控件组HwMagicLayout,UI中控件具备
Figure PCTCN2021108273-appb-000105
原生的线性布局(LinearLayout)属性。平板470竖屏或横屏显示时，控件471、控件组472、控件组473、控件474和控件组475竖向排列于一列。平板470的UI布局是固定的，不能根据显示窗口大小、形状适应性调整。
平板470的UI上,交互能力"zoomEnable"不可以在控件471“imageview”上生效。即平板470连接鼠标480时,如果鼠标480对应的光标置于控件471时,向上转动鼠标480的滚轮,控件471的大小保持不变。
这样,符合语法语义库11d中语法规则的界面描述文件可以在不同的操作系统中成功运行,实现了跨操作系统平台运行,降低开发者开发难度。
需要说明的是,UI执行引擎11b在电子设备运行界面描述文件时动态解析数据,在电子设备运行界面描述文件时获取电子设备相关参数;避免了开发者在开发工具中对界面描述文件进行预编译,生成预置数据文件;这样,可以使得UI开发不依赖编译环境,实现跨开发平台开发和运行。
MVVM框架11c用于对UI中的元素与后台数据进行双向绑定。在界面描述文件中,声明并指定UI中的元素(比如控件、控件组)与后台数据之间的绑定关系,可选的,还可以进行简单的数据实例设置之后;MVVM框架11c可以实现根据UI改变刷新后台数据,以及根据后台数据改变自动刷新对应UI。帮助开发者专注于UI设计与编排,简化UI开发过程,大幅减少开发者为实现前后端数据交互投入的开发时间。
示例性的,请参考图21,开发者在界面描述文件中声明控件字段,为控件绑定属性,并绑定后台数据对象。UI解析引擎11a对界面描述文件中的绑定行为进行解析,获取控件属性与后台数据对象的对应关系。MVVM框架11c用于实现控件与后台数据的双向绑定。当后台数据改变时,MVVM框架11c将后台数据与对应的控件属性的数据进行映射;UI执行引擎11b设置控件的属性数据,刷新UI。当UI中控件的属性数据发生改变(比如,响应于用户输入,控件的形状、文本等数据发生改变),MVVM框架11c将控件属性的数据与后台数据进行映射;后台数据刷新。
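图21所示的双向绑定关系可以用如下最简骨架示意（假设性示例：实际的MVVM框架11c还涉及监听器注册、线程调度与UI刷新等，此处全部省略，仅演示“控件属性与后台数据的双向映射”）：

```java
import java.util.*;

// 示意：MVVM双向绑定的最小骨架
class MiniMvvm {
    final Map<String, String> viewProps = new HashMap<>();  // 控件属性
    final Map<String, String> modelData = new HashMap<>();  // 后台数据
    final Map<String, String> binding = new HashMap<>();    // 控件属性 -> 后台数据字段

    // 在“界面描述文件”中声明的绑定关系
    void bind(String viewProp, String modelField) { binding.put(viewProp, modelField); }

    // 后台数据变化 -> 映射并刷新对应控件属性
    void setModel(String field, String value) {
        modelData.put(field, value);
        for (Map.Entry<String, String> e : binding.entrySet()) {
            if (e.getValue().equals(field)) viewProps.put(e.getKey(), value);
        }
    }

    // 控件属性变化（比如用户输入）-> 映射并刷新后台数据
    void setView(String prop, String value) {
        viewProps.put(prop, value);
        String field = binding.get(prop);
        if (field != null) modelData.put(field, value);
    }
}
```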
UI渲染引擎11e用于对UI执行引擎11b生成的界面进行渲染、整理,将显示内容输出至显示屏。
本申请实施例提供的用户接口界面实现方法,不同类型的电子设备读取同一UI的同一个界面描述文件,呈现不同的UI布局。可以实现开发一套界面描述文件,适用于各种不同类型的电子设备,降低了开发者的开发难度。
在一些实施例中,请参考图22,电子设备100软件系统可以包括应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层中，App1的界面描述文件采用json格式；App2的界面描述文件采用xml格式。
应用程序框架层的操作系统包括控制单元。电子设备100运行App时,控制单元获取App的界面描述文件。示例性的,电子设备100运行App1时,控制单元获取App1的json格式界面描述文件;电子设备100运行App2时,控制单元获取App2的xml格式界面描述文件。控制单元根据界面描述文件的类型将界面描述文件分发至基础UI引擎10或自定义UI引擎11进行UI绘制。比如,控制单元获取App1的json格式界面描述文件,将App1的json格式界面描述文件分发至自定义UI引擎11进行处理。比如,控制单元获取App2的xml格式界面描述文件,将App2的xml格式界面描述文件分发至基础UI引擎10进行处理。在一种实现方式中,json格式界面描述文件和xml格式界面描述文件在应用安装包中的指定路径不同。控制单元在App1应用安装包的第一指定路径获取json格式界面描述文件,在App2应用安装包的第二指定路径获取xml格式界面描述文件。在另一种实现方式中,json格式界面描述文件和xml格式界面描述文件预置不同的标签,控制单元根据界面描述文件的预置标签确定界面描述文件的类型。
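控制单元按文件类型分发的逻辑可以示意如下（假设性示例：此处以文件扩展名判断为例，实际实现也可以基于安装包内的指定路径或界面描述文件的预置标签）：

```java
// 示意：控制单元根据界面描述文件类型，分发至自定义UI引擎或基础UI引擎
class UiEngineDispatcher {
    static String dispatch(String descriptionFileName) {
        if (descriptionFileName.endsWith(".json")) {
            return "CustomUiEngine";   // json格式交由自定义UI引擎11处理
        } else if (descriptionFileName.endsWith(".xml")) {
            return "BaseUiEngine";     // xml格式交由基础UI引擎10处理
        }
        throw new IllegalArgumentException("未知的界面描述文件类型: " + descriptionFileName);
    }
}
```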
自定义UI引擎11对App1的json格式界面描述文件进行解析、执行、渲染,生成App1的UI。App1的UI中控件可以支持通用OS(比如
Figure PCTCN2021108273-appb-000106
)原生的UI编程能力,还可以支持电子设备100操作系统中自定义UI编程能力。
基础UI引擎10对App2的xml格式界面描述文件进行解析、执行、渲染,生成App2的UI。App2的UI中控件可以支持通用OS(比如
Figure PCTCN2021108273-appb-000107
)原生的UI编程能力。
这样,电子设备100既可以运行采用json格式界面描述语言开发的App,也可以运行采用xml格式界面描述语言开发的App;实现操作系统前向兼容。
本申请实施例还提供一种用户接口界面实现方法,用于应用小组件的UI。
开发者可以开发App对应的小组件。比如,手机支持在通知栏,桌面及负一屏上显示应用的小组件。通常,显示在通知栏的应用小组件称为自定义通知栏,显示在桌面的应用小组件称为桌面小部件,显示在负一屏的应用小组件称为负一屏卡片。自定义通知栏、桌面小部件和负一屏卡片可以更直观地向用户呈现应用中的信息,并且支持在不打开应用的情况下对应用执行操作,方便用户使用应用。越来越多的应用提供了小组件,供用户使用。
目前,支持在应用小组件的用户接口界面(user interface,UI)上显示的布局方式和控件类型比较少,不能满足用户多样化的需求。本申请实施例提供一种用户接口界面实现方法及装置,支持应用小组件的UI上显示各种布局方式和控件类型,方便用户使用应用小组件,提高用户体验。
通常,开发者使用一种界面描述语言,在应用开发工具中开发应用(Application,App)的UI。开发者还使用界面描述语言,在应用开发工具中开发应用小组件的UI。
请参考图23,电子设备200上安装有应用开发工具(比如,Android Studio,DevEco Studio等)。本申请中电子设备200也可称为开发者设备。开发者在应用开发工具中开发App的UI,形成界面描述文件。本申请中界面描述文件也可称为描述文件。开发者还在应用开发工具中开发应用小组件的UI,形成组件界面描述文件。开发者将界面描述文件和组件界面描述文件打包到App的安装包,在服务器300提供的应用市场中发布App。应用市场中可以提供各个App的安装包供用户下载。例如,安装包可以为
Figure PCTCN2021108273-appb-000108
应用程序包(Android application package,APK)文件。
需要说明的是,在一种实现方式中,组件界面描述文件是独立于界面描述文件的。在另一种实现方式中,组件界面描述文件可以是界面描述文件的一部分(比如,将界面描述文件中一段代码段作为组件界面描述文件)。本申请实施例对此并不进行限定。下述实施例中,以组件界面描述文件是单独文件为例进行示例性说明。
以手机为电子设备100举例,用户可使用手机在应用市场中下载某一App的安装包。App的安装包包括界面描述文件和组件界面描述文件。以音乐App举例,手机下载音乐App的安装包后,通过运行该安装包可将音乐App安装在手机中。这样,手机也获取了安装包中的界面描述文件和组件界面描述文件。
示例性的，如图23所示，手机安装了音乐App后，手机桌面包括音乐App的快捷图标-“音乐”图标103。手机可以接收用户对“音乐”图标103的点击操作，响应于用户对“音乐”图标103的点击操作，手机按照界面描述文件生成音乐App的UI，并在显示屏呈现音乐App的UI。手机还可以根据用户设置，在手机桌面上显示音乐App的小组件（称为音乐小部件）。手机按照组件界面描述文件生成音乐小部件的UI，并在显示屏显示音乐小部件的UI 104。
可以理解的,在一些实施例中,开发者可以直接在电子设备100上开发App的UI和应用小组件的UI,并在电子设备100上运行该App和应用小组件;即电子设备200和电子设备100可以是同一个电子设备。本申请实施例对此并不进行限定。
通常,在UI中呈现的元素称为控件(View),控件能够为用户提供一定的操作功能或用于显示一定内容。示例性的,
Figure PCTCN2021108273-appb-000109
系统原生控件包括文本控件(TextView),输入框(EditText),按钮(Button),图片按钮(ImageButton),图片控件(ImageView)等。
以安卓系统举例,应用中所有UI内的元素都是由控件(View)和控件组(ViewGroup)构成的。一个UI中可以包含一个或多个View或ViewGroup。View是显示在显示界面中的一个元素;ViewGroup是一个用于存放View(或ViewGroup)的布局容器。ViewGroup中可以添加新的View或ViewGroup,使得各个View之间按照一定的层次和结构关系进行排列。示例性的,开发人员可使用线性布局(LinearLayout)、表格布局(TableLayout)、相对布局(RelativeLayout)、层布局(FrameLayout)、绝对布局(AbsoluteLayout)或网格布局(GridLayout)等布局方式设计App中每个UI内的View或ViewGroup,从而生成每个UI的布局文件;比如界面描述文件或组件界面描述文件等。
目前,
Figure PCTCN2021108273-appb-000110
系统支持在应用小组件中应用的布局方式和控件种类有限,无法满足用户多样的需求。本申请实施例提供一种用户接口界面实现方法,能够支持在应用小组件中应用多样的布局方式和控件种类。
本申请实施例提供的用户接口界面实现方法,不仅支持将
Figure PCTCN2021108273-appb-000111
系统原生的线性布局(LinearLayout)、层布局(FrameLayout)、相对布局(RelativeLayout)和网格布局(GridLayout)应用于应用小组件，还支持将
Figure PCTCN2021108273-appb-000112
系统原生的表格布局(TableLayout)、绝对布局(AbsoluteLayout)等布局方式应用于应用小组件。
本申请实施例提供的用户接口界面实现方法,不仅支持将
Figure PCTCN2021108273-appb-000113
系统原生的按钮(Button),图片控件(ImageView),图片按钮(ImageButton),进度条(ProgressBar),文本控件(TextView),列表控件(ListView),网格控件(GridView),堆叠控件(StackView),控件动态加载(ViewStub),自适应控件(AdapterViewFlipper),切换控件(ViewFlipper),时钟(AnalogClock),计时器(Chronometer)等控件应用于应用小组件;还支持将
Figure PCTCN2021108273-appb-000114
系统原生的输入框(EditText),复选框(CheckBox),滑动选择器(Picker),滚动视图(ScrollView),单选按钮(RadioButton),评分条(RatingBar),搜索框(SearchView),拖动条(SeekBar),开关(Switch)等控件应用于应用小组件。
示例性的,以手机作为电子设备100,手机上使用音乐App为例。手机100可以在桌面上显示音乐App的小组件(称为音乐小部件)。如图24A所示,手机100显示音乐小部件的UI 910。UI 910包括图片控件911,用于显示App设定的图片;文本控件912,用于显示播放音乐的曲目名称;搜索框913,用于接收用户的输入,并响应于用户的输入文本进行搜索;图片按钮914,用于切换音乐小部件的显示风格;拖动条915,用于根据用户操作,调节播放音乐的进度;以及其他控件。
示例性的,如图24B所示,手机100可以接收用户对搜索框913的点击操作,响应于该点击操作,在桌面上显示放大的搜索框913以及软键盘91a。用户可以使用软键盘91a输入用于搜索框913的文本。手机100根据输入搜索框913的文本进行搜索。
示例性的,如图24C所示,手机100可以接收用户对拖动条915的拖动操作,调节播放音乐的进度。
本申请实施例提供的用户接口界面实现方法,还支持将操作系统中自定义的UI编程能力应用于应用小组件,使得应用小组件中的控件具备操作系统中扩展的视觉属性、布局属性、交互属性、动效属性和软硬件依赖属性等。其中,布局属性是指UI中各个控件的布局;比如控件的形状、位置、大小等。视觉属性是指控件的颜色、灰度等视觉效果。交互属性是基于用户行为提供控件响应的能力;比如基于用户的“确认”行为执行搜索。动效属性是指在控件上显示动画效果;比如在控件上显示点击回弹动效等。软硬件依赖属性是指控件依赖设备的软硬件参数。示例性的,操作系统中扩展的布局属性可以包括自动拉伸、隐藏、均分、占比、延伸或折行等布局属性。其中,自动拉伸是指控件的高度或宽度根据窗口大小自动放大或缩小,以适应窗口大小。隐藏是指控件在布局中可见或不可见的能力。均分是指控件中的内容在布局中均匀分布。占比是指控件在指定方向上按照指定的百分比占据布局总大小。延伸是指控件根据显示屏尺寸在UI上延伸显示。折行是指控件中的内容在布局中通过一行或多行显示。
示例性的,如图24D所示,音乐小部件的UI 910上可以包括图片按钮916,用于显示音乐的歌词。手机可以接收用户对图片按钮916的点击操作,响应于用户对图片按钮916的点击操作,手机显示当前播放音乐的歌词。图片按钮916可以在UI 910上显示或者不显示。图片按钮916是否显示依赖于当前播放音乐是否有歌词。如果当前播放音乐有对应歌词,图片按钮916在UI 910上显示;如果当前播放音乐没有对应歌词,图片按钮916不在 UI 910上显示。示例性的,如图24D所示,当前播放音乐是音乐1时,UI 910上包括图片按钮916;当前播放音乐是音乐2时,UI 910上不包括图片按钮916。
本申请实施例提供的用户接口界面实现方法,还支持将开发者在App中定义的布局方式和控件种类应用于应用小组件。
开发者可以根据设计目的,将
Figure PCTCN2021108273-appb-000115
系统原生的各种布局方式和控件种类,操作系统中定义的布局方式和控件种类,以及在App中定义的布局方式和控件种类应用于应用小组件,方便用户使用。
在一种实现方式中,请参考图25,App开发阶段,开发者在电子设备200(开发者设备)中打开一种开发工具(比如,DevEco Studio),在开发工具中使用界面描述语言,按照界面描述语言的语法语义规范进行界面描述和界面行为定义,进行UI开发;形成用于发布的界面描述文件和组件界面描述文件。
示例性的,以界面描述文件和组件界面描述文件采用json格式为例。开发者可以在界面描述文件和组件界面描述文件中分别进行UI的布局编排、数据&界面绑定、交互行为编排以及差异化描述等。应用和应用小组件中所有UI是由控件构成的。UI的布局编排,即编排UI中控件的属性。数据&界面绑定,即在界面描述文件或组件界面描述文件中声明并指定UI中的元素(比如控件、控件组)与后台数据之间的绑定关系。交互行为编排,即在界面描述文件或组件界面描述文件中声明控件响应事件对应的执行动作。控件支持的事件范围由控件支持的事件监听决定。例如,按钮(Button)支持对点击事件的监听setOnClickListener,则可以在界面描述文件中对控件绑定onClick(点击)事件。差异化描述,包括为不同类型的电子设备编排不同的代码段,使得应用小组件的UI在不同类型的电子设备上显示效果不同;根据电子设备的软硬件条件获取配置参数的值,应用于控件;定义适用于App内的参数等。
比如,用户可以在音乐App的组件界面描述文件中声明图24A-图24D所示的图片控件911,文本控件912,搜索框913,图片按钮914,拖动条915和图片按钮916,对这些控件进行属性编排;使得音乐小部件的UI 910中包括这些控件。这些控件可以是
Figure PCTCN2021108273-appb-000116
系统原生的控件,也可以是操作系统中定义的或音乐App中定义的控件。
开发者还可以在组件界面描述文件中对这些控件应用操作系统中定义的控件属性。比如,对图片按钮916应用操作系统中定义的软件依赖属性。声明图片按钮916的显示属性依赖于App当前播放的音乐包括歌词。
开发者还可以在组件界面描述文件中将控件与后台数据进行绑定。比如,将拖动条915与后台数据进行绑定;手机接收到用户对拖动条915的拖动操作,则根据用户对拖动条915的拖动操作更新后台数据中当前音乐播放进度;如果后台数据中当前音乐播放进度发生变化,则更新拖动条915。
开发者还可以在组件界面描述文件中声明这些控件响应事件对应的执行动作。比如,声明搜索框913响应于点击事件,执行搜索动作。
之后，在开发者设备上生成App的安装包，其中包含界面描述文件和组件界面描述文件。App的安装包上传至服务器，App在服务器提供的应用市场中发布。用户可使用用户侧电子设备（上述电子设备100）在应用市场中下载App的安装包。用户侧电子设备运行App的安装包之后，获取安装包中的界面描述文件和组件界面描述文件。示例性的，如图25所示，手机100运行音乐App的安装包后，在桌面上显示音乐App的图标103。手机100可以接收用户对图标103的点击操作，运行音乐App，按照界面描述文件在显示屏显示音乐App的UI。
进一步的,用户侧电子设备根据用户的设置,在通知栏,桌面或负一屏上添加应用小组件。用户侧电子设备根据组件界面描述文件生成应用小组件的UI,并在通知栏,桌面或负一屏显示应用小组件的UI。示例性的,如图25所示,手机100在桌面上显示音乐小部件的UI 910。
请参考图26,电子设备100中应用小组件进程是独立于应用进程运行的。安装于电子设备100的应用通过调用应用进程运行,应用小组件通过调用应用小组件进程运行。比如,将应用小组件设置在桌面,桌面进程即为应用小组件进程;将应用小组件设置在负一屏,负一屏显示进程即为应用小组件进程;将应用小组件设置在指定的应用中,该指定应用的进程即为应用小组件进程。
电子设备100还包括自定义UI引擎11,小组件框架12等单元。
在一些示例中,应用进程获取App的界面描述文件,调用自定义UI引擎11解析和执行App的界面描述文件,生成App的UI。自定义UI引擎11可以包括UI解析引擎11a,UI执行引擎11b,MVVM(model-view-viewmodel)框架11c等。UI解析引擎11a用于解析界面描述文件,将界面描述文件中内容转换成与UI执行引擎11b匹配的数据格式。在一些示例中,UI解析引擎11a还可以对界面描述文件中内容进行语法校验,如果对界面描述文件语法校验成功,则对界面描述文件进行解析;如果对界面描述文件语法校验不成功,则不执行解析界面描述文件。UI执行引擎11b用于根据UI解析引擎11a解析的数据构建UI的控件(实例化控件和属性设置),对控件进行布局编排,生成界面描述文件中声明的界面;还可以实现器件事件与用户行为之间的映射,响应于用户行为执行界面描述文件中定义的用户行为对应的动作等。MVVM框架11c用于对UI中的元素与后台数据进行双向绑定。在界面描述文件中,声明并指定UI中的元素(比如控件、控件组)与后台数据之间的绑定关系,可选的,还可以进行简单的数据实例设置之后;MVVM框架11c可以实现根据UI改变刷新后台数据,以及根据后台数据改变自动刷新对应UI。帮助开发者专注于UI设计与编排,简化UI开发过程,大幅减少开发者为实现前后端数据交互投入的开发时间。
在一些示例中，应用进程获取组件界面描述文件，调用小组件框架12对组件界面描述文件进行处理，形成用于显示应用小组件UI的小组件UI数据。其中，小组件框架12包括虚拟控件构建12a、数据绑定12b、小组件服务12c和事件代理12d等模块。其中，虚拟控件构建12a通过调用UI解析引擎11a解析组件界面描述文件，对解析后的组件界面描述文件进行实例化，并调用UI执行引擎11b构造界面，构建出控件、控件组等，形成小组件UI数据。这些小组件UI数据存在于应用进程内，用于和后台数据（比如，控件模型（ViewModel））进行绑定。数据绑定12b用于将虚拟控件构建12a构造的控件或控件组的属性、交互事件等与后台数据（比如，处理业务逻辑的控件模型（ViewModel），包含了与虚拟对象相关的数据处理逻辑）进行绑定。小组件服务12c用于对生成应用小组件UI过程中，当前处理的对象以及与该对象绑定的数据（model）进行跟踪处理；还用于管理应用进程和应用小组件进程之间的数据传输；还用于管理跨进程的事件代理，收发跨进程事件。事件代理12d用于处理应用小组件进程中事件的回传与响应。示例性的，事件代理12d中定义一个专用的事件传输类（比如HiAction），该事件传输类支持实现Parcelable接口，可以跨进程传输（比如，调用
Figure PCTCN2021108273-appb-000117
原生的跨进程Binder机制)。该事件传输类中存储一系列事件，每个事件包括布局标识，控件标识，事件类型等信息。当用户对应用小组件进行操作，应用小组件进程接收到该操作，触发一个交互事件，即在HiAction中新增一个事件。应用小组件进程将该新增的事件传输给应用进程。应用进程响应该事件，执行相应的动作。应用进程还调用MVVM框架进行处理；若存在数据或控件属性变化，则更新小组件UI数据，更新跨进程的界面数据与属性，进一步更新应用小组件进程的显示界面。在一种实现方式中，应用进程还将组件界面描述文件发送给应用小组件进程。应用小组件进程调用小组件框架12对组件界面描述文件进行处理，形成小组件UI数据，并显示小组件UI数据，即显示应用小组件UI。
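上述事件传输类的结构可以示意如下（假设性示例：省略了Parcelable序列化细节，字段组织方式为假设，仅演示“事件包括布局标识、控件标识、事件类型”这一结构）：

```java
import java.util.*;

// 示意：跨进程事件传输类的最小骨架（对应文中的HiAction）
// 真实实现需实现Parcelable接口以支持Binder跨进程传输，此处从略
class EventTransport {
    static class Event {
        final String layoutId;   // 布局标识
        final String viewId;     // 控件标识
        final String eventType;  // 事件类型
        Event(String layoutId, String viewId, String eventType) {
            this.layoutId = layoutId;
            this.viewId = viewId;
            this.eventType = eventType;
        }
    }

    private final List<Event> events = new ArrayList<>();

    // 用户对应用小组件进行操作时，新增一个事件
    void add(String layoutId, String viewId, String eventType) {
        events.add(new Event(layoutId, viewId, eventType));
    }

    int size() { return events.size(); }
    Event get(int i) { return events.get(i); }
}
```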
电子设备上安装了应用,用户可以在通知栏,桌面或负一屏上添加应用对应的应用小组件。
在一些实施例中,用户不打开应用,单独添加应用对应的应用小组件。示例性的,请参考图27,手机100接收用户在桌面上的双指捏合操作,响应于桌面上的双指捏合操作,手机100显示快捷设置界面1010,快捷设置界面1010包括“窗口小工具”选项1011,用于在桌面上添加桌面小部件。手机100可以响应于用户对“窗口小工具”选项1011的点击操作,显示桌面小部件添加界面1020。桌面小部件添加界面1020包括“音乐”选项1021,用于在桌面上添加音乐小部件。手机100接收用户对“音乐”选项1021的点击操作,响应于用户对“音乐”选项1021的点击操作,手机100的桌面上显示“音乐小部件”的UI 910。
在另一些实施例中,用户在应用中添加应用对应的应用小组件。示例性的,如图28所示,用户打开手机100上的“音乐”应用的“音乐设置”界面1030,“音乐设置”界面1030包括“桌面小部件”选项1031。“桌面小部件”选项1031用于在手机桌面上添加音乐小部件。手机100接收用户对“桌面小部件”选项1031的点击操作,响应于用户对“桌面小部件”选项1031的点击操作,手机100显示“桌面小部件”界面1040。“桌面小部件”界面1040包括“风格1”选项1041和“风格2”选项1042;还包括“添加”按钮1043和“取消”按钮1044。用户可以通过点击“添加”按钮1043,将音乐小部件以“风格1”选项1041或“风格2”选项1042对应的界面添加至桌面;还可以通过点击“取消”按钮1044退出添加音乐小部件。示例性的,手机100接收用户对“添加”按钮1043的点击操作,根据用户选择,将音乐小部件以“风格2”选项1042对应的界面添加至桌面。手机100的桌面上显示“音乐小部件”的UI 910。
在一种实现方式中,请参考图29A,用户进行了添加应用小组件的操作。电子设备100的应用小组件进程接收到用户添加应用小组件的操作(比如,用户对图27中“音乐”选项1021的点击操作),应用小组件进程通知应用进程接收到用户添加应用小组件的操作;如果应用进程为未启动状态,则电子设备100拉起应用进程,使得应用进程在后台运行。或者,电子设备100的应用进程接收到用户添加应用小组件的操作(比如,用户对图28中“添加”按钮1043的点击操作),应用进程通知应用小组件进程接收到用户添加应用小组件的操作。
应用进程从应用安装包获取到组件界面描述文件。应用进程调用自定义UI引擎11对组件界面描述文件进行解析执行，之后调用虚拟控件构建12a进行小组件UI数据控件构建，按照组件界面描述文件中的布局编排生成控件、控件组等，形成小组件UI数据（包括控件和控件的布局等信息）。
需要说明的是,该组件界面描述文件可以是独立于界面描述文件的文件,也可以是界面描述文件中的一段代码。可选的,电子设备100获取到组件界面描述文件后,可以仅对其中的部分代码段进行解析和执行。比如,用户选择图28中“风格1”选项1041对应的音乐小部件界面,手机100在接收到用户对“添加”按钮1043的点击操作后,解析和执行组件界面描述文件中“风格1”对应的代码段;用户选择图28中“风格2”选项1042对应的音乐小部件界面,手机100在接收到用户对“添加”按钮1043的点击操作后,解析和执行组件界面描述文件中“风格2”对应的代码段。
之后,数据绑定12b调用MVVM框架11c将小组件UI数据与后台数据(比如,控件模型)进行数据绑定。这样,如果后台数据发生变化,可以刷新对应的小组件UI数据;小组件UI数据改变,可以刷新对应的后台数据。
进一步的,应用进程将组件界面描述文件发送给应用小组件进程。
应用小组件进程调用自定义UI引擎11对组件界面描述文件进行解析执行,之后调用虚拟控件构建12a进行小组件UI数据控件构建,按照组件界面描述文件中的布局编排生成控件、控件组等,形成小组件UI数据(包括控件和控件的布局等信息);并按照该小组件UI数据进行显示,即显示应用小组件UI。由于应用进程生成小组件UI数据与应用小组件进程生成应用小组件UI使用的是同一段代码段,应用小组件UI上的控件与小组件UI数据中的控件是一一对应的。
可选的,如图29B所示,应用进程也可以在生成小组件UI数据后,将小组件UI数据发送给应用小组件进程,应用小组件进程按照该小组件UI数据进行显示,即显示应用小组件UI。这样,应用小组件UI上的控件与小组件UI数据中的控件也是一一对应的。
在通知栏,桌面或负一屏上添加应用小组件之后,用户可以在应用小组件UI上对应用进行操作。比如,用户可以拖动图24A中音乐小部件UI 910上的拖动条915,调节音乐应用当前播放音乐的播放进度。
在一种实现方式中,请参考图29C,当用户在应用小组件的UI上进行操作时,应用小组件进程接收到用户操作,将用户操作传输给事件代理12d。事件代理12d中定义了一个专用的事件传输类,该事件传输类用于跨进程传输。该事件传输类中存储多个事件,其中每个事件包括布局标识、控件标识、事件类型等信息。事件代理12d接收到用户操作后,在事件传输类中产生该操作对应的事件,将事件发送给应用进程(如果应用进程未启动,则拉起应用进程,使得应用进程在后台运行)。应用进程接收到该事件后,根据布局标识和控件标识获取对应的控件,并根据作用于该控件上的该事件执行相应的业务逻辑。由于应用小组件UI上的控件与小组件UI数据中的控件存在一一对应关系,应用进程还根据接收到的事件刷新后台数据。后台数据改变触发小组件UI数据更新,应用进程还可以将更新的小组件UI数据发送给应用小组件进程,应用小组件进程按照刷新后的小组件UI数据显示更新的应用小组件UI。
在一些实施例中，请参考图29D，应用进程启动，应用进程获取界面描述文件。应用进程调用自定义UI引擎11对界面描述文件进行解析执行，生成应用的UI，并显示该应用的UI。之后，MVVM框架11c将应用的UI与后台数据（比如，控件模型）进行数据绑定。
应用进程接收到用户添加应用小组件的操作,应用进程从应用安装包获取组件界面描述文件。应用进程调用自定义UI引擎11对组件界面描述文件进行解析执行,之后调用虚拟控件构建12a进行小组件UI数据控件构建,按照组件界面描述文件中的布局编排生成控件、控件组等,形成小组件UI数据(包括控件和控件的布局等信息)。之后,数据绑定12b调用MVVM框架11c将小组件UI数据与后台数据(比如,控件模型)进行数据绑定。进一步的,应用小组件进程根据小组件UI数据显示应用小组件的UI。
这样,电子设备100的显示屏上显示应用的UI以及对应的应用小组件的UI。
在一种示例中,请参考图30,其示出了本申请实施例提供的用户接口界面实现方法的一种流程示例。
用户进行了添加应用小组件的操作。电子设备100的应用小组件进程接收到用户添加应用小组件的操作(比如,用户对图27中“音乐”选项1021的点击操作),应用小组件进程通知应用进程接收到用户添加应用小组件的操作;如果应用进程为未启动状态,则电子设备100拉起应用进程,使得应用进程在后台运行。或者,电子设备100的应用进程接收到用户添加应用小组件的操作(比如,用户对图28中“添加”按钮1043的点击操作),应用进程通知应用小组件进程接收到用户添加应用小组件的操作。小组件框架、MVVM框架、后台数据等模块进行初始化。小组件框架从应用安装包中获取组件界面描述文件,并发送给虚拟控件构建模块。虚拟控件构建模块根据组件界面描述文件进行控件构建,形成小组件UI数据。数据绑定模块调用MVVM框架将小组件UI数据和后台数据进行绑定。应用进程调用小组件服务进行绑定服务,小组件服务对小组件UI数据绑定事件代理;之后,跨进程发送小组件UI数据及小组件UI数据的事件代理等信息。这样,应用小组件进程收到小组件UI数据和小组件UI数据的事件代理信息后,可以按照小组件UI数据显示应用小组件UI。
应用小组件添加成功之后,用户可以在应用小组件UI上进行操作。应用小组件进程接收到用户在应用小组件UI上的操作,事件代理增加该操作相应的事件,将事件发送给应用进程;应用进程响应于该事件执行业务逻辑,并调用MVVM框架对后台数据进行更新。当应用的业务数据发生变化,后台数据变化引起MVVM框架更新小组件UI数据。应用进程跨进程发送更新的小组件UI数据,应用小组件进程接收到更新的小组件UI数据后,可以按照更新的小组件UI数据显示更新的应用小组件UI。
本申请实施例提供的用户接口界面实现方法,应用进程根据组件界面描述文件生成小组件UI数据,并向应用小组件进程发送组件界面描述文件或小组件UI数据;应用小组件进程根据组件界面描述文件或小组件UI数据生成应用小组件UI。开发者可以在组件界面描述文件中声明
Figure PCTCN2021108273-appb-000118
系统原生的布局方式和控件类型,还可以声明操作系统中自定义的控件类型和UI编程能力,以及开发者在App中定义的布局方式和控件种类。操作系统支持应用小组件进程调用UI引擎对组件界面描述文件进行解析执行,生成应用小组件UI。这样,支持应用小组件的UI上显示各种布局方式和控件类型,方便用户使用应用小组件,提高用户体验。
在一些实施例中，电子设备上添加了应用小组件之后，电子设备关机。电子设备再次开机后，显示应用小组件的UI。即电子设备添加了应用小组件之后，重新加载应用小组件UI。如图31所示，手机100开机，手机100的桌面显示音乐小部件的UI 910。
在一种示例中,请参考图32,其示出了电子设备重新加载应用小组件UI方法的一种流程示例。
电子设备开机后,应用小组件进程启动。应用小组件进程从应用的安装包中获取组件界面描述文件。应用小组件进程调用自定义UI引擎对组件界面描述文件进行解析执行,构建小组件UI数据,并按照小组件UI数据显示应用小组件UI。
应用小组件重新加载成功之后,用户可以在应用小组件UI上进行操作。应用小组件进程接收到用户在应用小组件UI上的操作,事件代理增加该操作对应的事件,并拉起应用进程,使得应用进程在系统后台运行,将事件发送给应用进程;应用进程响应于该事件执行相应的业务逻辑,并调用MVVM框架更新后台数据。事件交互和后台数据变化的处理流程可参考图30的相关描述,此处不再赘述。
在该场景中,应用小组件进程生成并绘制加载应用小组件UI即可。应用进程生成小组件UI数据,绑定小组件UI数据与后台数据,建立小组件UI数据与应用小组件UI的对应关系等流程可以不再执行。
本申请实施例提供的用户接口界面实现方法,电子设备的应用进程根据组件界面描述文件生成小组件UI数据,将小组件UI数据与后台数据绑定;应用小组件进程也根据组件界面描述文件获取小组件UI数据,并将小组件UI数据显示为应用小组件的UI。这样,应用小组件的UI与后台数据建立了对应关系,可以支持应用小组件的UI上显示各种布局方式和控件类型,方便用户使用应用小组件,提高用户体验。
本申请实施例还提供一种用户接口界面实现方法,用于将电子设备上的App投屏至播放设备进行播放时UI的呈现。
随着物联网(Internet of things,IoT)快速发展，IoT设备类型与数量均快速增长。消费者可以使用手机、平板电脑等设备作为IoT设备的控制设备，对IoT设备进行控制，使得控制设备和IoT设备协同工作。比如，用户在控制设备上使用App时，可以将该App投屏至IoT设备进行播放（IoT设备称为播放设备）。比如，由于电视的屏幕尺寸较大，可以为用户带来更好的观看体验，用户可以将手机上的App投屏至电视上进行播放。由于IoT设备屏幕形态和尺寸差异巨大，如何在各种不同形态和尺寸屏幕的IoT设备上进行投屏，获得与IoT设备屏幕形态和尺寸相匹配的投屏界面，是需要解决的一个问题。
本申请实施例提供一种用户接口界面实现方法及装置,支持将控制设备上各种UI投屏至IoT设备进行播放,提高用户体验。控制设备即为上述各实施例中用户侧电子设备(电子设备100)。
本申请实施例提供一种用户接口界面实现方法,请参考图33,电子设备200上安装有应用开发工具(比如,Android Studio,DevEco Studio等)。本申请中电子设备200也可称为开发者设备。通常,开发者使用一种界面描述语言,在应用开发工具中开发App的UI以及播放端UI。可以理解的,在一些实施例中,开发者可以直接在控制设备100上开发App的UI和播放端UI,并在控制设备100上运行该App;即电子设备200和控制设备100可以是同一个电子设备。本申请实施例对此并不进行限定。
在一些实施例中,开发者在应用开发工具中开发App的UI(即电子设备安装、运行 App时显示的UI),形成界面描述文件。本申请中界面描述文件也可称为描述文件。开发者还在应用开发工具中开发App的用于播放端显示的UI(即播放端UI),形成播放端界面描述文件。开发者将界面描述文件和播放端界面描述文件打包到App的安装包,在服务器300提供的应用市场中发布App。应用市场中可以提供各个App的安装包供用户下载。例如,安装包可以为
Figure PCTCN2021108273-appb-000119
应用程序包(Android application package,APK)文件。
以手机为控制设备100举例,用户可使用手机在应用市场中下载某一App的安装包。以视频App举例,手机下载视频App的安装包后,通过运行该安装包可将视频App安装在手机中。这样,手机也获取了安装包中的界面描述文件和播放端界面描述文件。
可选的,在一些实施例中,也可以将界面描述文件作为播放端界面描述文件,即界面描述文件和播放端界面描述文件是同一文件。
手机可以按照界面描述文件在显示屏呈现相应的App的UI。示例性的,手机下载视频App的安装包后,桌面生成“视频”图标101。用户可以点击“视频”图标101,以打开视频App。响应于用户对“视频”图标101的点击操作,手机运行视频App。手机上安装有OS平台,OS平台的自定义UI引擎读取界面描述文件,解析、执行界面描述语言,按照界面描述文件中的界面描述渲染出视频App的UI。手机的显示装置(比如显示屏)呈现视频App的UI 105。进一步的,界面描述文件中还可以包括对界面行为的定义。手机可以响应于用户对UI 105的操作,按照界面描述文件中定义的界面行为,执行相应的界面动作,实现界面行为。通常,OS平台还有对应的程序语言用于实现界面行为,实现UI 105的动态变化以及响应用户对UI 105的操作;例如
Figure PCTCN2021108273-appb-000120
使用JAVA,
Figure PCTCN2021108273-appb-000121
使用swift编程语言实现界面行为。
手机还可以将视频App的各个界面投屏到播放设备1000上进行显示。比如，将视频App的主界面或播放界面投屏至播放设备1000。播放设备1000按照播放端界面描述文件中与其设备类型匹配的界面描述渲染出对应的播放端UI。示例性的，请继续参考图33，视频App的UI 105上包括“投屏”按钮106；“投屏”按钮106用于将手机上运行的App的界面投屏至播放设备进行显示。手机接收用户对“投屏”按钮106的点击操作，响应于用户对“投屏”按钮106的点击操作，手机显示设备选择界面107。设备选择界面107包括提示信息108，用于提示用户选择一个播放设备进行投屏。设备选择界面107还包括“客厅的电视”选项109和“我的平板电脑”选项10a。用户可以点击“客厅的电视”选项109，以将视频App的UI投屏至智能电视进行显示。手机接收用户对“客厅的电视”选项109的点击操作，响应于用户对“客厅的电视”选项109的点击操作，手机将视频App的UI投屏至智能电视。智能电视显示UI 105对应的播放端UI 1001。用户还可以点击“我的平板电脑”选项10a，以将视频App的UI投屏至平板电脑进行显示。手机接收用户对“我的平板电脑”选项10a的点击操作，响应于用户对“我的平板电脑”选项10a的点击操作，手机将视频App的UI投屏至平板电脑。平板电脑显示UI 105对应的播放端UI 1002。智能电视和平板电脑的设备类型不同，屏幕尺寸和形态也不同，智能电视上的播放端UI 1001与平板电脑上的播放端UI 1002的界面布局是不同的；即播放端UI在不同设备类型的电子设备上差异化显示。
上述播放设备1000可以包括便携式计算机(如手机等)、智能家居设备(比如,智能电视、智慧屏、大屏、智能音箱等)、手持计算机、个人数字助理(personal digital assistant, PDA)、可穿戴设备(比如,智能手表、智能手环等)、平板电脑、笔记本电脑、上网本、增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备、车载电脑等,本申请实施例对此不做任何限制。在一种示例中,播放设备1000可以包括图2所示结构,此处不再赘述。可以理解的是,本申请实施例图2示意的结构并不构成对播放设备1000的具体限定。在本申请另一些实施例中,播放设备1000可以包括比图2所示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图2所示的部件可以以硬件,软件或软件和硬件的组合实现。
请参考图34,控制设备100包括自定义UI引擎11,投屏框架13,传输通道适配14等单元。自定义UI引擎11提供IF1接口,投屏框架13提供IF2、IF3、IF4和IF5接口。
接口IF1-接口IF5的说明如表2所示:
表2
Figure PCTCN2021108273-appb-000122
自定义UI引擎11解析和执行App的界面描述文件,生成App的UI。自定义UI引擎11可以包括UI解析引擎11a,UI执行引擎11b,MVVM(model-view-viewmodel)框架11c等。UI解析引擎11a用于解析界面描述文件,将界面描述文件中内容转换成与UI执行引擎11b匹配的数据格式。在一些示例中,UI解析引擎11a还可以对界面描述文件中内容进行语法校验,如果对界面描述文件语法校验成功,则对界面描述文件进行解析;如果对界面描述文件语法校验不成功,则不执行解析界面描述文件。UI执行引擎11b用于根据UI解析引擎11a解析的数据构建UI的控件(实例化控件和属性设置),对控件进行布局编排,生成界面描述文件中声明的界面;还可以实现器件事件与用户行为之间的映射,响应于用户行为执行界面描述文件中定义的用户行为对应的动作等。MVVM框架11c用于对UI中的元素与后台数据进行双向绑定。在界面描述文件中,声明并指定UI中的元素(比如控件、控件组)与后台数据之间的绑定关系,可选的,还可以进行简单的数据实例设置之后;MVVM框架11c可以实现根据UI改变刷新后台数据,以及根据后台数据改变自动刷新对应UI。帮助开发者专注于UI设计与编排,简化UI开发过程,大幅减少开发者为实现前后端数据交互投入的开发时间。
传输通道适配14用于适配控制设备100和播放设备1000之间的数据传输通道。比如,将控制设备100的数据转换为适用于数据传输通道的格式,使得控制设备100可以通过数据传输通道向播放设备1000发送数据。
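传输通道适配14所述"将数据转换为适用于数据传输通道的格式"，可以用如下Python代码示意。采用json序列化加base64编码仅是假设的示例格式，并非本申请限定的通道编码方式：

```python
import base64
import json

def adapt_for_channel(obj):
    """示意：把控制设备侧的结构化数据转换为适合通道传输的字节流。"""
    return base64.b64encode(json.dumps(obj).encode("utf-8"))

def restore_from_channel(payload):
    """示意：播放设备侧从通道字节流还原出结构化数据。"""
    return json.loads(base64.b64decode(payload).decode("utf-8"))

msg = {"type": "layout", "file": "player_ui.json"}  # 文件名为假设
restored = restore_from_channel(adapt_for_channel(msg))
```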
投屏框架13用于对播放端界面描述文件进行处理,形成用于显示播放端UI的播放端UI数据。投屏框架13包括虚拟控件构建13a、数据绑定13b、投屏服务13c、数据收发13d、资源传输13e、事件代理13f和生命周期13g等模块。其中,虚拟控件构建13a通过调用UI解析引擎11a和UI执行引擎11b,根据播放端界面描述文件构建控件、控件组等,形成播放端UI数据。这些播放端UI数据存在于App进程内,用于和后台数据进行绑定。数据绑定13b用于将虚拟控件构建13a构造的控件或控件组的属性、交互事件等与后台数据(比如,处理业务逻辑的控件模型(ViewModel))进行绑定。投屏服务13c用于对投屏过程中,当前处理的对象以及与该对象绑定的数据(model)进行跟踪处理;还用于管理控制设备100和播放设备1000之间的数据传输通道。数据收发13d用于控制设备100和播放设备1000之间的数据发送和接收。比如,可以定义收发代理的接口,实现控制设备100内置的默认收发器。再比如,可以在App内按照接口规范自定义适用于该App的收发器。例如,App内建立通过ContentProvider方式传输信息,则在数据收发13d中send()函数中用ContentProvider实现数据发送和接收。资源传输13e用于传输特定类型的数据资源(比如,数据量大于设定值的数据、图片、视频等)。资源传输13e用于对该特定类型的数据资源进行管理,比如发送、接收、缓存、标识、进度控制等。事件代理13f是用于传递事件的通道,以避免事件被数据传输阻塞。生命周期13g用于对投屏过程中控制设备100与播放设备1000的运行实体间联合的生命周期进行管理。示例性的,控制设备100和播放设备1000的生命周期如表3所示:
表3
（表3原文为图片：载明投屏过程中控制设备100与播放设备1000的生命周期状态，内容从略）
示例性的,如图35A所示,控制设备100在单机运行状态,触发投屏,等待用户授权向播放设备投屏。如果用户同意授权,播放设备向控制设备发送授权指令,控制设备100进入服务端运行状态。如果用户拒绝授权,播放设备向控制设备发送拒绝授权指令,控制 设备100停止投屏。控制设备100在服务端运行状态时,App切换至后台运行,则停止向播放设备推送数据;App切换至前台运行,开始向播放设备推送数据。控制设备100在服务端运行状态时,控制设备关闭App,或播放设备关闭,则停止投屏。控制设备100在服务端运行状态时,播放设备的App切换至后台运行,控制设备100进入服务端暂停状态。控制设备100在服务端暂停状态,播放设备的App切换至前台播放,控制设备100进入服务端运行状态。控制设备100在服务端暂停状态,播放设备关闭,则停止投屏。
示例性的,如图35B所示,播放设备1000收到控制设备发起的投屏请求,等待用户确认授权。如果用户确认授权,播放设备1000进入投屏运行状态;如果用户拒绝授权,播放设备1000停止运行。播放设备1000在投屏运行状态,App切换至后台运行,则进入后台驻留状态。播放设备1000在后台驻留状态,App切换至前台播放,则进入投屏运行状态。播放设备1000在投屏运行状态或后台驻留状态,播放设备关闭或控制设备关闭App,则停止运行。
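图35B所述播放设备1000的状态迁移，可以整理为如下状态机示意。状态与事件名称依上文描述归纳，属于示意性假设：

```python
# (当前状态, 事件) -> 下一状态；未命中的组合保持原状态
TRANSITIONS = {
    ("等待授权", "用户确认授权"): "投屏运行",
    ("等待授权", "用户拒绝授权"): "停止运行",
    ("投屏运行", "App切至后台"): "后台驻留",
    ("后台驻留", "App切至前台"): "投屏运行",
    ("投屏运行", "控制设备关闭App"): "停止运行",
    ("后台驻留", "播放设备关闭"): "停止运行",
}

def step(state, event):
    """按图35B的描述推进播放设备的生命周期状态。"""
    return TRANSITIONS.get((state, event), state)

s = "等待授权"
s = step(s, "用户确认授权")   # 进入投屏运行
s = step(s, "App切至后台")    # 进入后台驻留
```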
请参考图36,App开发阶段,开发者在开发者设备上生成App的安装包,其中包含界面描述文件和播放端界面描述文件。开发者使用界面描述语言,按照界面描述语言的语法语义规范在开发者设备上开发界面描述文件和播放端界面描述文件,在界面描述文件和播放端界面描述文件中添加代码,进行UI开发。
开发者可以在界面描述文件和播放端界面描述文件中分别进行UI的布局编排、数据&界面绑定、交互行为编排以及差异化描述等。
App中所有UI是由控件构成的。UI的布局编排,即编排UI中控件的属性。比如,UI中的控件可以包括所有
Android
原生控件以及操作系统中拓展的控件,还支持开发者在App中自定义的或者通过静态包集成的控件。其中,控件具体可以包括文本控件,例如TextView控件、EditText控件等,也可以包括按钮控件,例如Button控件、ImageButton控件等,还可以包括图片控件,例如Image控件等,本申请实施例对此不做任何限制。控件属性包括
Android
原生属性以及操作系统中扩展的视觉属性、布局属性、交互属性、动效属性和软硬件依赖属性等。视觉属性是指控件的颜色、灰度等视觉效果。交互属性是基于用户行为提供控件响应的能力;比如基于用户的“确认”行为执行搜索。动效属性是指在控件上显示动画效果;比如在控件上显示点击回弹动效等。软硬件依赖属性是指控件依赖设备的软硬件参数。
数据&界面绑定,即在界面描述文件或播放端界面描述文件中声明并指定UI中的元素(比如控件、控件组)与后台数据之间的绑定关系。
交互行为编排,即在界面描述文件或播放端界面描述文件中声明控件响应事件对应的执行动作。控件支持的事件范围由控件支持的事件监听决定。例如,按钮控件(Button)支持对点击事件的监听setOnClickListener,则可以在界面描述文件中对控件绑定onClick(点击)事件。
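上述"控件响应事件与执行动作"的映射关系，可以用如下Python代码示意。其中控件标识searchButton与动作名doSearch均为假设的示例名称：

```python
# 示意：界面描述文件中声明的 控件 -> {事件类型: 动作名} 绑定
bindings = {"searchButton": {"onClick": "doSearch"}}

actions = {}

def action(name):
    """把函数注册为某个动作名对应的执行动作。"""
    def deco(fn):
        actions[name] = fn
        return fn
    return deco

@action("doSearch")
def do_search():
    return "searching"

def dispatch(control_id, event_type):
    """收到器件事件后，查找绑定声明并执行对应动作。"""
    name = bindings.get(control_id, {}).get(event_type)
    return actions[name]() if name in actions else None

result = dispatch("searchButton", "onClick")
```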
差异化描述:
1、开发者可以在界面描述文件或播放端界面描述文件中声明电子设备配置参数的变量。当电子设备运行界面描述文件或播放端界面描述文件时,访问电子设备的配置参数,电子设备根据其软硬件条件获取配置参数的值。这样,不同类型的电子设备运行界面描述文件或播放端界面描述文件时,由于其软硬件条件不同,配置参数不同,生成的UI不同。
2、开发者可以在界面描述文件或播放端界面描述文件中自定义参数。以界面描述文件和播放端界面描述文件采用json格式为例,界面描述文件或播放端界面描述文件可以包括style、layout-data-common等部分。开发者在style中定义myTextStyle,并可以在layout-data-common中以$style.myTextStyle的方式调用该自定义参数。示例如下,
（原文此处为图片：示出在style部分定义myTextStyle、并在layout-data-common中以$style.myTextStyle方式引用该自定义参数的json代码示例）
3、开发者可以在界面描述文件或播放端界面描述文件中针对指定设备进行代码编排。示例性的,以界面描述文件和播放端界面描述文件采用json格式为例,界面描述文件或播放端界面描述文件可以包括如下结构:
（原文此处为图片：示出包含layout-data-common与layout-data-uimode等部分的json结构示例）
开发者可以在layout-data-common中声明通用播放端UI,各种类型的播放设备都解析layout-data-common中内容,按照layout-data-common中的内容布局通用播放端UI。layout-data-uimode用于描述指定设备的播放端UI。在一种实现方式中,layout-data-uimode中声明指定设备播放端UI与通用播放端UI的区别。指定设备结合layout-data-common和layout-data-uimode中内容进行解析执行,生成指定设备的播放端UI。在另一种实现方式中,layout-data-uimode中声明适用于指定设备的播放端UI的全部条件。指定设备按照layout-data-uimode中的内容布局其播放端UI。其中,指定设备可以为手机、手表、车机、智能家居设备(比如,智能电视、智慧屏、智能音箱等)、大屏、平板电脑、笔记本电脑或台式电脑等类型其中的一种。比如,layout-data-uimode的具体形式可以包括layout-data-phone(用于手机),layout-data-watch(用于手表),layout-data-television(用于智能电视),layout-data-pad(用于平板电脑),layout-data-car(用于车机)等。这样,不同类型的播放设备,可以根据其对应的代码段进行解析执行并构建播放端UI;实现在不同类型的播放设备上显示与播放设备的类型相匹配的播放端UI。
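上述第一种实现方式（layout-data-uimode仅声明指定设备播放端UI与通用播放端UI的区别，由指定设备结合两部分内容解析执行）可以用如下Python代码示意。字段与取值均为示意性假设：

```python
# 示意性的播放端界面描述：common为通用布局，layout-data-<type>为差异化描述
desc = {
    "layout-data-common": {"orientation": "vertical", "columns": 1},
    "layout-data-television": {"orientation": "horizontal", "columns": 4},
    "layout-data-pad": {"columns": 2},
}

def layout_for(device_type):
    """通用布局与指定设备差异化描述合并：layout-data-<type> 覆盖 common。"""
    layout = dict(desc["layout-data-common"])
    layout.update(desc.get(f"layout-data-{device_type}", {}))
    return layout

tv = layout_for("television")   # 智能电视的播放端布局
pad = layout_for("pad")         # 平板电脑的播放端布局
```

这样，不同类型的播放设备按各自对应的代码段解析，即可得到与其设备类型匹配的播放端UI布局。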
开发者将在开发者设备上生成的App安装包上传至服务器,App在服务器提供的应用 市场中发布。用户可使用用户侧电子设备(上述控制设备100)在应用市场中下载App的安装包。控制设备运行App的安装包之后,获取安装包中的界面描述文件和播放端界面描述文件。当控制设备运行App时,按照界面描述文件在显示屏显示与该控制设备相匹配的UI。
控制设备运行App过程中,还可以将App的界面投屏至播放设备进行显示。比如,控制设备根据用户输入确定进行投屏的播放设备,向播放设备发送投屏指令,投屏指令中包括投屏界面的标识。播放设备接收到投屏指令,根据投屏界面的标识获取到对应的播放端界面描述文件,并按照播放端界面描述文件形成与其设备类型匹配的播放端UI。
示例性的,请参考图37A,控制设备100接收到用户的投屏操作(比如,手机接收到用户对图33中“我的平板电脑”选项10a的点击操作),从App的安装包获取到当前界面对应的播放端界面描述文件。控制设备100还根据用户输入确定进行投屏的播放设备1000的设备类型(比如,手机接收到用户对图33中“客厅的电视”选项109的点击操作,则确定播放设备1000为智能电视;手机接收到用户对图33中“我的平板电脑”选项10a的点击操作,则确定播放设备1000为平板电脑)。
控制设备100的OS中的虚拟控件构建13a通过调用自定义UI引擎11解析和执行播放端界面描述文件中播放设备1000的设备类型对应的代码段,根据播放端界面描述文件中播放设备1000的设备类型对应的代码段进行控件构建,按照该代码段中的布局编排生成控件、控件组等,形成播放端UI数据。比如,播放设备1000是智能电视,控制设备100对播放端界面描述文件中与智能电视对应的代码段进行解析执行,形成向智能电视投屏的播放端UI数据;播放设备1000是平板电脑,控制设备100对播放端界面描述文件中与平板电脑对应的代码段进行解析执行,形成向平板电脑投屏的播放端UI数据。之后,数据绑定13b调用MVVM框架11c将播放端UI数据中的对象与后台数据(比如,控件模型)进行数据绑定。这样,如果后台数据发生变化可以刷新对应的播放端UI数据,播放端UI数据改变刷新对应的后台数据。进一步的,控制设备100将播放端界面描述文件以及资源文件(包括播放端界面描述文件中关联的数据资源)通过数据收发13d发送给播放设备1000。在一种实现方式中,控制设备100将播放端界面描述文件进行编码,布局信息、资源值、数据、响应事件定义等数据编码后,经数据传输通道传输至播放设备1000;特定类型的数据资源(比如,数据量大于设定值的数据、图片、视频等),经特定数据传输通道传输至播放设备1000。可选的,特定类型的数据资源可以在发送播放端界面描述文件之前传输至播放设备1000,这样,提高了将播放端界面描述文件传输至播放设备1000的速率,缩短播放设备1000显示播放端UI的时延。控制设备100还初始化事件代理13f,建立控制设备100和播放设备1000之间的事件传输通道,用于传输事件信息。
播放设备1000通过数据收发13d接收到播放端界面描述文件以及数据资源,播放设备1000的OS中的虚拟控件构建13a通过调用自定义UI引擎11解析和执行播放端界面描述文件中播放设备1000的设备类型对应的代码段,根据播放端界面描述文件中播放设备1000的设备类型对应的代码段进行控件构建,按照该代码段中的布局编排生成控件、控件组等,形成播放端UI数据(包括控件和控件的布局等信息);并按照该播放端UI数据进行显示,即显示播放端UI。由于控制设备100生成播放端UI数据与播放设备1000生成播放端UI数据使用的是同一段代码段,播放设备1000生成的播放端UI上的控件与控制设 备100生成的播放端UI数据中的控件是一一对应的。
生成播放端UI后,播放设备1000在显示屏显示与播放设备屏幕的形态和尺寸相匹配的播放端UI。示例性的,如图36所示,智能电视作为播放设备时,智能电视解析和执行播放端界面描述文件中layout-data-television代码段,生成智能电视相应的播放端UI;平板电脑作为播放设备时,平板电脑解析和执行播放端界面描述文件中layout-data-pad代码段,生成平板电脑相应的播放端UI。
可选的,在一些实施例中,如图37B所示,控制设备100根据播放端界面描述文件中播放设备1000的设备类型对应的代码段进行控件构建,按照该代码段中的布局编排生成播放端UI数据之后,将生成的播放端UI数据发送给播放设备1000。播放设备1000接收到播放端UI数据后,根据该播放端UI数据显示播放端UI。
本申请实施例提供的用户接口界面实现方法,播放设备根据播放端界面描述文件中播放设备的设备类型对应的代码段,生成与该播放设备相对应的播放端UI。不同类型的播放设备显示的播放端UI与其屏幕的形状和尺寸相匹配。并且,开发者在开发App时,可以方便的进行播放端UI开发,在App的播放端界面描述文件中定义各种类型的控件(包括所有
Android
原生控件以及操作系统中拓展的控件,还支持开发者在App中自定义的或者通过静态包集成的控件),支持的控件类型多样;使得各种类型的App都可以支持投屏功能,并且播放端UI支持的控件类型更丰富多样,方便用户使用。
在一种实现方式中,控制设备100的OS中的虚拟控件构建13a解析和执行播放端界面描述文件中播放设备1000的设备类型对应的代码段,根据播放端界面描述文件中播放设备1000的设备类型对应的代码段构建播放端UI数据。该播放端UI数据不在控制设备100的显示屏进行显示(即播放端UI数据存在于App进程内,不向显示进程发送)。控制设备100显示的是根据界面描述文件生成的UI。控制设备100的界面投屏至播放设备1000后,用户可以在控制设备100上进行其他操作,播放设备1000正常播放投屏内容。
示例性的,如图38A所示,手机显示“视频”App的界面1210,界面1210包括“投屏”按钮1211。手机接收用户对“投屏”按钮1211的点击操作,根据用户的输入确定播放设备。手机根据用户输入,向智能电视投屏。如图37A或图37B所示,手机根据播放端界面描述文件生成播放端UI数据,并向智能电视发送播放端界面描述文件。智能电视根据播放端界面描述文件生成播放端UI数据,并按照播放端UI数据显示播放端UI。请参考图38A,智能电视显示播放端UI 1220。
之后,用户可以继续对手机进行其他操作。示例性的,如图38A所示,手机接收用户对图片1212的点击操作,响应于用户对图片1212的点击操作,显示“视频”App的界面1230。
这样,控制设备向播放设备投屏后,可以继续执行其他功能,与播放设备播放投屏内容独立运行、互不影响,实现设备间更好地协同工作。
示例性的,如图38B所示,控制设备100的OS中的自定义UI引擎11解析和执行界面描述文件,生成应用的UI,并显示该应用的UI。MVVM框架11c将应用的UI与后台数据(比如,控件模型)进行数据绑定。
控制设备100的OS中的虚拟控件构建13a通过调用自定义UI引擎11解析和执行播放端界面描述文件中播放设备1000的设备类型对应的代码段,根据播放端界面描述文件 中播放设备1000的设备类型对应的代码段进行控件构建,按照该代码段中的布局编排生成控件、控件组等,形成播放端UI数据。之后,数据绑定13b调用MVVM框架11c将播放端UI数据中的对象与后台数据(比如,控件模型)进行数据绑定。进一步的,控制设备100将播放端界面描述文件以及资源文件(包括播放端界面描述文件中关联的数据资源)通过数据收发13d发送给播放设备1000;或者将播放端UI数据发送给播放设备1000,这样播放设备1000可以根据播放端UI数据显示播放端UI。
在另一种实现方式中,控制设备100的OS中的虚拟控件构建13a解析和执行播放端界面描述文件中播放设备1000的设备类型对应的代码段,根据播放端界面描述文件中播放设备1000的设备类型对应的代码段构建播放端UI数据。App进程将播放端UI数据发送给显示进程,在控制设备100的显示屏上显示播放端UI。控制设备100和播放设备1000显示的是根据同一段代码段生成的播放端UI。
示例性的,如图39A所示,手机显示“视频”App的界面1210,界面1210包括“投屏”按钮1211。手机接收用户对“投屏”按钮1211的点击操作,根据用户的输入确定播放设备。手机根据用户输入,向智能电视投屏。如图39B所示,手机根据播放端界面描述文件生成播放端UI数据,并按照播放端UI数据显示播放端UI。手机还向智能电视发送播放端界面描述文件。智能电视根据播放端界面描述文件生成播放端UI数据,并按照播放端UI数据显示播放端UI。
如图39A所示,手机向智能电视投屏之后,手机也显示播放端UI,手机和智能电视都显示播放端UI 1220。
这样,控制设备与播放设备同步播放播放端UI,可以实现镜像投屏,控制设备与播放设备协同工作。
在一些实施例中,控制设备100接收到用户在播放端UI上的操作或业务数据发生变化时,数据绑定13b调用MVVM框架11c更新播放端UI数据。由于播放端UI的控件与播放端UI数据中控件存在一一对应关系,播放端UI数据更新触发播放端UI更新。这样,控制设备100接收到用户操作或者业务数据发生变化,播放设备1000的播放端UI可以同步更新。
示例性的,请参考图40A,手机显示“视频”App的UI 1310。手机根据用户输入,将“视频”App的UI 1310投屏至智能电视。智能电视显示“视频”App的播放端UI 1320,播放端UI 1320包括“播放”按钮1321。
用户可以在智能电视上执行开启播放视频的操作(比如,用户通过智能电视的遥控器选中“播放”按钮1321,并点击“播放”按钮1321)。智能电视接收到用户对“播放”按钮1321的点击操作,响应于用户对“播放”按钮1321的点击操作,执行播放视频,并且显示更新的UI 1320。更新后的播放端UI 1320包括“暂停”按钮1322。
在一种实现方式中,如图40B所示,事件代理13f中定义了一个专用的事件传输类,该事件传输类用于跨设备传输。该事件传输类中存储多个事件,其中每个事件包括布局标识、控件标识、事件类型等信息。播放设备1000接收到用户在播放端UI上的操作,在事件传输类中产生该操作对应的事件,并通过事件传输通道传输给控制设备100。控制设备100接收到该事件后,根据布局标识和控件标识获取对应的控件,并根据作用于该控件上的该事件执行相应的业务逻辑。由于播放端UI的控件与播放端UI数据中控件存在一一对 应关系,控制设备100还更新后台数据,后台数据改变触发播放端UI数据更新。控制设备100将更新后的播放端UI数据发送给播放设备1000,播放设备1000按照更新的播放端UI数据显示更新后的播放端UI。
这样,用户可以在播放设备上控制App,由控制设备执行相应的业务逻辑,并更新播放设备上的播放端UI。在一些示例中,如果控制设备和播放设备镜像显示播放端UI,还可以同步更新控制设备上的UI,方便用户使用。并且,对于用户在播放设备上的操作,由控制设备执行相关业务逻辑,控制设备统一控制播放设备,方便管理;且避免播放设备性能较低,不支持复杂的业务逻辑处理。
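事件代理13f所述的事件传输类，可以用如下Python代码示意。类名RemoteEvent及各字段名均为假设，仅示意"布局标识+控件标识+事件类型"的事件结构及其跨设备编解码：

```python
import json

class RemoteEvent:
    """跨设备事件传输类示意：携带布局标识、控件标识和事件类型。"""
    def __init__(self, layout_id, control_id, event_type):
        self.layout_id = layout_id
        self.control_id = control_id
        self.event_type = event_type

    def encode(self):
        # 编码为文本，便于经事件传输通道发送
        return json.dumps(self.__dict__)

    @staticmethod
    def decode(data):
        return RemoteEvent(**json.loads(data))

# 播放设备端：用户点击“播放”按钮，产生事件并经事件传输通道发送
payload = RemoteEvent("ui_1320", "btn_play", "onClick").encode()

# 控制设备端：收到事件后根据布局标识和控件标识找回控件并执行业务逻辑
event = RemoteEvent.decode(payload)
```

独立的事件传输通道使事件不会被播放端界面描述文件、资源等数据的传输所阻塞。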
在一些实施例中,播放设备1000接收到用户在播放端UI上的第二操作。播放设备1000从控制设备100获取更新的播放端界面描述文件,并生成更新的播放端UI。
示例性的,如图40C,手机显示“视频”App的UI 1330。手机根据用户输入,将“视频”App的UI 1330投屏至智能电视。智能电视显示“视频”App的播放端UI 1340。智能电视接收到用户在播放端UI 1340上的第二操作(比如,用户通过遥控器将播放端UI 1340上的焦点从“电影”移至“综艺”)。响应于用户在播放端UI 1340上的第二操作,智能电视显示更新的播放端UI,即“视频”App的播放端UI 1350。
在一种实现方式中，如图40D所示，播放设备1000接收到用户在播放端UI上的第二操作，在事件传输类中产生该第二操作对应的事件，并通过事件传输通道传输给控制设备100。控制设备100接收到该事件后，根据布局标识和控件标识获取对应的控件，并根据作用于该控件上的该事件执行相应的业务逻辑。控制设备100确定将播放端UI更新为焦点为“综艺”的播放端UI，获取焦点为“综艺”的播放端UI对应的播放端界面描述文件二。控制设备100的OS中的虚拟控件构建13a通过调用自定义UI引擎11，解析和执行播放端界面描述文件二中播放设备1000的设备类型对应的代码段，根据播放端界面描述文件二中播放设备1000的设备类型对应的代码段进行控件构建，按照该代码段中的布局编排生成控件、控件组等，形成播放端UI数据二。之后，数据绑定13b调用MVVM框架11c将播放端UI数据二中的对象与后台数据（比如，控件模型）进行数据绑定。进一步的，控制设备100将播放端界面描述文件二以及资源文件（包括播放端界面描述文件二中关联的数据资源）通过数据收发13d发送给播放设备1000。在一种实现方式中，控制设备100将播放端界面描述文件二进行编码，布局信息、资源值、数据、响应事件定义等数据编码后，经数据传输通道传输至播放设备1000；特定类型的数据资源（比如，数据量大于设定值的数据、图片、视频等），经特定数据传输通道传输至播放设备1000。播放设备1000通过数据收发13d接收到播放端界面描述文件二以及资源文件，播放设备1000的OS中的虚拟控件构建13a通过调用自定义UI引擎解析和执行播放端界面描述文件二中播放设备1000的设备类型对应的代码段，根据播放端界面描述文件二中播放设备1000的设备类型对应的代码段进行控件构建，按照该代码段中的布局编排生成控件、控件组等，形成播放端UI数据二（包括控件和控件的布局等信息）；并按照该播放端UI数据二进行显示，即显示更新的播放端UI。
这样,用户可以直接在播放设备上对播放端UI进行操作,控制设备执行该操作对应的业务逻辑,并向播放设备发送更新的播放端UI对应的播放端界面描述文件,播放设备根据更新的播放端界面描述文件生成更新的播放端UI。实现在播放设备上直接对播放端 UI进行操作,成功切换播放端UI。
请参考图41A,其示出了本申请实施例提供的用户接口界面实现方法中控制设备的处理流程的一种示例。控制设备安装了App后,投屏框架、MVVM框架、后台数据等进行初始化。然后资源传输模块传输App相关的数据资源,并将数据资源绑定投屏服务。投屏框架从应用安装包中获取播放端界面描述文件,并发送给虚拟控件构建模块。虚拟控件构建模块根据播放端界面描述文件进行控件构建,形成播放端UI数据;并将播放端UI数据绑定投屏服务。之后,虚拟控件构建模块通知数据绑定模块进行播放端UI数据与后台数据的绑定,数据绑定模块调动MVVM框架将播放端UI数据与后台数据进行绑定。投屏服务还绑定事件代理。进一步的,播放端界面描述文件经编码后发送给播放设备。这样,播放设备接收到编码的播放端界面描述文件后,可以根据播放端界面描述文件生成播放端UI并显示。
投屏框架接收到播放设备发送的事件,将事件发送给MVVM框架;MVVM框架根据事件更新后台数据。当App的业务数据发生变化,后台数据变化引起MVVM框架更新播放端UI数据。控制设备向播放设备发送更新的播放端UI数据。这样,播放设备接收到更新的播放端UI数据后,可以按照更新的播放端UI数据显示更新的播放端UI。
请参考图41B,其示出了本申请实施例提供的用户接口界面实现方法中播放设备的处理流程的一种示例。播放设备通过传输通道接收到控制设备发送的编码的播放端界面描述文件。投屏框架调用自定义UI引擎对播放端界面描述文件进行解析执行,形成播放端UI数据,并按照播放端UI数据显示播放端UI。事件代理模块接收到用户在播放端UI上的操作,生成对应的事件,并向控制设备传输该事件,以使得控制设备处理该事件。
本申请实施例提供一种用户接口界面实现方法,控制设备运行App的过程中,如果满足预设条件,控制设备将预设信息推送至播放设备进行播放。
以手机作为控制设备,智能手表作为播放设备为例。示例性的,如图42A,用户在手机上打开“外卖”App进行订餐,下单支付。可选的,App切换至后台运行。满足预设条件(比如,手机确定外卖订单预计再过20分钟送达;再比如,用户在智能手表上进行查询操作)时,手机将预设信息推送至智能手表进行显示。比如,如图42A所示,智能手表显示播放端UI 1410;播放端UI 1410上包括“外卖订单进展”控件1411,提示信息1412。
在一种实现方式中,开发者在开发阶段定义满足预设条件时向播放端推送信息的播放端界面描述文件(或播放端界面描述文件中一段代码)。在该播放端界面描述文件中定义了智能手表的播放端UI包括控件1411和提示信息1412。手机运行“外卖”App过程(包括“外卖”App切换至后台运行)中,手机确定满足预设条件,读取指定的代码段,根据指定的代码段生成播放端UI数据,并将该指定的代码段(或生成的播放端UI数据)发送给智能手表。智能手表根据指定的代码段(或生成的播放端UI数据)生成播放端UI 1410。
示例性的,如图42B,用户在手机上使用导航App进行导航。满足预设条件(比如,改变前进方向)时,手机根据指定的代码段生成播放端UI数据,并将该指定的代码段(或生成的播放端UI数据)发送给智能手表。智能手表根据指定的代码段(或生成的播放端UI数据)生成播放端UI 1420。
进一步的,智能手表接收用户作用于智能手表的操作,生成该操作对应的事件,并将事件发送给手机进行处理。手机进行业务逻辑处理,执行相应的动作;并更新播放端UI 数据。手机还将更新后的播放端UI数据发送给智能手表,智能手表根据更新后的播放端UI数据更新播放端UI。
本申请实施例提供的用户接口界面实现方法,在满足预设条件时,控制设备自动将运行的App的部分信息推送至播放设备进行播放。不同类型的播放设备可以读取该设备类型对应的代码段,可以方便的实现设备差异化布局播放端UI。并且用户可以在播放设备上控制App,由控制设备进行业务逻辑处理;这样可以提高用户的使用体验,避免播放设备性能较低,不支持复杂的业务逻辑处理。
可以理解的是,上述电子设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请实施例能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。本领域技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请实施例的范围。
本申请实施例可以根据上述方法示例对上述电子设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
如图43所示,本申请实施例公开了一种电子设备1500,该电子设备可以为运行上述开发工具的电子设备,或上述实施例中运行App的电子设备,或上述实施例中运行应用小组件的电子设备;该电子设备可以为上述控制设备或播放设备。该电子设备具体可以包括:显示屏1501;输入设备1502(例如鼠标、键盘或触摸屏等);一个或多个处理器1503;存储器1504;一个或多个应用程序(未示出);以及一个或多个计算机程序1505,上述各器件可以通过一个或多个通信总线1506连接。其中,上述一个或多个计算机程序1505被存储在上述存储器1504中并被配置为被该一个或多个处理器1503执行,该一个或多个计算机程序1505包括指令,该指令可以用于执行上述实施例中的相关步骤。在一种示例中,该电子设备1500可以为图1中电子设备100或电子设备200。在一种示例中,该电子设备1500可以为图14中开发者设备或用户侧电子设备。在一种示例中,该电子设备1500可以为图23中电子设备100或电子设备200。在一种示例中,该电子设备1500可以为图33中控制设备100或电子设备200或播放设备1000。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序代码,当处理器执行该计算机程序代码时,电子设备执行上述实施例中的方法。
本申请实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述实施例中的方法。
其中,本申请实施例提供的电子设备1500、计算机可读存储介质或者计算机程序产品均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述 功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以使用硬件的形式实现,也可以使用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (87)

  1. 一种用户接口界面实现方法,其特征在于,包括:
    电子设备安装第一应用的应用安装包,所述应用安装包包括第一描述文件和第二描述文件;所述第一描述文件和所述第二描述文件用于对所述第一应用的第一用户接口界面UI进行界面描述和界面行为定义;所述第一描述文件采用第一界面描述语言,所述第二描述文件采用第二界面描述语言;所述第一界面描述语言与所述第二界面描述语言不同;
    所述电子设备运行所述第一应用;其中,所述电子设备的第一UI引擎读取所述第一描述文件,并解析和执行所述第一描述文件,生成所述第一UI的第一部分;所述电子设备的第二UI引擎读取所述第二描述文件,并解析和执行所述第二描述文件,生成所述第一UI的第二部分;
    所述电子设备显示所述第一UI。
  2. 根据权利要求1所述的方法,其特征在于,所述电子设备的第一UI引擎生成所述第一UI的第一部分包括:
    所述电子设备的第一UI引擎根据所述第一描述文件生成所述第一UI中一个或多个第一控件;所述一个或多个第一控件具备第一UI编程能力。
  3. 根据权利要求2所述的方法,其特征在于,所述电子设备的第二UI引擎生成所述第一UI的第二部分包括:
    所述电子设备的第二UI引擎根据所述第二描述文件对一个或多个所述第一控件应用第二UI编程能力。
  4. 根据权利要求2或3所述的方法,其特征在于,所述电子设备的第二UI引擎生成所述第一UI的第二部分包括:
    所述电子设备的第二UI引擎根据所述第二描述文件生成所述第一UI中一个或多个第二控件;所述一个或多个第二控件具备第二UI编程能力。
  5. 根据权利要求3或4所述的方法,其特征在于,所述第二UI编程能力包括:视觉属性能力,布局能力,统一交互能力和动效能力中至少一种。
  6. 根据权利要求5所述的方法,其特征在于,所述布局能力包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  7. 根据权利要求1-6任意一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备确定存在所述第二描述文件,所述第二UI引擎触发所述第一UI引擎读取所述第一描述文件,所述第二UI引擎读取所述第二描述文件。
  8. 根据权利要求1-7任意一项所述的方法,其特征在于,所述第一描述文件与所述第二描述文件在所述应用安装包中不同路径。
  9. 根据权利要求1-8任意一项所述的方法,其特征在于,所述方法还包括:
    所述第二UI引擎对所述第二界面描述语言进行语法校验;
    若所述语法校验通过,所述第二UI引擎解析和执行所述第二描述文件。
  10. 根据权利要求1-9任意一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备的第二UI引擎实现器件事件与所述第二描述文件中用户行为之间的映射;
    响应于所述器件事件,执行所述第二描述文件中用户行为对应的控件动作。
  11. 根据权利要求1-10任意一项所述的方法,其特征在于,所述第二UI引擎包括所述第二描述文件中字段的语法语义规范集合。
  12. 一种用户接口界面实现方法,其特征在于,包括:
    显示第一应用的开发界面;所述第一应用的开发界面包括第一描述文件和第二描述文件;所述第一描述文件和所述第二描述文件用于对所述第一应用的第一用户接口界面UI进行界面描述和界面行为定义;所述第一描述文件采用第一界面描述语言,所述第二描述文件采用第二界面描述语言;所述第一界面描述语言与所述第二界面描述语言不同;
    响应于用户输入的第一操作,在所述第一描述文件中增加对所述第一UI的第一部分的描述;
    响应于用户输入的第二操作,在所述第二描述文件中增加对所述第一UI的第二部分的描述;
    根据所述第一描述文件和所述第二描述文件生成所述第一应用的应用安装包。
  13. 根据权利要求12所述的方法,其特征在于,所述在所述第一描述文件中增加对所述第一UI的第一部分的描述包括:
    在所述第一描述文件中增加对所述第一UI中一个或多个第一控件的描述;
    对所述一个或多个第一控件应用第一UI编程能力。
  14. 根据权利要求13所述的方法,其特征在于,所述在所述第二描述文件中增加对所述第一UI的第二部分的描述包括:
    在所述第二描述文件中增加对一个或多个所述第一控件的描述;
    对一个或多个所述第一控件应用第二UI编程能力。
  15. 根据权利要求13或14所述的方法,其特征在于,所述在所述第二描述文件中增加对所述第一UI的第二部分的描述包括:
    在所述第二描述文件中增加对一个或多个第二控件的描述;
    对所述一个或多个第二控件应用第二UI编程能力。
  16. 根据权利要求14或15所述的方法,其特征在于,所述第二UI编程能力包括:视觉属性能力,布局能力,统一交互能力和动效能力中至少一种。
  17. 根据权利要求16所述的方法,其特征在于,所述布局能力包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  18. 根据权利要求12-17任意一项所述的方法,其特征在于,所述第一描述文件与所述第二描述文件在所述应用安装包中不同路径。
  19. 一种电子设备,其特征在于,包括:
    一个或多个处理器;
    显示屏;
    存储器;
    其中,所述存储器中存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行如权利要求1-18中任意一项所述的方法。
  20. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在 于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求12-18中任意一项所述的方法。
  21. 一种计算机可读存储介质,其特征在于,包括计算机指令,所述计算机指令用于对第一应用的第一用户接口界面UI进行界面描述和界面行为定义,其中,
    所述计算机指令包括存储于第一描述文件中的第一指令,以及存储于第二描述文件中的第二指令;
    所述第一描述文件采用第一界面描述语言,所述第二描述文件采用第二界面描述语言;所述第一界面描述语言与所述第二界面描述语言不同;
    所述第一指令用于描述所述第一UI的第一部分,所述第二指令用于描述所述第一UI的第二部分。
  22. 根据权利要求21所述的计算机可读存储介质,其特征在于,所述第一指令具体用于:
    描述所述第一UI中一个或多个第一控件,对所述一个或多个第一控件应用第一UI编程能力。
  23. 根据权利要求22所述的计算机可读存储介质,其特征在于,所述第二指令具体用于:
    对所述一个或多个第一控件应用第二UI编程能力。
  24. 根据权利要求22或23所述的计算机可读存储介质,其特征在于,所述第二指令具体用于:
    描述所述第一UI中一个或多个第二控件,对所述一个或多个第二控件应用第二UI编程能力。
  25. 根据权利要求23或24所述的计算机可读存储介质,其特征在于,所述第二UI编程能力包括:视觉属性能力,布局能力,统一交互能力和动效能力中至少一种。
  26. 根据权利要求25所述的计算机可读存储介质,其特征在于,所述布局能力包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  27. 根据权利要求21-26任意一项所述的计算机可读存储介质,其特征在于,所述第一描述文件与所述第二描述文件在所述计算机可读存储介质中不同路径。
  28. 一种用户接口界面实现方法,其特征在于,包括:
    第一电子设备和第二电子设备分别从服务器下载第一应用的应用安装包;所述应用安装包包括描述文件和资源文件;所述描述文件用于对所述第一应用的第一用户接口界面UI进行界面描述和界面行为定义;所述资源文件包括生成所述第一应用的UI使用的资源;
    所述第一电子设备和所述第二电子设备分别安装所述应用安装包;
    所述第一电子设备读取所述描述文件中与所述第一电子设备的设备类型对应的第一代码,按照所述第一代码的定义使用所述资源文件的资源生成所述第一电子设备的第一UI;
    所述第二电子设备读取所述描述文件中与所述第二电子设备的设备类型对应的第二代码,按照所述第二代码的定义使用所述资源文件的资源生成所述第二电子设备的第一UI;
    所述第一电子设备的设备类型与所述第二电子设备的设备类型不同。
  29. 根据权利要求28所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备按照所述描述文件中第三代码的定义生成所述第一电子设备的 第一UI中第一控件,所述第一电子设备的第一UI中第一控件具备所述第一电子设备的操作系统自定义的控件属性;所述第三代码是所述第一代码的部分或全部;
    第三电子设备按照所述描述文件中第三代码的定义生成所述第三电子设备的第一UI中第一控件,所述第三电子设备的第一UI中第一控件具备通用操作系统的控件属性。
  30. 一种用户接口界面实现方法,其特征在于,包括:
    第一电子设备下载第一应用的应用安装包;所述应用安装包包括描述文件和资源文件;所述描述文件用于对所述第一应用的第一用户接口界面UI进行界面描述和界面行为定义;所述资源文件包括生成所述第一应用的UI使用的资源;
    所述第一电子设备安装所述应用安装包;
    所述第一电子设备读取所述描述文件中与所述第一电子设备的设备类型对应的第一代码,按照所述第一代码的定义使用所述资源文件的资源生成所述第一电子设备的第一UI。
  31. 根据权利要求30所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备按照所述描述文件中第三代码的定义生成所述第一电子设备的第一UI中第一控件,所述第一电子设备的第一UI中第一控件具备所述第一电子设备的操作系统自定义的控件属性;所述第三代码是所述第一代码的部分或全部。
  32. 根据权利要求29或31所述的方法,其特征在于,所述第一电子设备的操作系统包括自定义UI编程能力,所述自定义UI编程能力用于提供所述第一电子设备的操作系统自定义的控件属性。
  33. 根据权利要求32所述的方法,其特征在于,所述自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
  34. 根据权利要求33所述的方法,其特征在于,所述布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  35. 根据权利要求28-34任意一项所述的方法,其特征在于,所述第一电子设备的第一UI包括第二控件,所述第二控件具备通用操作系统的控件属性。
  36. 根据权利要求28-35任意一项所述的方法,其特征在于,所述第一电子设备的第一UI包括第三控件,所述第三控件具备所述第一应用中自定义的控件属性。
  37. 根据权利要求28-36任意一项所述的方法,其特征在于,所述描述文件包括第四代码,所述第四代码用于定义所述第一电子设备的第一UI中第四控件的控件属性与所述第一电子设备的操作系统中第一数据的对应关系,
    所述方法还包括:
    所述第一电子设备接收用户在所述第四控件上的第一输入;
    根据所述第一输入修改所述第一数据的值。
  38. 根据权利要求37所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备的第一UI中第四控件的控件属性随着所述第一电子设备的操作系统中第一数据的改变而改变。
  39. 一种用户接口界面实现方法,其特征在于,包括:
    显示第一应用的开发界面;所述第一应用的开发界面包括描述文件;所述描述文件用于对所述第一应用的第一用户接口界面UI进行界面描述和界面行为定义;
    响应于用户输入的第一操作,在所述描述文件中增加与第一电子设备的设备类型对应 的第一代码;
    响应于用户输入的第二操作,在所述描述文件中增加与第二电子设备的设备类型对应的第二代码;所述第一电子设备的设备类型与所述第二电子设备的设备类型不同;
    根据所述描述文件生成所述第一应用的应用安装包。
  40. 根据权利要求39所述的方法,其特征在于,所述第一应用的应用安装包还包括资源文件,所述资源文件包括生成所述第一应用的UI使用的资源。
  41. 根据权利要求39或40所述的方法,其特征在于,
    所述描述文件中包括定义第一控件具备所述第一电子设备的操作系统自定义的控件属性的第三代码,以及定义第二控件具备通用操作系统的控件属性的第四代码。
  42. 根据权利要求41所述的方法,其特征在于,所述第一电子设备的操作系统自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
  43. 根据权利要求42所述的方法,其特征在于,所述布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  44. 一种电子设备,其特征在于,包括:
    一个或多个处理器;
    显示屏;
    存储器;
    其中,所述存储器中存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行如权利要求30-43中任意一项所述的方法。
  45. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求39-43中任意一项所述的方法。
  46. 一种计算机可读存储介质,其特征在于,包括计算机指令,所述计算机指令用于对第一应用的第一用户接口界面UI进行界面描述和界面行为定义;其中,所述计算机指令包括与第一电子设备的设备类型对应的第一代码以及与第二电子设备的设备类型对应的第二代码;所述第一电子设备的设备类型与所述第二电子设备的设备类型不同。
  47. 根据权利要求46所述的计算机可读存储介质,其特征在于,所述计算机指令还包括生成所述第一应用的UI使用的资源。
  48. 根据权利要求46或47所述的计算机可读存储介质,其特征在于,所述计算机指令还包括定义第一控件具备所述第一电子设备的操作系统自定义的控件属性的第三代码,以及定义第二控件具备通用操作系统的控件属性的第四代码。
  49. 一种用户接口界面实现方法,其特征在于,包括:
    电子设备的第一应用进程读取组件界面描述文件,所述组件界面描述文件用于对所述第一应用的应用小组件的第一用户接口界面UI进行界面描述和界面行为定义;
    所述第一应用进程根据所述组件界面描述文件生成第一小组件UI数据,将所述第一小组件UI数据中的控件与所述电子设备操作系统中的后台数据进行绑定;
    所述第一应用进程向应用小组件进程发送第一数据;
    所述应用小组件进程接收所述第一数据,根据所述第一数据获取所述第一小组件UI数据;按照所述第一小组件UI数据显示所述应用小组件的第一UI。
  50. 根据权利要求49所述的方法,其特征在于,所述第一数据为所述组件界面描述文件,所述应用小组件进程根据所述第一数据获取所述第一小组件UI数据包括:
    所述应用小组件进程根据所述组件界面描述文件生成所述第一小组件UI数据。
  51. 根据权利要求49所述的方法,其特征在于,所述第一数据为第一小组件UI数据。
  52. 根据权利要求50或51所述的方法,其特征在于,所述方法还包括:
    所述第一应用进程按照所述组件界面描述文件中第一代码的定义生成所述第一小组件UI数据中第一控件,所述第一控件具备所述电子设备的操作系统原生的控件属性。
  53. 根据权利要求52所述的方法,其特征在于,所述操作系统原生的控件包括:
    输入框,复选框,滑动选择器,滚动视图,单选按钮,评分条,搜索框,拖动条,或开关。
  54. 根据权利要求49-53任意一项所述的方法,其特征在于,所述方法还包括:
    所述第一应用进程按照所述组件界面描述文件中第二代码的定义生成所述第一小组件UI数据中第二控件,所述第二控件具备所述电子设备的操作系统中自定义的控件属性。
  55. 根据权利要求54所述的方法,其特征在于,所述自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
  56. 根据权利要求55所述的方法,其特征在于,所述布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  57. 根据权利要求49-56任意一项所述的方法,其特征在于,所述组件界面描述文件包括第三代码,所述第三代码用于定义所述应用小组件的第一UI中第三控件的控件属性与所述电子设备的操作系统中第一数据的对应关系,
    所述方法还包括:
    所述电子设备接收用户在所述第三控件上的第一输入;
    根据所述第一输入修改所述第一数据的值。
  58. 根据权利要求57所述的方法,其特征在于,所述方法还包括:
    所述应用小组件的第一UI中第三控件的控件属性随着所述电子设备的操作系统中第一数据的改变而改变。
  59. 根据权利要求49-58任意一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备从服务器下载所述第一应用的应用安装包;所述应用安装包包括所述组件界面描述文件;
    所述电子设备使用所述应用安装包安装所述第一应用。
  60. 一种用户接口界面实现方法,其特征在于,包括:
    显示第一应用的开发界面;所述第一应用的开发界面包括组件界面描述文件;所述组件界面描述文件用于对所述第一应用的应用小组件的第一用户接口界面UI进行界面描述和界面行为定义;
    响应于用户输入的第一操作,在所述组件界面描述文件中增加定义所述第一小组件 UI中第一控件的第一代码;所述第一控件具备操作系统原生的控件属性;其中,所述操作系统原生的控件包括:输入框,复选框,滑动选择器,滚动视图,单选按钮,评分条,搜索框,拖动条,或开关;
    根据所述组件界面描述文件生成所述第一应用的应用安装包。
  61. 根据权利要求60所述的方法,其特征在于,所述方法还包括:
    响应于用户输入的第二操作,在所述组件界面描述文件中增加定义所述第一小组件UI中第二控件的第二代码,所述第二控件具备所述操作系统中自定义的控件属性;所述自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
  62. 根据权利要求61所述的方法,其特征在于,所述布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  63. 一种电子设备,其特征在于,包括:
    一个或多个处理器;
    显示屏;
    存储器;
    其中,所述存储器中存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行如权利要求49-62中任意一项所述的方法。
  64. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求60-62中任意一项所述的方法。
  65. 一种计算机可读存储介质,其特征在于,包括计算机指令,所述计算机指令用于对第一应用的应用小组件的第一用户接口界面UI进行界面描述和界面行为定义;
    其中,所述计算机指令包括生成所述第一小组件UI中第一控件的第一代码,所述第一控件具备操作系统原生的控件属性;
    所述操作系统原生的控件包括:输入框,复选框,滑动选择器,滚动视图,单选按钮,评分条,搜索框,拖动条,或开关。
  66. 根据权利要求65所述的计算机可读存储介质,其特征在于,
    所述计算机指令还包括生成所述第一小组件UI中第二控件的第二代码,所述第二控件具备所述操作系统中自定义的控件属性;
    所述自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
  67. 根据权利要求66所述的计算机可读存储介质,其特征在于,所述布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  68. 一种用户接口界面实现方法,其特征在于,包括:
    第一电子设备读取第一应用的第一播放端界面描述文件,所述第一播放端界面描述文件用于对在第二电子设备上播放所述第一应用的第一播放端用户接口界面UI进行界面描述和界面行为定义;
    所述第一电子设备根据所述第一播放端界面描述文件生成第一播放端UI数据,将所 述第一播放端UI数据中的控件与所述第一电子设备操作系统中的后台数据进行绑定;
    所述第一电子设备向所述第二电子设备发送第一数据;
    所述第二电子设备接收所述第一数据,根据所述第一数据获取所述第一播放端UI数据,按照所述第一播放端UI数据显示所述第一播放端UI。
  69. 根据权利要求68所述的方法,其特征在于,所述第一数据为所述第一播放端界面描述文件,所述第二电子设备根据所述第一数据获取所述第一播放端UI数据包括:
    所述第二电子设备根据所述第一播放端界面描述文件生成所述第一播放端UI数据。
  70. 根据权利要求68所述的方法,其特征在于,所述第一数据为所述第一播放端UI数据。
  71. 根据权利要求68-70任意一项所述的方法,其特征在于,所述方法还包括:
    所述第二电子设备接收用户在所述第一播放端UI上的第一操作;
    响应于用户在所述第一播放端UI上的第一操作,所述第二电子设备向所述第一电子设备发送第一指令;
    所述第一电子设备接收到所述第一指令,读取第二播放端界面描述文件;所述第二播放端界面描述文件用于对在第二电子设备上播放所述第一应用的第二播放端UI进行界面描述和界面行为定义;
    所述第一电子设备根据所述第二播放端界面描述文件生成第二播放端UI数据,将所述第二播放端UI数据中的控件与所述第一电子设备操作系统中的后台数据进行绑定;
    所述第一电子设备向所述第二电子设备发送所述第二播放端界面描述文件;
    所述第二电子设备接收所述第二播放端界面描述文件,根据所述第二播放端界面描述文件生成第二播放端UI数据,按照所述第二播放端UI数据显示所述第二播放端UI。
  72. 根据权利要求68-70任意一项所述的方法,其特征在于,所述方法还包括:
    所述第二电子设备接收用户在所述第一播放端UI上的第一操作;
    响应于用户在所述第一播放端UI上的第一操作,所述第二电子设备向所述第一电子设备发送第一指令;
    所述第一电子设备接收到所述第一指令,读取第二播放端界面描述文件;所述第二播放端界面描述文件用于对在第二电子设备上播放所述第一应用的第二播放端UI进行界面描述和界面行为定义;
    所述第一电子设备根据所述第二播放端界面描述文件生成第二播放端UI数据,将所述第二播放端UI数据中的控件与所述第一电子设备操作系统中的后台数据进行绑定;
    所述第一电子设备向所述第二电子设备发送所述第二播放端UI数据;
    所述第二电子设备接收所述第二播放端UI数据,按照所述第二播放端UI数据显示所述第二播放端UI。
  73. 根据权利要求68-72任意一项所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备从服务器下载第一应用的应用安装包;所述应用安装包包括第一播放端界面描述文件和资源文件;所述资源文件包括生成所述第一应用的播放端UI使用的资源;
    所述第一电子设备使用所述应用安装包安装所述第一应用。
  74. 根据权利要求68-73任意一项所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备读取所述第一播放端界面描述文件中与第三电子设备的设备类型对应的第一代码,按照所述第一代码的定义使用所述资源文件的资源生成第三播放端UI数据;
    所述第一电子设备读取所述第一播放端界面描述文件中与第四电子设备的设备类型对应的第二代码,按照所述第二代码的定义使用所述资源文件的资源生成第四播放端UI数据;所述第四电子设备的设备类型与所述第三电子设备的设备类型不同;
    所述第一电子设备分别向所述第三电子设备和所述第四电子设备发送所述第一播放端界面描述文件和所述资源文件;
    所述第三电子设备根据所述第一播放端界面描述文件中与第三电子设备的设备类型对应的第一代码的定义使用所述资源文件的资源生成第三播放端UI数据;按照所述第三播放端UI数据显示所述第一播放端UI;
    所述第四电子设备根据所述第一播放端界面描述文件中与第四电子设备的设备类型对应的第二代码的定义使用所述资源文件的资源生成第四播放端UI数据;按照所述第四播放端UI数据显示所述第一播放端UI。
  75. 根据权利要求68-73任意一项所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备读取所述第一播放端界面描述文件中与第三电子设备的设备类型对应的第一代码,按照所述第一代码的定义使用所述资源文件的资源生成第三播放端UI数据;
    所述第一电子设备读取所述第一播放端界面描述文件中与第四电子设备的设备类型对应的第二代码,按照所述第二代码的定义使用所述资源文件的资源生成第四播放端UI数据;所述第四电子设备的设备类型与所述第三电子设备的设备类型不同;
    所述第一电子设备向所述第三电子设备发送所述第三播放端UI数据;
    所述第三电子设备按照所述第三播放端UI数据显示所述第一播放端UI;
    所述第一电子设备向所述第四电子设备发送所述第四播放端UI数据;
    所述第四电子设备按照所述第四播放端UI数据显示所述第一播放端UI。
  76. 根据权利要求68-75任意一项所述的方法,其特征在于,所述方法还包括:
    所述第一电子设备按照所述第一播放端界面描述文件中第三代码的定义生成所述第一播放端UI中第一控件,所述第一控件具备所述第一电子设备的操作系统自定义的控件属性;
    所述第一电子设备的操作系统自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
  77. 根据权利要求76所述的方法,其特征在于,所述布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  78. 一种用户接口界面实现方法,其特征在于,包括:
    显示第一应用的开发界面;所述第一应用的开发界面包括播放端界面描述文件;所述播放端界面描述文件用于对在播放端播放所述第一应用的播放端用户接口界面UI进行界面描述和界面行为定义;
    响应于用户的第一输入,在所述播放端界面描述文件中增加与第一电子设备的设备类型对应的第一代码;
    响应于用户的第二输入,在所述播放端界面描述文件中增加与第二电子设备的设备类型对应的第二代码;所述第一电子设备的设备类型与所述第二电子设备的设备类型不同;
    根据所述播放端界面描述文件生成所述第一应用的应用安装包。
  79. 根据权利要求78所述的方法,其特征在于,所述第一应用的应用安装包还包括资源文件,所述资源文件包括生成所述第一应用的播放端UI使用的资源。
  80. 根据权利要求78或79所述的方法,其特征在于,
    所述播放端界面描述文件中包括定义所述第一播放端UI中第一控件具备所述第一电子设备的操作系统自定义的控件属性的第三代码;
    所述第一电子设备的操作系统自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
  81. 根据权利要求80所述的方法,其特征在于,所述布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
  82. 一种电子设备,其特征在于,包括:
    一个或多个处理器;
    显示屏;
    存储器;
    其中,所述存储器中存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行如权利要求78-81中任意一项所述的方法。
  83. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求78-81中任意一项所述的方法。
  84. 一种计算机可读存储介质,其特征在于,包括计算机指令,所述计算机指令用于对第一应用的第一播放端用户接口界面UI进行界面描述和界面行为定义;其中,所述计算机指令包括与第一电子设备的设备类型对应的第一代码以及与第二电子设备的设备类型对应的第二代码;所述第一电子设备的设备类型与所述第二电子设备的设备类型不同。
  85. 根据权利要求84所述的计算机可读存储介质,其特征在于,所述计算机指令还包括生成所述第一应用的播放端UI使用的资源。
  86. 根据权利要求84或85所述的计算机可读存储介质,其特征在于,所述计算机指令还包括定义所述第一播放端UI中第一控件具备所述第一电子设备的操作系统自定义的控件属性的第三代码;
    所述第一电子设备的操作系统自定义的控件属性包括:视觉属性,布局属性,交互属性,动效属性和软硬件依赖属性中至少一种。
  87. 根据权利要求86所述的计算机可读存储介质,其特征在于,所述布局属性包括:拉伸,隐藏,折行,均分,占比和延伸中至少一种。
PCT/CN2021/108273 2020-08-25 2021-07-23 用户接口界面实现方法及装置 WO2022042162A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21860002.1A EP4191400A4 (en) 2020-08-25 2021-07-23 METHOD AND DEVICE FOR IMPLEMENTING A USER INTERFACE
US18/042,929 US20230325209A1 (en) 2020-08-25 2021-07-23 User Interface Implementation Method and Apparatus

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
CN202010862489 2020-08-25
CN202010862489.9 2020-08-25
CN202011064544.6 2020-09-30
CN202011064544 2020-09-30
CN202011142718 2020-10-22
CN202011141010 2020-10-22
CN202011141010.9 2020-10-22
CN202011142718.6 2020-10-22
CN202011381146 2020-11-30
CN202011381146.7 2020-11-30
CN202011384490 2020-11-30
CN202011384490.1 2020-11-30
CN202011475517.8 2020-12-14
CN202011475517.8A CN114115870A (zh) 2020-08-25 2020-12-14 用户接口界面实现方法及装置

Publications (1)

Publication Number Publication Date
WO2022042162A1 true WO2022042162A1 (zh) 2022-03-03

Family

ID=80352612

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/108273 WO2022042162A1 (zh) 2020-08-25 2021-07-23 用户接口界面实现方法及装置

Country Status (3)

Country Link
US (1) US20230325209A1 (zh)
EP (1) EP4191400A4 (zh)
WO (1) WO2022042162A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114756234A (zh) * 2022-06-13 2022-07-15 中邮消费金融有限公司 基于传统应用和动态配置策略的app开发方法
WO2023184301A1 (zh) * 2022-03-31 2023-10-05 京东方科技集团股份有限公司 触控事件的处理方法及装置、存储介质、电子设备

Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6556217B1 (en) * 2000-06-01 2003-04-29 Nokia Corporation System and method for content adaptation and pagination based on terminal capabilities
US20030160822A1 (en) * 2002-02-22 2003-08-28 Eastman Kodak Company System and method for creating graphical user interfaces
CN101030204A (zh) * 2006-02-27 2007-09-05 株式会社日立制作所 在用户终端设备上生成用户界面的入口服务器和方法
EP1865422A1 (en) * 2006-06-09 2007-12-12 Nextair Corporation Software, methods and apparatus facilitating presentation of a wireless communication device user interface with multi-language support
US20070288858A1 (en) * 2006-06-09 2007-12-13 Mindy Pereira Engine for rendering widgets using platform-specific attributes
US20080028327A1 (en) * 2006-07-27 2008-01-31 Canon Kabushiki Kaisha Information processing apparatus and user interface control method
CN101477460A (zh) * 2008-12-17 2009-07-08 三星电子(中国)研发中心 浏览器应用在手持设备上的制作和定制方法
US8032540B1 (en) * 2004-10-29 2011-10-04 Foundry Networks, Inc. Description-based user interface engine for network management applications
CN102331934A (zh) * 2011-10-21 2012-01-25 广州市久邦数码科技有限公司 一种基于go桌面系统的桌面组件的实现方法
EP2498179A1 (en) * 2011-03-09 2012-09-12 Telefónica, S.A. Method for managing widgets in an electronic device to improve the user experience of the device
US20130227427A1 (en) * 2010-09-15 2013-08-29 Jürg Möckli Method for configuring a graphical user interface
US8694925B1 (en) * 2005-10-05 2014-04-08 Google Inc. Generating customized graphical user interfaces for mobile processing devices
CN104484171A (zh) * 2014-12-11 2015-04-01 深圳市路通网络技术有限公司 终端界面设计系统、方法及相关设备
CN106371850A (zh) * 2016-09-19 2017-02-01 上海葡萄纬度科技有限公司 一种创建可自定义的桌面小组件的方法
US20170091159A1 (en) * 2015-09-25 2017-03-30 Yahoo! Inc. Programmatic native rendering of structured content
CN107104947A (zh) * 2017-03-20 2017-08-29 福建天泉教育科技有限公司 多屏互动方法
US20180081645A1 (en) * 2016-09-16 2018-03-22 Oracle International Corporation Generic-flat structure rest api editor
CN109271162A (zh) * 2018-09-03 2019-01-25 中国建设银行股份有限公司 一种页面生成方法和装置
CN109710258A (zh) * 2018-12-28 2019-05-03 北京金山安全软件有限公司 微信小程序界面生成的方法及装置
CN110377250A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏场景下的触控方法及电子设备
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备
CN110457620A (zh) * 2019-08-15 2019-11-15 深圳乐信软件技术有限公司 一种页面访问的方法、装置、设备及存储介质
CN110908627A (zh) * 2019-10-31 2020-03-24 维沃移动通信有限公司 投屏方法及第一电子设备
CN111124473A (zh) * 2018-10-31 2020-05-08 成都鼎桥通信技术有限公司 一种基于专网终端类型生成apk的方法和装置
CN111399789A (zh) * 2020-02-20 2020-07-10 华为技术有限公司 界面布局方法、装置及系统

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8984448B2 (en) * 2011-10-18 2015-03-17 Blackberry Limited Method of rendering a user interface


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4191400A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023184301A1 (zh) * 2022-03-31 2023-10-05 BOE Technology Group Co., Ltd. Touch event processing method and apparatus, storage medium, and electronic device
CN114756234A (zh) * 2022-06-13 2022-07-15 China Post Consumer Finance Co., Ltd. App development method based on legacy applications and a dynamic configuration policy
CN114756234B (zh) * 2022-06-13 2022-09-30 China Post Consumer Finance Co., Ltd. App development method based on legacy applications and a dynamic configuration policy

Also Published As

Publication number Publication date
US20230325209A1 (en) 2023-10-12
EP4191400A1 (en) 2023-06-07
EP4191400A4 (en) 2024-01-10

Similar Documents

Publication Publication Date Title
US11902377B2 (en) Methods, systems, and computer program products for implementing cross-platform mixed-reality applications with a scripting framework
US11010147B2 (en) Method and apparatus for running mobile device software
US9058193B2 (en) Methods and systems for providing compatibility of applications with multiple versions of an operating system
WO2021129253A1 (zh) Method for displaying multiple windows, electronic device, and system
US8762936B2 (en) Dynamic design-time extensions support in an integrated development environment
WO2021018005A1 (zh) Cross-process communication method, apparatus, and device
CN114115870A (zh) User interface implementation method and apparatus
US20120137211A1 (en) Method and Apparatus for Specifying Mapping Parameters for User Interface Element Presentation in an Application
WO2010091623A1 (zh) Apparatus and method for dynamically generating an application program interface
WO2022042162A1 (zh) User interface implementation method and apparatus
KR20150043333A (ko) 선언형 템플릿을 사용하여 컨트롤을 스탬프 아웃하기 위한 사용자 인터페이스 컨트롤 프레임워크
WO2011101845A1 (en) Modified operating systems allowing mobile devices to accommodate io devices more convenient than their own inherent io devices and methods for generating such systems
US20140143763A1 (en) Method and System to develop operating system agnostic software applications for mobile devices using a virtual machine
WO2023109764A1 (zh) Wallpaper display method and electronic device
US8700802B2 (en) Method and system for providing advertising content suitable for multiple platforms
US20230139886A1 (en) Device control method and device
Zhou et al. Windows Phone 7 programming for Android and iOS developers
CN116340680A (zh) Display device and control method for plug-in object life cycle management
CN110399040B (zh) Multimodal interaction method, client device, server, and system
Zdziarski iPhone SDK application development: Building applications for the AppStore
CN116743908B (zh) Wallpaper display method and related apparatus
EP4216052A1 (en) Method for developing MVVM architecture-based application, and terminal
EP4343533A1 (en) Screen projection method and related apparatus
CN118193152A (zh) Method and apparatus for processing startup tasks
Zucker et al. Beginning Nokia Apps Development: Qt and HTML5 for Symbian and MeeGo

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21860002

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2021860002

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2021860002

Country of ref document: EP

Effective date: 20230302

NENP Non-entry into the national phase

Ref country code: DE