WO2022252804A1 - Method and apparatus for displaying a view control - Google Patents

Method and apparatus for displaying a view control

Info

Publication number
WO2022252804A1
Authority
WO
WIPO (PCT)
Prior art keywords
view control
control
engine
component
view
Prior art date
Application number
PCT/CN2022/085167
Other languages
English (en)
French (fr)
Inventor
陈本智
李世浩
池浩
兰守忍
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022252804A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G06F 8/38 Creation or generation of source code for implementing user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Definitions

  • the embodiments of the present application relate to the field of electronic equipment, and in particular, to a method and device for displaying a view control.
  • the operating system supports the display of various types of view controls, which is conducive to the construction of an OS ecosystem.
  • for example, when operating system A supports displaying a certain view control, a terminal device installed with operating system A can display application pages developed based on that view control, which is beneficial for operating system A to build an OS ecosystem.
  • the Android view control may be a web view (webview) control, a map view (mapview) control, a camera view (cameraview) control, and the like.
  • the solution for the operating system to support displaying view controls is mainly as follows: for each view control, the development framework of the operating system can provide a customized control corresponding to the view control, and the operating system displays the view control through the customized control corresponding to the view control.
  • different view controls correspond to different customized controls.
  • Embodiments of the present application provide a method and device for displaying view controls, which can enable a terminal device to display various view controls, and are universal for different view controls.
  • the embodiment of the present application provides a method for displaying a view control, and the method can be applied to a terminal device.
  • a terminal device includes a first engine, a first component, and a second engine.
  • the method includes: the first engine determines the plug-in attribute of the view control corresponding to the first application, and the front-end document object model node corresponding to the view control.
  • the first component determines the layout information corresponding to the view control according to the front-end document object model node corresponding to the view control.
  • the first component creates a third-party plug-in and texture data corresponding to the view control according to the plug-in attribute of the view control, and registers the texture data corresponding to the view control into the GPU.
  • the first component sends layout information corresponding to the view control to the second engine.
  • the first component displays the texture data corresponding to the view control according to the third-party plug-in corresponding to the view control.
  • the second engine obtains the texture data corresponding to the view control from the GPU, and renders the texture data corresponding to the view control according to the layout information corresponding to the view control.
  • the first application may be an application developed by a developer based on the view control.
  • the view control may include any one of a web page view control, a map view control, a camera view control, a video view control, a text view control, a live view control, an advertisement view control, a screen projection view control, and the like.
  • the view control may be an Android view control, a Hongmeng view control, or a view control developed in other java languages, which is not limited here.
  • the first packaging file of the first application includes the plug-in attribute of the view control corresponding to the first application, and the plug-in attribute of the view control is preconfigured.
  • the developer of the first application may set the plug-in property of the view control as a web page view control or a map view control in the first packaging file of the first application, and preconfigure the plug-in property of the view control.
  • the developer of the first application only needs to set the plug-in (plugin) attribute of the view control in the code of the first application, so that the terminal device can support displaying different view controls.
  • when the terminal device displays different view controls according to this method, there is no need to consider the adaptation and connection of different view controls.
  • the entire implementation process can be automatically completed by the development framework of the operating system of the terminal device, and the developer of the first application may not be aware of it.
  • the difficulty of developing the first application adapted to the operating system of the terminal device is greatly reduced, which is beneficial to the construction of an operating system ecology of the terminal device.
  • the development framework of the operating system of the terminal device can achieve the goal of one-time development and the deployment of various complex view controls, which greatly reduces the development cost.
  • the developer of the first application sets the plug-in (plugin) attribute of the view control in the code of the first application, and the development framework of the operating system automatically matches the corresponding third-party plug-in according to the plug-in (plugin) attribute of the view control set by the developer. It can prevent developers from performing secondary development for different view controls, and can effectively improve development efficiency.
  • the method can also avoid creating a texture cache in the central processing unit (CPU).
  • the second engine can directly obtain the texture data corresponding to the view control from the GPU for rendering, without transferring texture data between the CPU and the GPU. This saves the CPU time spent copying between different texture caches and the time spent uploading textures to the GPU, which greatly improves rendering performance; it also saves the memory occupied by texture data and GPU memory.
  • the step of the first component registering the texture data corresponding to the view control in the GPU may include:
  • the first component generates a texture identifier of the texture data corresponding to the view control, and registers the texture data corresponding to the view control into the GPU according to the texture identifier of the texture data corresponding to the view control.
  • the method further includes: the first component sends the texture identifier of the texture data corresponding to the view control to the second engine.
  • the step of the second engine obtaining the texture data corresponding to the view control from the GPU may include:
  • the second engine acquires the texture data corresponding to the view control from the GPU according to the texture identifier of the texture data corresponding to the view control.
  • the texture identifier of the texture data may be a texture ID, and there is a one-to-one binding relationship between the texture ID and the texture data.
  • the second engine may index the texture data corresponding to the texture ID from the GPU based on the texture ID.
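  • as a purely illustrative sketch (the patent does not give concrete code), the texture identifier mechanism can be expressed in Android-style Java: a GL texture name is generated once, used to back a SurfaceTexture that the third-party plug-in draws into, and recorded in a registry so that the rendering side can look the texture data up by its ID. The registry class below is hypothetical; SurfaceTexture and the OpenGL ES calls are existing Android APIs, and a current GL context is assumed.

```java
import android.graphics.SurfaceTexture;
import android.opengl.GLES20;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical registry sketch: one-to-one binding between a texture ID and texture data.
final class TextureRegistry {
    private final Map<Long, SurfaceTexture> textures = new ConcurrentHashMap<>();

    // Producer side (first component): create a GL texture in the GPU and register it by ID.
    long registerTexture() {
        int[] names = new int[1];
        GLES20.glGenTextures(1, names, 0);          // requires a current GL context
        long textureId = names[0];
        SurfaceTexture surfaceTexture = new SurfaceTexture((int) textureId);
        textures.put(textureId, surfaceTexture);    // the ID is later sent to the second engine
        return textureId;
    }

    // Consumer side (second engine): index the texture data by its ID, no CPU copy involved.
    SurfaceTexture lookup(long textureId) {
        return textures.get(textureId);
    }
}
```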
  • the step of the first component displaying the texture data corresponding to the view control according to the third-party plug-in corresponding to the view control may include:
  • the first component gets a canvas texture object.
  • the first component creates a virtual display controller in a third-party plug-in corresponding to the view control, and creates a corresponding virtual display control according to the canvas in the canvas texture object and the layout information corresponding to the view control.
  • the first component synthesizes the third-party plug-in corresponding to the view control with the virtual display control through the virtual display controller, and creates a corresponding display control.
  • the first component adds the third-party plug-in corresponding to the view control to the container corresponding to the view in the display control through the display control, and sends the texture data corresponding to the view control to the canvas texture object to synthesize on the canvas.
  • the second engine acquires texture data corresponding to the view control from the GPU, and renders the texture data corresponding to the view control according to layout information corresponding to the view control, including:
  • the second engine obtains the canvas from the canvas texture object, and obtains texture data corresponding to the view control from the GPU.
  • the second engine renders the texture data corresponding to the view control according to the layout information corresponding to the view control.
  • the first engine determines the plug-in attribute of the view control corresponding to the first application and the front-end document object model node corresponding to the view control, including:
  • the first engine parses the first packaging file in the first application to obtain the plug-in attribute of the view control corresponding to the first application, the first object corresponding to the view control, and the style and attributes of the first object; and generates the front-end document object model node corresponding to the view control according to the first object corresponding to the view control and the style and attributes of the first object.
  • the first application may be a JS application
  • the first package file may be a JS package
  • the first object may be a JS object.
  • the first component determines the layout information corresponding to the view control according to the front-end document object model node corresponding to the view control, including:
  • the first component connects the front-end document object model node corresponding to the view control to the back-end component node.
  • the first component creates a corresponding element node according to the back-end component node, and creates a corresponding drawing node in the element node.
  • the first component calculates layout information corresponding to the view control at the drawing node.
  • the layout information corresponding to the view control may include the position and size corresponding to the view control.
  • the first component may include: a component module, an element module, a drawing module, a resource registration module, a texture plug-in module, and a texture module.
  • the component module can be used to create a component node corresponding to the view control, and set the style and property of the first object parsed by the first engine to the corresponding component node.
  • the element module can be used to create the element node corresponding to the component node, and mount the element node on the element tree to facilitate the layout calculation of the entire tree and the update of related style attributes.
  • the drawing module can be used to create a drawing node corresponding to the display of the view control, and calculate the layout information corresponding to the view control in the drawing node. After the drawing module calculates the layout information corresponding to the view control, it can create a resource management module.
  • the resource management module can create the texture data corresponding to the view control, generate the texture identifier of the texture data corresponding to the view control, and register the texture data corresponding to the view control into the GPU based on the resource registration module and the texture plug-in module.
  • the texture module can create the corresponding canvas texture (surfaceTexture) object.
  • when the drawing module creates the resource management module, it can match the view control plug-in module and the view control module corresponding to the plug-in attribute of the view control, and, based on an asynchronous callback technique, create the third-party plug-in corresponding to the view control through the matched view control plug-in module and view control module.
  • the view control plug-in module and the view control module may be preset in the first component by the developer of the operating system.
  • for example, a webview control plug-in module and a webview control module corresponding to the webview control may be preset in the first component; when the plug-in attribute of the view control is the webview control, the first component matches the webview control plug-in module and the webview control module.
  • similarly, a mapview control plug-in module and a mapview control module corresponding to the mapview control may also be preset in the first component; when the plug-in attribute of the view control is the mapview control, the first component matches the mapview control plug-in module and the mapview control module.
  • after the drawing module creates the third-party plug-in corresponding to the view control, it can create a virtual display controller in the third-party plug-in based on the surfaceTexture object and the view control plug-in module corresponding to the plug-in attribute of the view control, and create a corresponding virtual display control (virtual display) according to the surface in the surfaceTexture object and the layout information corresponding to the view control.
  • the virtual display controller can synthesize third-party plug-ins and virtual display controls, and create corresponding display controls (presentation).
  • the display control can add a third-party plug-in to the container corresponding to the view in the display control, and send the texture data corresponding to the view control to the surfaceTexture object and synthesize it on the surface.
  • the operating system may be the Hongmeng system
  • the development framework of the operating system may be the JS development framework
  • the first engine may be the JS engine
  • the second engine may be the rendering engine.
  • the rendering engine may be a skia engine.
  • the embodiment of the present application provides a terminal device, which may include a first engine, a first component, and a second engine.
  • the first engine is used to determine the plug-in attribute of the view control corresponding to the first application, and the front-end document object model node corresponding to the view control.
  • the first component is used to determine the layout information corresponding to the view control according to the front-end document object model node corresponding to the view control; create a third-party plug-in and texture data corresponding to the view control according to the plug-in attribute of the view control; and register the texture data corresponding to the view control into the GPU.
  • the first component is also used to send the layout information corresponding to the view control to the second engine; display the texture data corresponding to the view control according to the third-party plug-in corresponding to the view control.
  • the second engine is configured to acquire texture data corresponding to the view control from the GPU, and render the texture data corresponding to the view control according to layout information corresponding to the view control.
  • the first application may be an application developed by a developer based on the view control.
  • the view control may include any one of a web page view control, a map view control, a camera view control, a video view control, a text view control, a live view control, an advertisement view control, a screen projection view control, and the like.
  • the view control may be an Android view control, a Hongmeng view control, or a view control developed in other java languages, which is not limited here.
  • the first packaging file of the first application includes the plug-in attribute of the view control corresponding to the first application, and the plug-in attribute of the view control is preconfigured.
  • the developer of the first application may set the plug-in property of the view control as a web page view control or a map view control in the first packaging file of the first application, and preconfigure the plug-in property of the view control.
  • the first component may be specifically configured to generate a texture identifier of the texture data corresponding to the view control, and register the texture data corresponding to the view control into the GPU according to the texture identifier of the texture data corresponding to the view control.
  • the first component is further configured to send the texture identifier of the texture data corresponding to the view control to the second engine.
  • the second engine may be configured to acquire the texture data corresponding to the view control from the GPU according to the texture identifier of the texture data corresponding to the view control.
  • the first component may be specifically configured to: obtain a canvas texture object; create a virtual display controller in the third-party plug-in corresponding to the view control, and create a corresponding virtual display control according to the canvas in the canvas texture object and the layout information corresponding to the view control; synthesize the third-party plug-in corresponding to the view control with the virtual display control through the virtual display controller, and create a corresponding display control; and, through the display control, add the third-party plug-in corresponding to the view control to the container corresponding to the view in the display control, and send the texture data corresponding to the view control to the canvas texture object to be synthesized on the canvas.
  • the second engine may be specifically configured to obtain the canvas from the canvas texture object, and obtain the texture data corresponding to the view control from the GPU; render the texture data corresponding to the view control according to the layout information corresponding to the view control.
  • the first engine may specifically be used to parse the first packaging file in the first application to obtain the plug-in attribute of the view control corresponding to the first application, the first object corresponding to the view control, and the style and attributes of the first object; and to generate the front-end document object model node corresponding to the view control according to the first object corresponding to the view control and the style and attributes of the first object.
  • the first application may be a JS application
  • the first package file may be a JS package
  • the first object may be a JS object.
  • the first component can specifically be used to connect the front-end document object model node corresponding to the view control to the back-end component node; create a corresponding element node according to the back-end component node, and create a corresponding drawing node in the element node;
  • the layout information corresponding to the view control is calculated at the drawing node.
  • the layout information corresponding to the view control may include the position and size corresponding to the view control.
  • the first component may include: a component module, an element module, a drawing module, a resource registration module, a texture plug-in module, and a texture module.
  • the component module can be used to create a component node corresponding to the view control, and set the style and property of the first object parsed by the first engine to the corresponding component node.
  • the element module can be used to create the element node corresponding to the component node, and mount the element node on the element tree to facilitate the layout calculation of the entire tree and the update of related style attributes.
  • the drawing module can be used to create a drawing node corresponding to the display of the view control, and calculate the layout information corresponding to the view control in the drawing node. After the drawing module calculates the layout information corresponding to the view control, it can create a resource management module.
  • the resource management module can create the texture data corresponding to the view control, generate the texture identifier of the texture data corresponding to the view control, and register the texture data corresponding to the view control into the GPU based on the resource registration module and the texture plug-in module.
  • the texture module can create the corresponding canvas texture (surfaceTexture) object.
  • when the drawing module creates the resource management module, it can match the view control plug-in module and the view control module corresponding to the plug-in attribute of the view control, and, based on an asynchronous callback technique, create the third-party plug-in corresponding to the view control through the matched view control plug-in module and view control module.
  • after the drawing module creates the third-party plug-in corresponding to the view control, it can create a virtual display controller in the third-party plug-in based on the surfaceTexture object and the view control plug-in module corresponding to the plug-in attribute of the view control, and create a corresponding virtual display control (virtual display) according to the surface in the surfaceTexture object and the layout information corresponding to the view control.
  • the virtual display controller can synthesize third-party plug-ins and virtual display controls, and create corresponding display controls (presentation).
  • the display control can add a third-party plug-in to the container corresponding to the view in the display control, and send the texture data corresponding to the view control to the surfaceTexture object and synthesize it on the surface.
  • the operating system of the terminal device may be any of a variety of operating systems, and the specific type is not limited here.
  • the first engine, the first component, and the second engine are all codes in the development framework of the operating system of the terminal device.
  • an embodiment of the present application provides an apparatus for displaying a view control, which can be used to implement the method for displaying a view control described in the first aspect.
  • the functions of the device can be realized by hardware, and can also be realized by executing corresponding software by hardware.
  • the hardware or software includes one or more modules or units corresponding to the above functions.
  • the device may include modules such as a first engine, a first component, and a second engine.
  • the apparatus may implement the method described in the first aspect and any possible implementation manner of the first aspect through the first engine, the first component, and the second engine. No more details here.
  • an embodiment of the present application provides an electronic device, including: a processor; a memory; and a computer program; wherein the computer program is stored in the memory, and when the computer program is executed by the processor, the electronic device is caused to implement the method described in the first aspect and any possible implementation manner of the first aspect.
  • an embodiment of the present application provides a computer-readable storage medium, the computer-readable storage medium includes a computer program, and when the computer program is run on an electronic device, the electronic device is caused to implement the method described in the first aspect and any possible implementation manner of the first aspect.
  • the embodiment of the present application also provides a computer program product, including computer readable code; when the computer readable code is run on an electronic device, the electronic device can implement the method described in the first aspect and any possible implementation manner of the first aspect.
  • FIG. 1A is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
  • FIG. 1B is a schematic diagram of the composition of the JS development framework provided by the embodiment of the present application.
  • FIG. 1C is a schematic diagram of the architecture of the operating system of the terminal device provided by the embodiment of the present application.
  • FIG. 2 is a schematic flowchart of a method for displaying a view control provided by an embodiment of the present application
  • FIG. 3 is a schematic structural diagram of the first component 200 provided by the embodiment of the present application.
  • Fig. 4 is a schematic diagram showing the rendering effect of the Android view control provided by the embodiment of the present application.
  • FIG. 5 is a schematic diagram showing another rendering effect of the Android view control provided by the embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • references to "one embodiment” or “some embodiments” or the like in this specification means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • appearances of the phrases “in one embodiment,” “in some embodiments,” “in other embodiments,” etc. in various places in this specification do not necessarily all refer to the same embodiment, but mean “one or more but not all embodiments,” unless specifically stated otherwise.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • the term “connected” includes both direct and indirect connections, unless otherwise stated.
  • the terms “first” and “second” are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Thus, a feature defined with “first” or “second” may explicitly or implicitly include one or more of such features.
  • the operating system supports the display of various types of view controls, which is conducive to the construction of an OS ecosystem.
  • the view control refers to a control that can be responsible for drawing and event processing of a certain display area in the application page.
  • for example, when operating system A supports displaying the Android view (view) control (hereinafter referred to as the Android view control), a terminal device installed with operating system A can display application pages developed based on the Android view control, which is beneficial for operating system A to build an OS ecosystem.
  • the Android view control may include a web view (webview) control, a map view (mapview) control, a camera view (cameraview) control, a text view (textview) control, and the like.
  • the solution for the operating system to support displaying Android view controls is mainly as follows: for each Android view control, the development framework of the operating system can provide a customized control corresponding to the Android view control, and the operating system displays the Android view control through the customized control corresponding to the Android view control.
  • different Android view controls correspond to different customized controls.
  • for example, for the webview control, the development framework of the operating system can develop a customized control corresponding to the webview control, and the operating system can display the webview control through the customized control corresponding to the webview control.
  • an embodiment of the present application provides a method for displaying a view control, which can be applied to a terminal device, and the terminal device is installed with an operating system.
  • when the terminal device is installed with the first application developed by the developer based on the view control, the terminal device may display the view control corresponding to the first application based on the development framework of the operating system, according to the method for displaying a view control provided in the embodiment of the present application.
  • the method can enable the operating system of the terminal device to support the display of various view controls, and is universal for different view controls.
  • in addition, when applications are developed based on the development framework of the operating system, the development efficiency can be higher, which is beneficial to the construction of the OS ecosystem.
  • terminal devices may include mobile phones, large screens (such as smart screens), tablet computers, wearable devices (such as smart watches and smart bracelets), TVs, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), and the like.
  • a specific form of the terminal device is not limited here.
  • FIG. 1A is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
  • the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the processor 110 may include multiple processing units, for example: an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP) , controller, memory, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU), etc.
  • the controller can be the nerve center and command center of the mobile phone.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input/output (general-purpose input/output, GPIO) interface, SIM interface, and/or USB interface, etc.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by executing instructions stored in the internal memory 121 . For example, by running the instructions stored in the internal memory 121, the processor 110 can make the mobile phone implement the method for displaying view controls described in the embodiment of the present application.
  • the internal memory 121 may also include an area for storing programs and an area for storing data.
  • the program storage area may store an operating system, at least one application required by a function (such as the first application described in the embodiment of the present application), and the like.
  • the storage data area can store data created during the use of the mobile phone (such as image data, phone book) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the charging management module 140 is configured to receive a charging input from a charger. While the charging management module 140 is charging the battery 142 , it can also provide power for the mobile phone through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 , and the processor 110 .
  • the power management module 141 can also receive the input of the battery 142 to provide power for the mobile phone.
  • the mobile phone can realize the audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (QLED), etc.
  • the display screen 194 may be used to display a JS page of a JS application.
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the structure shown in FIG. 1A does not constitute a specific limitation on the mobile phone.
  • the mobile phone may also include more or fewer components than those shown in FIG. 1A , or combine certain components, or separate certain components, or arrange different components, etc.
  • some components shown in Fig. 1A may be implemented in hardware, software or a combination of software and hardware.
  • when the terminal device is another device such as a large screen (for example, a smart screen), a tablet computer, a wearable device (for example, a smart watch or a smart bracelet), a TV, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), components may be added or reduced on the basis of the structure shown in FIG. 1A, which will not be repeated here.
  • the view control may be an Android view control, a Hongmeng view control, or other view controls (such as other view controls developed in java language), etc.
  • the operating system in the method of the embodiment of the present application may be any of a variety of operating systems, and the specific type of view control that it supports displaying is not limited.
  • the operating system can display different Android view controls through this method.
  • the operating system can be any of a variety of operating systems, and the specific type of the operating system is not limited here.
  • the following takes the Hongmeng system as an example of the operating system, and combined with the scene where the Hongmeng system displays the Android view control, the method for displaying the view control provided by the embodiment of the present application is exemplarily described.
  • the Hongmeng system uses JavaScript as the development language, and its core development framework is the JS development framework.
  • the JS development framework may include a JS engine and a rendering engine.
  • a first component can be developed in the JS development framework, and the first component can cooperate with the JS engine and the rendering engine to implement the method for displaying the view control.
  • the first component may be code capable of implementing the corresponding functions described in the embodiments of the present application. That is, in this embodiment of the application, the terminal device may include a JS engine, a first component, and a rendering engine, and the terminal device may implement the method for displaying a view control described in the embodiment of this application through the JS engine, the first component, and the rendering engine.
  • a JS application refers to an application developed using the JavaScript language.
  • the JS application may provide a JS page, and the JS page is displayed by calling the Android view control, that is, the JS page is a display page of the Android view control.
  • JS bundle is also called JS package file.
  • JS bundle refers to a package that includes hypertext markup language (hypertext markup language, HTML) files, cascading style sheets (cascading style sheets, CSS) files and JS files in a JS application. You can use a packaging tool to package HTML files, CSS files, and JS files to get a JS bundle.
  • for example, suppose the HTML file, CSS file, and JS file in the JS application are as follows:
  • the HTML file, CSS file, JS file, and JS bundle above only provide partial code as examples.
  • JS object: in JavaScript, all data other than strings, numbers, booleans, null, and undefined values are JS objects; for example, a JS object can be an array, a date, or even a function.
  • VDOM (virtual document object model) nodes are virtual/simulated DOM (document object model) nodes.
  • DOM nodes can be used to connect to back-end component nodes to realize the capabilities of front-end components.
  • the component node is a declarative description of the underlying UI component, describing the properties and styles of the UI component. For example: the component node describes each front-end DOM node in each JS page, including the styles and properties of the JS object corresponding to the front-end DOM node. The component node can be used to generate the entity element of the component.
  • the element node is an instance of the component node, representing a specific component node.
  • the render node is also called a drawing node.
  • the render node is used to calculate the rendering data of each element node.
  • the rendering data of the element node may include the position, size/dimension, drawing command, etc. of the element node in the JS page.
  • Virtual display control (virtual display): a control that captures the content displayed on a screen.
  • Display control (presentation): a special dialog box displayed on a specified display screen, used to display content on an auxiliary screen.
  • Canvas texture (surfaceTexture): a canvas (surface) that can convert texture data into an external texture; the surface will not be displayed directly.
  • Context: the information of an application's environment.
  • FIG. 1B is a schematic composition diagram of the JS development framework provided by the embodiment of the present application.
  • the JS development framework may include: a JS engine 100 , a first component 200 , and a rendering engine 300 .
  • the terminal device when the terminal device installs the JS application developed by the developer based on the Android view control, the terminal device can load the JS application through the JS development framework.
  • the terminal device may provide an icon of the JS application, and the user may click the icon of the JS application.
  • the terminal device may load the JS application through the JS development framework in response to the user clicking the icon of the JS application.
  • the JS engine 100 can analyze the JS package (bundle) in the JS application to determine the plug-in (plugin) attribute of the Android view control corresponding to the JS application and the front-end DOM node corresponding to the Android view control.
  • the first component 200 can determine the layout information corresponding to the Android view control according to the front-end DOM node corresponding to the Android view control, create a third-party plug-in and texture data corresponding to the Android view control according to the plugin attribute of the Android view control, and register the texture data corresponding to the Android view control into the GPU.
  • the first component 200 can display the texture data corresponding to the Android view control according to the third-party plug-in corresponding to the Android view control.
  • the rendering engine 300 may acquire texture data corresponding to the Android view control from the GPU, and render the texture data corresponding to the Android view control according to the layout information corresponding to the Android view control determined by the first component 200 .
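  • the cooperation shown in FIG. 1B can be summarized, purely as an illustrative sketch, with the hypothetical interfaces below (none of these names appear in the patent); the sketch only shows which data each part produces and consumes.

```java
// Hypothetical end-to-end sketch of the FIG. 1B pipeline.
interface JsEngine {
    ParseResult parse(byte[] jsBundle);               // plugin attribute + front-end DOM node
}

interface FirstComponent {
    Layout computeLayout(Object frontEndDomNode);             // position and size
    long createPluginAndTexture(String pluginAttribute);      // returns the GPU texture ID
    void drawIntoTexture(long textureId);                      // plug-in draws into the texture
}

interface RenderEngine {
    void render(long textureId, Layout layout);        // samples the texture directly from the GPU
}

record ParseResult(String pluginAttribute, Object frontEndDomNode) {}
record Layout(int x, int y, int width, int height) {}

final class DisplayViewControlPipeline {
    void display(JsEngine js, FirstComponent component, RenderEngine renderer, byte[] bundle) {
        ParseResult parsed = js.parse(bundle);
        Layout layout = component.computeLayout(parsed.frontEndDomNode());
        long textureId = component.createPluginAndTexture(parsed.pluginAttribute());
        component.drawIntoTexture(textureId);   // e.g. webview content ends up in the texture
        renderer.render(textureId, layout);     // no CPU-GPU texture copy is required
    }
}
```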
  • FIG. 1C is a schematic structural diagram of an operating system of a terminal device provided in an embodiment of the present application.
  • the operating system of the terminal device may include an application program layer (applications), an application program framework layer (application framework), a system runtime library layer (libraries), and a kernel layer.
  • the application layer includes multiple applications, such as: the application layer can include system applications such as e-mail, SMS, calendar, map, browser, and contact management, as well as third-party applications designed and written by developers using the Java language (such as JS applications).
  • the developer can pre-configure the plugin attribute of the Android view control in the JS bundle of the JS application when developing the application.
  • the JS engine 100 , the first component 200 , and the rendering engine 300 shown in FIG. 1B above can be deployed at the application framework layer or the system runtime layer shown in FIG. 1C , without limitation here.
  • FIG. 2 is a schematic flowchart of a method for displaying a view control provided by an embodiment of the present application. As shown in FIG. 2, the method for displaying a view control may include S201-S210.
  • the JS engine 100 parses the JS bundle in the JS application developed by the developer to obtain the plugin attribute of the Android view control, the JS object corresponding to the Android view control, and the styles and attributes of the JS object.
  • the JS application developed by the developer is an application developed by the developer based on the Android view control.
  • when a developer develops a JS application, the developer can preset (preconfigure) the plug-in (plugin) attribute of the Android view control in the code of the HTML file of the JS application.
  • the plug-in (plugin) attribute of the Android view control can be pre-configured as a webview control, a mapview control, a cameraview control, a textview control, and the like.
  • the JS engine 100 can load the JS bundle in the JS application developed by the developer into the JS development framework, parse the JS bundle, and obtain the JS object corresponding to the Android view control, the style and attributes of the JS object, and the plug-in (plugin) attribute of the Android view control.
  • the JS application developed by the developer may be an application developed based on the webview control.
  • the JS engine 100 analyzes the JS bundle in the JS application, and can obtain the plug-in (plugin) attribute of the Android view control as the webview control, and at the same time, can obtain the JS object corresponding to the webview control and the styles and attributes of the JS object.
  • the JS engine 100 can analyze and obtain that the plugin (plugin) attribute of the Android view control in the JS application is: webview control.
  • the JS engine 100 can analyze and obtain that the plugin (plugin) attribute of the Android view control in the JS application is: mapview control.
  • the JS engine 100 generates a front-end document object model (document object model, DOM) node corresponding to the Android view control according to the JS object corresponding to the Android view control and the styles and attributes of the JS object.
  • the step of the JS engine 100 generating the front-end DOM node corresponding to the Android view control according to the JS object corresponding to the Android view control and the styles and attributes of the JS object may include: the JS engine 100 generates a tree structure of VDOM nodes (VDOM tree for short) according to the JS object corresponding to the Android view control, where the VDOM tree includes a plurality of VDOM nodes.
  • the JS engine 100 sets the style and attributes of the JS object to the corresponding VDOM node, and performs data binding on each VDOM node on the VDOM tree. While generating the VDOM nodes, the JS engine 100 synchronously creates the corresponding front-end DOM nodes.
  • the VDOM tree is also a JS object tree that simulates a DOM tree.
  • Data binding refers to the process of establishing a connection between the application UI interface and the data source. When the data of each VDOM node changes, the corresponding UI node will automatically draw and update.
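  • as a purely illustrative sketch (the class and method names below are hypothetical and not taken from the patent), the data binding described above can be modeled as an observer relationship: when a VDOM node's bound data changes, a listener triggers the corresponding UI node to redraw and update.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical VDOM node sketch: attribute changes trigger an update of the bound UI node.
class VdomNode {
    interface UpdateListener { void onChanged(VdomNode node); }

    private final Map<String, String> attributes = new HashMap<>();
    private UpdateListener listener;     // e.g. the corresponding front-end DOM / UI node

    void bind(UpdateListener uiNode) {
        this.listener = uiNode;
    }

    void setAttribute(String key, String value) {
        attributes.put(key, value);
        if (listener != null) {
            listener.onChanged(this);    // data changed -> UI node redraws automatically
        }
    }
}
```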
  • the first component 200 can sequentially create the corresponding back-end component node, element (element) node, and rendering (render) node according to the front-end DOM node, so as to complete layout calculation, component style and property updates, and the like.
  • the first component 200 may execute S203-S205.
  • the first component 200 connects the front-end DOM node corresponding to the Android view control to the back-end component node.
  • the first component 200 creates a corresponding element (element) node according to the backend component node, and creates a corresponding rendering (render) node in the element node.
  • the first component 200 calculates the layout information corresponding to the Android view control at the drawing node.
  • the layout information may include a position and a size
  • the size is the layout size.
  • the first component 200 can calculate the position and size of each DOM node corresponding to the Android view control at the drawing node.
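  • to make the node chain concrete, a minimal sketch is given below; all class names and the layout rule are hypothetical and only illustrate that the render node created for a component/element node holds the computed position and size of the Android view control.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the back-end node chain: component -> element -> render node.
class ComponentNode {
    String plugin;                       // e.g. "webview" or "mapview", taken from the parsed JS object
    Map<String, String> styles = new HashMap<>();
}

class ElementNode {
    final ComponentNode component;
    final RenderNode render = new RenderNode();
    ElementNode(ComponentNode component) { this.component = component; }
}

class RenderNode {
    int x, y;                            // position of the view control in the JS page
    int width, height;                   // layout size of the view control

    // Layout calculation: derive position and size from the parent's constraints.
    void performLayout(int parentWidth, int parentHeight) {
        this.x = 0;
        this.y = 0;
        this.width = parentWidth;        // simplistic rule: fill the parent area
        this.height = parentHeight;
    }
}
```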
  • the first component 200 creates a third-party plug-in corresponding to the Android view control according to the plugin attribute of the Android view control, and creates texture data corresponding to the Android view control.
  • the JS engine 100 can analyze and obtain that the plugin (plugin) attribute of the Android view control in the JS application is: webview control.
  • when the plugin attribute of the Android view control is the webview control, the third-party plug-in corresponding to the Android view control created by the first component 200 in S206 may be: a webview plugin.
  • part of the code of the HTML file in the JS application designed (or developed) by the developer is as follows:
  • the JS engine 100 can analyze and obtain that the plugin (plugin) attribute of the Android view control in the JS application is: mapview control.
  • when the plugin attribute of the Android view control is the mapview control, the third-party plug-in corresponding to the Android view control created by the first component 200 in S206 may be: a mapview plugin.
  • the first component 200 generates the texture identifier of the texture data corresponding to the Android view control, and registers the texture data corresponding to the Android view control into a graphics processing unit (GPU) according to the texture identifier of the texture data corresponding to the Android view control.
  • the texture identifier of the texture data corresponding to the Android view control may be a texture ID of the texture data, and the texture ID may represent unique identity information of the texture data.
  • the rendering engine 300 can subsequently index/obtain, from the GPU, the texture data corresponding to the Android view control according to the texture identifier of the texture data corresponding to the Android view control.
  • S205 and S206-S207 may be executed synchronously.
  • S205 may also be performed before or after S206-S207.
  • the first component 200 sends the layout information corresponding to the Android view control and the texture identifier of the texture data corresponding to the Android view control to the rendering engine 300.
  • the first component 200 displays the texture data corresponding to the Android view control according to the third-party plug-in corresponding to the Android view control.
  • the step of the first component 200 displaying the texture data corresponding to the Android view control according to the third-party plug-in corresponding to the Android view control may include: the first component 200 acquires a canvas texture (surfaceTexture) object.
  • the first component 200 creates a virtual display controller (virtual display controller) in the third-party plug-in, and creates a corresponding virtual display control (virtual display) according to the canvas (surface) in the surfaceTexture object and the layout information corresponding to the Android view control.
  • the first component 200 synthesizes the third-party plug-in and the virtual display control through the virtual display controller, and creates a corresponding display control (presentation).
  • the first component 200 adds a third-party plug-in to the container corresponding to the view in the display control through the display control, and sends the texture data corresponding to the Android view control to the surfaceTexture object and synthesizes it on the surface.
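  • the composition step in S209 can be sketched with standard Android APIs (SurfaceTexture, VirtualDisplay, Presentation); the sketch below is illustrative only, and it assumes the third-party plug-in exposes an Android View, which is not stated verbatim in the patent.

```java
import android.app.Presentation;
import android.content.Context;
import android.graphics.SurfaceTexture;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;
import android.view.View;

// Illustrative sketch of S209: render the plug-in's view into the shared texture off screen.
final class VirtualDisplayController {
    private VirtualDisplay virtualDisplay;
    private Presentation presentation;

    void show(Context context, SurfaceTexture surfaceTexture, View pluginView,
              int width, int height) {
        // 1. Wrap the canvas (surface) of the surfaceTexture object; it is never shown directly.
        surfaceTexture.setDefaultBufferSize(width, height);
        Surface surface = new Surface(surfaceTexture);

        // 2. Create the virtual display control from the surface and the layout information.
        DisplayManager dm = (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        virtualDisplay = dm.createVirtualDisplay("view-control", width, height,
                context.getResources().getDisplayMetrics().densityDpi, surface,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY);

        // 3. Synthesize the plug-in with the virtual display: a Presentation is a dialog shown
        //    on the virtual display, so everything it draws ends up in the shared texture.
        presentation = new Presentation(context, virtualDisplay.getDisplay());
        presentation.setContentView(pluginView);   // add the third-party plug-in to the container
        presentation.show();                       // drawing now flows into the surfaceTexture
    }
}
```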
  • FIG. 3 is a schematic structural diagram of a first component 200 provided in an embodiment of the present application.
  • the process described in S203-S207 and S209 will be specifically described below in conjunction with the structure shown in FIG. 3 .
  • the first component 200 may include:
  • the component module 210, which can also be called the NativeTextureComponent module, can be used to create the component node corresponding to the Android view control, and to set the style and attributes of the JS object parsed from the JS bundle by the JS engine 100 to the corresponding component node.
  • the element module 220, which can also be called the NativeTextureElement module, can be used to create the element node corresponding to the component node, and to mount the element node on the element tree, so as to facilitate the layout calculation of the whole tree and the update of related style attributes.
  • the drawing module 230, which can also be called the NativeTextureRender module, can be used to create the render node corresponding to the Android view control, and to calculate the layout information corresponding to the Android view control at the render node. After the drawing module 230 calculates the layout information corresponding to the Android view control, it can create a resource management module 231.
  • the resource management module 231 can also be called a NativeDelegate module or a resource manager.
  • the first component 200 also includes: a resource registration module 240 , a texture plug-in module 250 , and a texture module 260 .
  • the resource registration module may also be called a ResourceRegister module
  • the texture plug-in module may also be called a TexturePlugin module
  • the texture module may also be called a Texture module.
  • the resource management module 231 can create texture data corresponding to the Android view control, generate a texture identifier of the texture data corresponding to the Android view control, and register the texture data corresponding to the Android view control in the GPU based on the resource registration module 240 and the texture plug-in module 250 .
  • Texture module 260 may create a corresponding canvas texture (surfaceTexture) object.
  • when the drawing module 230 creates the resource management module 231, it can match, from the first component 200, the Android view control plug-in module and the Android view control module corresponding to the plug-in (plugin) attribute of the Android view control obtained by the JS engine 100 in S201, and, based on an asynchronous callback technique, create the third-party plug-in corresponding to the Android view control through the matched Android view control plug-in module and Android view control module.
  • the Android view control plug-in module and the Android view control module may be preset in the first component 200 by the developer of the operating system.
  • For example, a webview control plug-in module and a webview control module corresponding to the webview control may be preset in the first component 200. When the plug-in (plugin) attribute of the Android view control obtained through parsing by the JS engine 100 in S201 is the webview control, the drawing module 230 can match from the first component 200 the webview control plug-in module (WebviewPlugin module) and the webview control module (Webview module) corresponding to the plug-in (plugin) attribute of the Android view control.
  • For another example, a mapview control plug-in module (mapviewPlugin module) and a mapview control module (mapview module) corresponding to the mapview control may also be preset in the first component 200. When the plug-in (plugin) attribute of the Android view control obtained through parsing by the JS engine 100 in S201 is the mapview control, the drawing module 230 can match from the first component 200 the mapview control plug-in module and the mapview control module corresponding to the plug-in (plugin) attribute of the Android view control.
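The matching described here behaves like a lookup in a registry keyed by the value of the plugin attribute. A minimal, hedged sketch follows; the registry and factory interface are assumptions made for illustration, and SomeMapView is a hypothetical placeholder for whatever map SDK view a real mapview control module would wrap:

```java
import android.content.Context;
import android.view.View;
import java.util.HashMap;
import java.util.Map;

// Minimal sketch: preset plug-in modules are registered under the value of the plugin
// attribute ("webview", "mapview", ...) and looked up when the drawing module needs to
// create the third-party plug-in. Names here are illustrative only.
interface ViewControlPluginFactory {
    View createView(Context context);   // the native view wrapped by the plug-in
}

final class PluginRegistry {
    private final Map<String, ViewControlPluginFactory> factories = new HashMap<>();

    void register(String pluginAttribute, ViewControlPluginFactory factory) {
        factories.put(pluginAttribute, factory);
    }

    View createPluginView(String pluginAttribute, Context context) {
        ViewControlPluginFactory factory = factories.get(pluginAttribute);
        if (factory == null) {
            throw new IllegalArgumentException("no plug-in preset for: " + pluginAttribute);
        }
        return factory.createView(context);
    }
}

// Example presets (SomeMapView is a placeholder for a real map SDK view):
//   registry.register("webview", android.webkit.WebView::new);
//   registry.register("mapview", context -> new SomeMapView(context));
```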
  • After the drawing module 230 creates the third-party plug-in corresponding to the Android view control, it can create, in the third-party plug-in, a virtual display controller (virtual display controller) based on the surfaceTexture object and the Android view control plug-in module corresponding to the plug-in (plugin) attribute of the Android view control, and create a corresponding virtual display control (virtual display) according to the surface in the surfaceTexture object and the layout information corresponding to the Android view control. The virtual display controller synthesizes the third-party plug-in and the virtual display control, and creates a corresponding display control (presentation). The display control can add the third-party plug-in to the container corresponding to the view in the display control, and send the texture data corresponding to the Android view control to the surfaceTexture object to be synthesized onto the surface.
  • S210: the rendering engine 300 obtains the texture data corresponding to the Android view control from the GPU according to the texture identifier of the texture data corresponding to the Android view control, and renders the texture data corresponding to the Android view control according to the layout information corresponding to the Android view control.
  • Optionally, this step may include: the rendering engine 300 obtains the surface from the surfaceTexture object, and obtains the texture data corresponding to the Android view control from the GPU according to the texture identifier of the texture data corresponding to the Android view control; the rendering engine 300 then renders the texture data corresponding to the Android view control according to the layout information corresponding to the Android view control.
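On the rendering-engine side, consuming the registered texture usually means latching the most recent frame into the external texture and then sampling it as an external texture at the position and size given by the layout information. A minimal sketch of that consumer step is shown below, assuming a GL-based rendering engine; only the SurfaceTexture calls and the GLSL sampler declaration are standard APIs, the surrounding class is illustrative:

```java
import android.graphics.SurfaceTexture;

// Minimal sketch of the consumer side: before drawing, latch the newest frame produced by the
// view control into the external texture, then sample it with a samplerExternalOES uniform.
// Assumes this runs on the thread that owns the GL context the texture was created on.
final class ExternalTextureConsumer {
    private final SurfaceTexture surfaceTexture;
    private final float[] transform = new float[16];    // texture coordinate transform

    ExternalTextureConsumer(SurfaceTexture surfaceTexture) {
        this.surfaceTexture = surfaceTexture;
    }

    void latchFrame() {
        surfaceTexture.updateTexImage();                 // make the latest frame current
        surfaceTexture.getTransformMatrix(transform);    // apply when sampling the texture
    }

    // Fragment shader used to sample the external texture at the layout position and size.
    static final String FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n"
            + "precision mediump float;\n"
            + "uniform samplerExternalOES uTexture;\n"
            + "varying vec2 vTexCoord;\n"
            + "void main() { gl_FragColor = texture2D(uTexture, vTexCoord); }\n";
}
```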
  • In the embodiments of the present application, when the developer sets the plug-in (plugin) attribute to the webview control in the JS application, the JS engine 100 parses the JS package (bundle) in the JS application developed by the developer, and can obtain that the plug-in (plugin) attribute of the Android view control is the webview control, together with the JS object corresponding to the webview control and the styles and attributes of the JS object. Then, the JS engine 100 can generate the front-end DOM node corresponding to the webview control according to the JS object corresponding to the webview control and the styles and attributes of the JS object.
  • the first component 200 can connect the front-end DOM node corresponding to the webview control to the back-end component node, create a corresponding element node according to the back-end component node, and create a corresponding render node in the element node. Then, the first component 200 can calculate the layout information corresponding to the webview control at the render node. In addition, the first component 200 can create a webview plug-in and texture data corresponding to the webview control. After creating the texture data corresponding to the webview control, the first component 200 may generate a texture identifier of the texture data corresponding to the webview control, and register the texture data corresponding to the webview control into the GPU according to the texture identifier of the texture data corresponding to the webview control.
  • the first component 200 may send the layout information corresponding to the webview control and the texture identifier of the texture data corresponding to the webview control to the rendering engine 300 .
  • the first component 200 can also obtain a canvas texture (surfaceTexture) object, create a virtual display controller (virtual display controller) in the webview plug-in, and create a corresponding virtual display control (virtual display) according to the canvas (surface) in the surfaceTexture object and the layout information corresponding to the webview control.
  • the first component 200 can synthesize the webview plug-in and the virtual display control through the virtual display controller, and create a corresponding display control (presentation).
  • the first component 200 can add the webview plug-in to the container corresponding to the view in the display control through the display control, and send the texture data corresponding to the webview control to the surfaceTexture object to synthesize on the surface.
  • the rendering engine 300 can obtain the texture data corresponding to the webview control from the GPU according to the texture identifier of the texture data corresponding to the webview control, and render the texture data corresponding to the webview control according to the layout information corresponding to the webview control. Therefore, the JS framework can support displaying the webview control.
  • FIG. 4 is a schematic diagram showing a rendering effect of an Android view control provided by an embodiment of the present application.
  • As shown in FIG. 4 , when the plugin attribute of the Android view control preset in the code of the HTML file of the JS application designed (or developed) by the developer is the webview control (for the code, refer to the description in the foregoing embodiments), the method described in the embodiments of the present application can render a web page similar to that shown in FIG. 4 . It should be understood that when different JS applications, or the same JS application, request different web page addresses, the page content of the web page shown in FIG. 4 will differ. FIG. 4 is for illustration only.
  • Similarly, when the developer sets the plug-in (plugin) attribute in the JS application to another Android view control (such as a mapview control), the development framework of the operating system can support the display of that Android view control in the manner described in the foregoing embodiments.
  • FIG. 5 is a schematic diagram showing another rendering effect of an Android view control provided by an embodiment of the present application.
  • As shown in FIG. 5 , when the plugin attribute of the Android view control preset in the code of the HTML file of the JS application designed (or developed) by the developer is the mapview control (for the code, refer to the description in the foregoing embodiments), the method described in the embodiments of the present application can render a map page similar to that shown in FIG. 5 . FIG. 5 is likewise for illustration only.
  • In the method for displaying an Android view control provided in the embodiments of the present application, the developer of the JS application only needs to set the plug-in (plugin) attribute of the Android view control in the code of the HTML file of the JS application, so that the operating system side can support displaying different Android view controls. When the operating system side displays different Android view controls according to the method for displaying Android view controls provided in the embodiments of the present application, it does not need to consider the adaptation and interfacing of different Android view controls; the entire implementation process can be completed automatically by the development framework of the operating system, and the developers of JS applications need not be aware of it. This greatly reduces the difficulty of developing JS applications that adapt to the operating system, and helps attract more developers to participate in the construction of the OS ecosystem.
  • the development framework on the operating system side can realize the goal of one-time development and the deployment of various complex Android views, greatly reducing development costs.
  • The developer of the JS application sets the plug-in (plugin) attribute of the Android view control in the code of the HTML file of the JS application, and the development framework of the operating system automatically matches the corresponding third-party plug-in according to the plug-in (plugin) attribute set by the developer; this also spares developers from secondary development for different Android view controls, which can effectively improve development efficiency.
  • Further, in the method for displaying an Android view control provided in the embodiments of the present application, registering the texture data corresponding to the Android view control directly with the GPU avoids creating a texture cache in the CPU. The rendering engine can obtain the texture data corresponding to the Android view control directly from the GPU for rendering, so texture data does not need to be transferred between the CPU and the GPU, which saves the CPU's copying time between different texture caches and the time taken for textures to be uploaded to the GPU, greatly improving rendering performance. At the same time, it also saves the memory occupied by the texture data and GPU memory.
  • It should be noted that although the foregoing embodiments are illustrated by taking the webview control or the mapview control as the Android view control, it should be understood that in some other scenarios, the view controls that the operating system can display through the method for displaying view controls provided in the embodiments of the present application may also be other Android view controls such as cameraview controls, textview controls, live view controls, advertisement view controls, and screen projection view controls, or other view controls such as Hongmeng view controls and other view controls developed in the java language. When the view control is of one of these other types, the implementation principle is the same as that of displaying the webview control or the mapview control described in the foregoing embodiments, and the details are not repeated one by one.
  • In addition, it should also be understood that when the operating system is a system other than the Hongmeng TM system, the JS development framework can be replaced with a development framework of another language corresponding to that system, and the JS engine can be an engine that parses code in that other language.
  • the development framework corresponding to the ios TM system may be a Swift development framework, and the Swift development framework may include a Swift engine, a first component, a rendering engine, and the like.
  • the application developed by the developer based on the view control may be called the first application, and the package file corresponding to the first application (such as the JS bundle of the JS application) may be called the first package file.
  • the engine that parses the first application in the development framework of the operating system (such as the JS engine in the Hongmeng system) may be called the first engine, and the rendering engine may be called the second engine.
  • the first engine parses the first packaged file in the first application, and the obtained object (such as a JS object) may be called a first object.
  • an embodiment of the present application further provides a terminal device, and the terminal device may include a first engine, a first component, and a second engine.
  • the first engine is used to determine the plug-in attribute of the view control corresponding to the first application, and the front-end document object model node corresponding to the view control.
  • the first component is used to determine the layout information corresponding to the view control according to the front-end document object model node corresponding to the view control; and to create, according to the plug-in attribute of the view control, a third-party plug-in and texture data corresponding to the view control, and register the texture data corresponding to the view control into the GPU.
  • the first component is also used to send the layout information corresponding to the view control to the second engine; display the texture data corresponding to the view control according to the third-party plug-in corresponding to the view control.
  • the second engine is configured to acquire texture data corresponding to the view control from the GPU, and render the texture data corresponding to the view control according to layout information corresponding to the view control.
  • the first application may be an application developed by a developer based on the view control.
  • the view control may include any one of a web page view control, a map view control, a camera view control, a video view control, a text view control, a live view control, an advertisement view control, a screen projection view control, and the like.
  • the view control may be an Android view control, a Hongmeng view control, or another view control developed in the java language, which is not limited here.
  • the first packaging file of the first application includes the plug-in attribute of the view control corresponding to the first application, and the plug-in attribute of the view control is preconfigured.
  • the developer of the first application may set the plug-in property of the view control as a web page view control or a map view control in the first packaging file of the first application, and preconfigure the plug-in property of the view control.
  • the first component may be specifically configured to generate a texture identifier of the texture data corresponding to the view control, and register the texture data corresponding to the view control into the GPU according to the texture identifier of the texture data corresponding to the view control.
  • the first component is further configured to send the texture identifier of the texture data corresponding to the view control to the second engine.
  • the second engine may be configured to acquire the texture data corresponding to the view control from the GPU according to the texture identifier of the texture data corresponding to the view control.
  • the first component can be specifically used to: obtain a canvas texture object; create a virtual display controller in the third-party plug-in corresponding to the view control, and create a corresponding virtual display control according to the canvas in the canvas texture object and the layout information corresponding to the view control; synthesize the third-party plug-in corresponding to the view control with the virtual display control through the virtual display controller, and create a corresponding display control; and add the third-party plug-in corresponding to the view control to the container corresponding to the view in the display control through the display control, and send the texture data corresponding to the view control to the canvas texture object to be synthesized onto the canvas.
  • the second engine may be specifically configured to obtain the canvas from the canvas texture object, and obtain the texture data corresponding to the view control from the GPU; render the texture data corresponding to the view control according to the layout information corresponding to the view control.
  • the first engine may specifically be used to parse the first packaged file in the first application to obtain the plug-in attribute of the view control corresponding to the first application, the first object corresponding to the view control, and the styles and attributes of the first object; and to generate, according to the first object corresponding to the view control and the styles and attributes of the first object, the front-end document object model node corresponding to the view control.
  • For example, the first application may be a JS application, the first package file may be a JS package, and the first object may be a JS object.
  • the first component can specifically be used to connect the front-end document object model node corresponding to the view control to the back-end component node; create a corresponding element node according to the back-end component node, and create a corresponding drawing node in the element node; and calculate the layout information corresponding to the view control at the drawing node.
  • the layout information corresponding to the view control may include the position and size corresponding to the view control.
  • the first component may include: a component module, an element module, a drawing module, a resource registration module, a texture plug-in module, and a texture module.
  • the component module can be used to create a component node corresponding to the view control, and set the style and property of the first object parsed by the first engine to the corresponding component node.
  • the element module can be used to create the element node corresponding to the component node, and mount the element node on the element tree to facilitate the layout calculation of the entire tree and the update of related style attributes.
  • the drawing module can be used to create a drawing node corresponding to the display of the view control, and calculate the layout information corresponding to the view control in the drawing node. After the drawing module calculates the layout information corresponding to the view control, it can create a resource management module.
  • the resource registration module can create texture data corresponding to the view control, generate a texture identifier of the texture data corresponding to the view, and register the texture data corresponding to the view control into the GPU based on the resource registration module and the texture plug-in module.
  • the texture module can create the corresponding canvas texture (surfaceTexture) object.
  • When the drawing module creates the resource management module, it can, according to the plug-in (plugin) attribute of the view control obtained through parsing by the first engine, match from the first component the view control plug-in module and the view control module corresponding to the plug-in (plugin) attribute of the view control, and, based on the asynchronous callback technique, create the third-party plug-in corresponding to the view control through the view control plug-in module and the view control module corresponding to the plug-in attribute of the view control.
  • After the drawing module creates the third-party plug-in corresponding to the view control, it can create, in the third-party plug-in, a virtual display controller based on the surfaceTexture object and the view control plug-in module corresponding to the plug-in attribute of the view control, and create a corresponding virtual display control (virtual display) according to the surface in the surfaceTexture object and the layout information corresponding to the view control.
  • the virtual display controller can synthesize third-party plug-ins and virtual display controls, and create corresponding display controls (presentation).
  • the display control can add a third-party plug-in to the container corresponding to the view in the display control, and send the texture data corresponding to the view control to the surfaceTexture object and synthesize it on the surface.
  • Exemplarily, the operating system of the terminal device may include any one of various operating systems, for example, the Hongmeng system.
  • the first engine, the first component, and the second engine are all codes in the development framework of the operating system of the terminal device.
  • For example, the first engine may be a JS engine, and the second engine may be a rendering engine.
  • the terminal device may realize all functions of the method for displaying a view control described in the foregoing method embodiments of the present application through the first engine, the first component, and the second engine.
  • the embodiment of the present application further provides an apparatus for displaying a view control, which can be used to implement the method for displaying a view control described in the foregoing method embodiments.
  • the functions of the device can be realized by hardware, and can also be realized by executing corresponding software by hardware.
  • Hardware or software includes one or more modules or units corresponding to the functions described above.
  • the apparatus for displaying view controls may include modules such as a first engine, a first component, and a second engine.
  • the apparatus for displaying view controls can implement all functions of the method for displaying view controls described in the foregoing method embodiments of the present application through modules such as the first engine, the first component, and the second engine.
  • the first engine may be used to determine the plug-in attribute of the view control corresponding to the first application, and the front-end document object model node corresponding to the view control.
  • the first component can be used to determine the layout information corresponding to the view control according to the front-end document object model node corresponding to the view control; and to create, according to the plug-in attribute of the view control, a third-party plug-in and texture data corresponding to the view control, and register the texture data corresponding to the view control into the GPU.
  • the first component can also be used to send the layout information corresponding to the view control to the second engine; display the texture data corresponding to the view control according to the third-party plug-in corresponding to the view control.
  • the second engine may be used to acquire texture data corresponding to the view control from the GPU, and render the texture data corresponding to the view control according to layout information corresponding to the view control.
  • the first application may be an application developed by a developer based on the view control.
  • the view control may include any one of a web page view control, a map view control, a camera view control, a video view control, a text view control, a live view control, an advertisement view control, a screen projection view control, and the like.
  • the view control may be an Android view control, a Hongmeng view control, or another view control developed in the java language, which is not limited here.
  • the first packaging file of the first application includes the plug-in attribute of the view control corresponding to the first application, and the plug-in attribute of the view control is preconfigured.
  • the developer of the first application may set the plug-in property of the view control as a web page view control or a map view control in the first packaging file of the first application, and preconfigure the plug-in property of the view control.
  • the first component may be specifically configured to generate a texture identifier of the texture data corresponding to the view control, and register the texture data corresponding to the view control into the GPU according to the texture identifier of the texture data corresponding to the view control.
  • the first component is further configured to send the texture identifier of the texture data corresponding to the view control to the second engine.
  • the second engine may be configured to acquire the texture data corresponding to the view control from the GPU according to the texture identifier of the texture data corresponding to the view control.
  • the first component can be specifically used to: obtain a canvas texture object; create a virtual display controller in the third-party plug-in corresponding to the view control, and create a corresponding virtual display control according to the canvas in the canvas texture object and the layout information corresponding to the view control; synthesize the third-party plug-in corresponding to the view control with the virtual display control through the virtual display controller, and create a corresponding display control; and add the third-party plug-in corresponding to the view control to the container corresponding to the view in the display control through the display control, and send the texture data corresponding to the view control to the canvas texture object to be synthesized onto the canvas.
  • the second engine may be specifically configured to obtain the canvas from the canvas texture object, and obtain the texture data corresponding to the view control from the GPU; render the texture data corresponding to the view control according to the layout information corresponding to the view control.
  • the first engine may specifically be used to parse the first packaged file in the first application to obtain the plug-in attribute of the view control corresponding to the first application, the first object corresponding to the view control, and the styles and attributes of the first object; and to generate, according to the first object corresponding to the view control and the styles and attributes of the first object, the front-end document object model node corresponding to the view control.
  • the first component can specifically be used to connect the front-end document object model node corresponding to the view control to the back-end component node; create a corresponding element node according to the back-end component node, and create a corresponding drawing node in the element node; and calculate the layout information corresponding to the view control at the drawing node.
  • the layout information corresponding to the view control may include the position and size corresponding to the view control.
  • the first component may include: a component module, an element module, a drawing module, a resource registration module, a texture plug-in module, and a texture module.
  • the component module can be used to create a component node corresponding to the view control, and set the style and property of the first object parsed by the first engine to the corresponding component node.
  • the element module can be used to create the element node corresponding to the component node, and mount the element node on the element tree to facilitate the layout calculation of the entire tree and the update of related style attributes.
  • the drawing module can be used to create a drawing node corresponding to the display of the view control, and calculate the layout information corresponding to the view control in the drawing node. After the drawing module calculates the layout information corresponding to the view control, it can create a resource management module.
  • the resource registration module can create texture data corresponding to the view control, generate a texture identifier of the texture data corresponding to the view, and register the texture data corresponding to the view control into the GPU based on the resource registration module and the texture plug-in module.
  • the texture module can create the corresponding canvas texture (surfaceTexture) object.
  • When the drawing module creates the resource management module, it can, according to the plug-in (plugin) attribute of the view control obtained through parsing by the first engine, match from the first component the view control plug-in module and the view control module corresponding to the plug-in (plugin) attribute of the view control, and, based on the asynchronous callback technique, create the third-party plug-in corresponding to the view control through the view control plug-in module and the view control module corresponding to the plug-in attribute of the view control.
  • After the drawing module creates the third-party plug-in corresponding to the view control, it can create, in the third-party plug-in, a virtual display controller based on the surfaceTexture object and the view control plug-in module corresponding to the plug-in attribute of the view control, and create a corresponding virtual display control (virtual display) according to the surface in the surfaceTexture object and the layout information corresponding to the view control.
  • the virtual display controller can synthesize third-party plug-ins and virtual display controls, and create corresponding display controls (presentation).
  • the display control can add a third-party plug-in to the container corresponding to the view in the display control, and send the texture data corresponding to the view control to the surfaceTexture object and synthesize it on the surface.
  • Exemplarily, the operating system of the terminal device may include any one of various operating systems, for example, the Hongmeng system.
  • the first engine, the first component, and the second engine are all codes in the development framework of the operating system of the terminal device.
  • It should be noted that the division of units (or modules) in the above apparatus is only a division of logical functions; in actual implementation, they may be fully or partially integrated into one physical entity, or may be physically separate.
  • the units in the device can all be implemented in the form of software called by the processing element; they can also be implemented in the form of hardware; some units can also be implemented in the form of software called by the processing element, and some units can be implemented in the form of hardware.
  • each unit can be a separate processing element, or it can be integrated in a certain chip of the apparatus; in addition, it can also be stored in the memory in the form of a program, which is called by a certain processing element of the apparatus to execute the function of the unit. In addition, all or part of these units can be integrated together, or implemented independently.
  • the processing element described here may also be referred to as a processor, and may be an integrated circuit with a signal processing capability. In the process of implementation, each step of the above method or each unit above may be implemented by an integrated logic circuit of hardware in the processor element or implemented in the form of software called by the processing element.
  • the units in the above device may be one or more integrated circuits configured to implement the above method, for example: one or more application specific integrated circuits (ASIC), or one or more digital signal processors (DSP), or one or more field programmable gate arrays (FPGA), or a combination of at least two of these forms of integrated circuits.
  • the processing element can be a general-purpose processor, such as a central processing unit (central processing unit, CPU) or other processors that can call programs.
  • these units can be integrated together and implemented in the form of a system-on-a-chip (SOC).
  • In another implementation, the units of the above apparatus that implement each corresponding step in the above method may be implemented in the form of a processing element scheduling a program.
  • the apparatus may include a processing element and a storage element, and the processing element invokes a program stored in the storage element to execute the methods described in the above method embodiments.
  • the storage element may be a storage element on the same chip as the processing element, that is, an on-chip storage element.
  • the program for executing the above method may be stored in a storage element on a different chip from the processing element, that is, an off-chip storage element.
  • the processing element invokes or loads a program from the off-chip storage element to the on-chip storage element, so as to invoke and execute the methods described in the above method embodiments.
  • FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device may include: a processor 601; a memory 602; and a computer program; wherein the computer program is stored in the memory 602, and when the computer program is executed by the processor 601, the electronic device is caused to implement the method for displaying a view control as described in the foregoing embodiments.
  • the memory 602 may be located inside the electronic device, or outside the electronic device.
  • the processor 601 may include one or more processors.
  • the electronic device may be a mobile phone, a tablet computer, a wearable device (such as a smart watch, a smart bracelet, etc.), a vehicle device, an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device , notebook computer, ultra-mobile personal computer (ultra-mobile personal computer, UMPC), netbook, personal digital assistant (personal digital assistant, PDA), etc.
  • the units of the device implementing each step in the above method may be configured as one or more processing elements, where the processing elements may be integrated circuits, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits can be integrated together to form a chip.
  • an embodiment of the present application further provides a chip, and the chip can be applied to the above-mentioned electronic device.
  • the chip includes one or more interface circuits and one or more processors; the interface circuits and the processors are interconnected through lines; the processor receives and executes computer instructions from the memory of the electronic device through the interface circuits, so as to implement the method for displaying a view control described in the foregoing embodiments.
  • An embodiment of the present application further provides a computer program product, including computer readable codes, and when the computer readable codes are run in the electronic device, the electronic device implements the method for displaying a view control as described in the foregoing embodiments.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • The software product is stored in a program product, such as a computer-readable storage medium, and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor (processor) to execute all or some of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage medium includes: various media capable of storing program codes such as U disk, mobile hard disk, ROM, RAM, magnetic disk or optical disk.
  • an embodiment of the present application may also provide a computer-readable storage medium, the computer-readable storage medium includes a computer program, and when the computer program is run on an electronic device, the electronic device is caused to implement the method for displaying a view control as described in the foregoing embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and an apparatus for displaying a view control, relating to the field of electronic devices. The method can be applied to a terminal device, and the terminal device includes a first engine, a first component, and a second engine. The method includes: the first engine determines the plug-in attribute of the view control corresponding to a first application and the front-end DOM node corresponding to the view control; the first component determines, according to the front-end DOM node, the layout information corresponding to the view control, creates, according to the plug-in attribute, a third-party plug-in and texture data corresponding to the view control, and registers the texture data with the GPU; the first component sends the layout information to the second engine; the first component displays the texture data according to the third-party plug-in; the second engine obtains the texture data from the GPU and renders the texture data according to the layout information. Through this method, the terminal device can display a variety of different view controls, and the method is universal for different view controls.

Description

显示视图控件的方法及装置
本申请要求于2021年5月31日提交国家知识产权局、申请号为202110602158.6、申请名称为“显示视图控件的方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及电子设备领域,尤其涉及一种显示视图控件的方法及装置。
背景技术
操作系统(Operation System,OS)支持显示各种类型的视图控件,有利于OS生态的构建。例如,对操作系统A而言,当操作系统A支持显示安卓(android)的视图(view)控件(以下简称安卓view控件)时,安装操作系统A的终端设备可以支持显示基于安卓view控件开发的应用页面,有利于操作系统A构建OS生态。其中,安卓view控件可以是网页视图(webview)控件、地图视图(mapview)控件、相机视图(cameraview)控件等。
目前,实现操作系统支持显示视图控件的方案主要为:针对每一种视图控件,操作系统的开发框架可以提供一个与该视图控件对应的定制化控件,操作系统可以通过与该视图控件对应的定制化控件显示该视图控件。其中,不同的视图控件对应不同的定制化控件。
但是,上述方案中,视图控件的接口众多,导致在操作系统的开发框架中实现与视图控件对应的定制化控件是非常复杂的,开发工作量极大。另外,一旦操作系统有支持新的视图控件的需求,则操作系统的开发框架需要重新定制开发,这将严重影响开发效率,不利于OS生态的发展。
发明内容
本申请实施例提供一种显示视图控件的方法及装置,可以使得终端设备能够显示各种视图控件,对不同的视图控件具有通用性。
第一方面,本申请实施例提供一种显示视图控件的方法,该方法可以应用于终端设备。终端设备包括第一引擎、第一组件、以及第二引擎。
该方法包括:第一引擎确定第一应用对应的视图控件的插件属性、以及视图控件对应的前端文档对象模型节点。第一组件根据视图控件对应的前端文档对象模型节点,确定视图控件对应的布局信息。第一组件根据视图控件的插件属性,创建视图控件对应的第三方插件和纹理数据,并将视图控件对应的纹理数据注册到GPU中。第一组件向第二引擎发送视图控件对应的布局信息。第一组件根据视图控件对应的第三方插件,显示视图控件对应的纹理数据。第二引擎从GPU获取视图控件对应的纹理数据,并根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染。
其中,第一应用可以为开发人员基于视图控件开发的应用。
可选地,视图控件可以包括网页视图控件、地图视图控件、相机视图控件、视频视图控件、文本视图控件、直播视图控件、广告视图控件、投屏视图控件等中的任意一种。
一些实施例中,视图控件可以是安卓视图控件、或者鸿蒙视图控件、又或者其他java语言开发的视图控件,在此不作限制。
第一应用的第一打包文件包括第一应用对应的视图控件的插件属性,视图控件的插件属性是预配置的。例如,第一应用的开发人员可以在第一应用的第一打包文件中设置视图控件的插件属性为网页视图控件或地图视图控件,对视图控件的插件属性进行预配置。
该方法中,第一应用的开发人员只需要在第一应用的代码中设置视图控件的插件(plugin)属性,即可使得终端设备可以实现支持显示不同的视图控件。终端设备按照该方法来显示不同视图控件时,不需要考虑不同视图控件的适配对接问题,整个实现流程可以由终端设备的操作系统的开发框架自动完成,第一应用的开发人员可以不感知,大大降低了适配终端设备的操作系统的第一应用的开发难度,有利于终端设备的操作系统生态的构建。
另外,终端设备的操作系统的开发框架可以实现一次开发,可部署各种复杂的视图控件的目标,大大降低开发成本。第一应用的开发人员在第一应用的代码中设置视图控件的插件(plugin)属性,操作系统的开发框架自动根据开发人员设置的视图控件的插件(plugin)属性匹配对应的第三方插件,还可以避免开发人员针对不同的视图控件进行二次开发,能够有效提高开发效率。
进一步地,该方法通过将视图控件对应的纹理数据直接注册到GPU中,还可以避免在中央处理器(central processing unit,CPU)中创建纹理缓存。第二引擎可以直接从GPU中获取视图控件对应的纹理数据进行渲染,不需要CPU和GPU之间进行纹理数据的传递,可以节省CPU在不同纹理缓存之间的拷贝时间以及纹理上传到GPU的时间,大大提高渲染性能。同时,还节省了纹理数据占用的内存和GPU显存。
可选地,所述第一组件将视图控件对应的纹理数据注册到GPU中的步骤,可以包括:
第一组件生成视图控件对应的纹理数据的纹理标识,并根据视图控件对应的纹理数据的纹理标识,将视图控件对应的纹理数据注册到GPU中。
该方法还包括:第一组件向第二引擎发送视图控件对应的纹理数据的纹理标识。
所述第二引擎从GPU获取视图控件对应的纹理数据的步骤,可以包括:
第二引擎根据视图控件对应的纹理数据的纹理标识,从GPU获取视图控件对应的纹理数据。
其中,纹理数据的纹理标识可以是纹理ID,纹理ID与纹理数据之间为一一绑定的关系。第二引擎从GPU获取纹理数据时,可以基于纹理ID从GPU中索引纹理ID对应的纹理数据。
可选地,所述第一组件根据视图控件对应的第三方插件,显示视图控件对应的纹理数据的步骤,可以包括:
第一组件获取画布纹理对象。第一组件在视图控件对应的第三方插件中创建虚显控制器,并根据画布纹理对象中的画布、以及视图控件对应的布局信息创建对应的虚显控件。第一组件通过虚显控制器将视图控件对应的第三方插件与虚显控件进行合成,并创建对应的显示控件。第一组件通过显示控件将视图控件对应的第三方插件添加到 显示控件中的视图对应的容器上,并将视图控件对应的纹理数据送至画布纹理对象中合成到画布上。
可选地,所述第二引擎从GPU获取视图控件对应的纹理数据,并根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染,包括:
第二引擎从画布纹理对象中获取画布,并从GPU获取视图控件对应的纹理数据。第二引擎根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染。
可选地,所述第一引擎确定第一应用对应的视图控件的插件属性、以及视图控件对应的前端文档对象模型节点,包括:
第一引擎对第一应用中的第一打包文件进行解析,得到第一应用对应的视图控件的插件属性、以及视图控件对应的第一对象及第一对象的样式和属性;第一引擎根据视图控件对应的第一对象及第一对象的样式和属性,生成视图控件对应的前端文档对象模型节点。
示例性地,第一应用可以为JS应用,第一打包文件可以为JS包。第一对象可以是JS对象。
可选地,所述第一组件根据视图控件对应的前端文档对象模型节点,确定视图控件对应的布局信息,包括:
第一组件将视图控件对应的前端文档对象模型节点对接到后端组件节点。第一组件根据后端组件节点创建对应的元素节点,并在元素节点中创建对应的绘制节点。第一组件在绘制节点计算所述视图控件对应的布局信息。
示例性地,视图控件对应的布局信息可以包括视图控件对应的位置和尺寸。
一些实施例中,第一组件可以包括:组件模块、元素模块、绘制模块、资源注册模块、纹理插件模块、纹理模块。
其中,组件模块可以用于创建视图控件对应的组件节点,将第一引擎解析的第一对象的样式和属性设置给对应的组件节点。元素模块可以用于创建组件节点对应的元素节点,并将元素节点挂载在元素树上,以便于整颗树的布局计算及相关样式属性的更新。绘制模块可以用于创建视图控件对应显示的绘制节点,并在绘制节点计算视图控件对应的布局信息。绘制模块在计算完视图控件对应的布局信息后,可以创建一个资源管理模块。
资源注册模块可以创建视图控件对应的纹理数据,生成视图对应的纹理数据的纹理标识,并基于资源注册模块和纹理插件模块将视图控件对应的纹理数据注册到GPU中。纹理模块可以创建对应的画布纹理(surfaceTexture)对象。
绘制模块在创建资源管理模块的同时,可以根据第一引擎解析得到的视图控件的插件(plugin)属性,从第一组件中匹配出与视图控件的插件(plugin)属性对应的视图控件插件模块和视图控件模块,并基于异步回调技术,通过与视图控件的插件(plugin)属性对应的视图控件插件模块和视图控件模块创建视图控件对应的第三方插件。
其中,视图控件插件模块和视图控件模块可以是由操作系统的开发人员预设在第一组件中的。
例如,第一组件中可以预设有webview控件对应的webview控件插件模块和 webview控件模块,当第一引擎解析得到的视图控件的插件(plugin)属性为webview控件时,绘制模块可以从第一组件中匹配出与webview控件插件模块和webview控件模块。
又例如,第一组件中还可以预设有mapview控件对应的mapview控件插件模块和mapview控件模块,当第一引擎解析得到的视图控件的插件(plugin)属性为mapview控件时,绘制模块可以从第一组件中匹配出mapview控件插件模块和mapview控件模块等。
绘制模块在创建视图控件对应的第三方插件后,可以在第三方插件中,基于surfaceTexture对象、以及与视图控件的插件(plugin)属性对应的视图控件插件模块创建虚显控制器(virtual display controller);并根据surfaceTexture对象中的surface、以及视图控件对应的布局信息创建对应的虚显控件(virtual display)。虚显控制器可以将第三方插件与虚显控件进行合成,并创建对应的显示控件(presentation)。显示控件可以将第三方插件添加到显示控件中的视图对应的容器上,并将视图控件对应的纹理数据送至到surfaceTexture对象中合成到surface上。
示例性地,终端设备的操作系统可以包括
Figure PCTCN2022085167-appb-000001
Figure PCTCN2022085167-appb-000002
等中的任意一种。
例如,一种可能的实现场景中,操作系统可以为鸿蒙系统,操作系统的开发框架为JS开发框架,第一引擎可以为JS引擎,第二引擎可以为渲染引擎。
具体地,渲染引擎可以为skia引擎。
第二方面,本申请实施例提供一种终端设备,可以包括第一引擎、第一组件、以及第二引擎。
其中,第一引擎用于确定第一应用对应的视图控件的插件属性、以及视图控件对应的前端文档对象模型节点。第一组件用于根据视图控件对应的前端文档对象模型节点,确定视图控件对应的布局信息;根据视图控件的插件属性,创建视图控件对应的第三方插件和纹理数据,并将视图控件对应的纹理数据注册到GPU中。第一组件还用于向第二引擎发送视图控件对应的布局信息;根据视图控件对应的第三方插件,显示视图控件对应的纹理数据。第二引擎用于从GPU获取视图控件对应的纹理数据,并根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染。
其中,第一应用可以为开发人员基于视图控件开发的应用。
可选地,视图控件可以包括网页视图控件、地图视图控件、相机视图控件、视频视图控件、文本视图控件、直播视图控件、广告视图控件、投屏视图控件等中的任意一种。
一些实施例中,视图控件可以是安卓视图控件、或者鸿蒙视图控件、又或者其他java语言开发的视图控件,在此不作限制。
第一应用的第一打包文件包括第一应用对应的视图控件的插件属性,视图控件的插件属性是预配置的。例如,第一应用的开发人员可以在第一应用的第一打包文件中设置视图控件的插件属性为网页视图控件或地图视图控件,对视图控件的插件属性进行预配置。
可选地,第一组件具体可以用于生成视图控件对应的纹理数据的纹理标识,并根 据视图控件对应的纹理数据的纹理标识,将视图控件对应的纹理数据注册到GPU中。第一组件还用于向第二引擎发送视图控件对应的纹理数据的纹理标识。第二引擎具体可以用于根据视图控件对应的纹理数据的纹理标识,从GPU获取视图控件对应的纹理数据。
可选地,第一组件具体可以用于获取画布纹理对象;在视图控件对应的第三方插件中创建虚显控制器,并根据画布纹理对象中的画布、以及视图控件对应的布局信息创建对应的虚显控件;通过虚显控制器将视图控件对应的第三方插件与虚显控件进行合成,并创建对应的显示控件;通过显示控件将视图控件对应的第三方插件添加到显示控件中的视图对应的容器上,并将视图控件对应的纹理数据送至画布纹理对象中合成到画布上。
可选地,第二引擎具体可以用于从画布纹理对象中获取画布,并从GPU获取视图控件对应的纹理数据;根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染。
可选地,第一引擎具体可以用于对第一应用中的第一打包文件进行解析,得到第一应用对应的视图控件的插件属性、以及视图控件对应的第一对象及第一对象的样式和属性;根据视图控件对应的第一对象及第一对象的样式和属性,生成视图控件对应的前端文档对象模型节点。
示例性地,第一应用可以为JS应用,第一打包文件可以为JS包。第一对象可以是JS对象。
可选地,第一组件具体可以用于将视图控件对应的前端文档对象模型节点对接到后端组件节点;根据后端组件节点创建对应的元素节点,并在元素节点中创建对应的绘制节点;在绘制节点计算所述视图控件对应的布局信息。
示例性地,视图控件对应的布局信息可以包括视图控件对应的位置和尺寸。
一些实施例中,第一组件可以包括:组件模块、元素模块、绘制模块、资源注册模块、纹理插件模块、纹理模块。
其中,组件模块可以用于创建视图控件对应的组件节点,将第一引擎解析的第一对象的样式和属性设置给对应的组件节点。元素模块可以用于创建组件节点对应的元素节点,并将元素节点挂载在元素树上,以便于整颗树的布局计算及相关样式属性的更新。绘制模块可以用于创建视图控件对应显示的绘制节点,并在绘制节点计算视图控件对应的布局信息。绘制模块在计算完视图控件对应的布局信息后,可以创建一个资源管理模块。
资源注册模块可以创建视图控件对应的纹理数据,生成视图对应的纹理数据的纹理标识,并基于资源注册模块和纹理插件模块将视图控件对应的纹理数据注册到GPU中。纹理模块可以创建对应的画布纹理(surfaceTexture)对象。
绘制模块在创建资源管理模块的同时,可以根据第一引擎解析得到的视图控件的插件(plugin)属性,从第一组件中匹配出与视图控件的插件(plugin)属性对应的视图控件插件模块和视图控件模块,并基于异步回调技术,通过与视图控件的插件(plugin)属性对应的视图控件插件模块和视图控件模块创建视图控件对应的第三方插件。
绘制模块在创建视图控件对应的第三方插件后,可以在第三方插件中,基于surfaceTexture对象、以及与视图控件的插件(plugin)属性对应的视图控件插件模块创建虚显控制器(virtual display controller);并根据surfaceTexture对象中的surface、以及视图控件对应的布局信息创建对应的虚显控件(virtual display)。虚显控制器可以将第三方插件与虚显控件进行合成,并创建对应的显示控件(presentation)。显示控件可以将第三方插件添加到显示控件中的视图对应的容器上,并将视图控件对应的纹理数据送至到surfaceTexture对象中合成到surface上。
示例性地,终端设备的操作系统可以包括
Figure PCTCN2022085167-appb-000003
Figure PCTCN2022085167-appb-000004
等中的任意一种。
示例性地,第一引擎、第一组件、以及第二引擎均为终端设备的操作系统的开发框架中的代码。
第三方面,本申请实施例提供一种显示视图控件的装置,该装置可以用于实现上述第一方面所述的显示视图控件的方法。该装置的功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述功能相对应的模块或单元,例如,该装置可以包括第一引擎、第一组件、第二引擎等模块。该装置可以通过第一引擎、第一组件、第二引擎实现如第一方面及第一方面的任意一种可能的实现方式中所述的方法。在此不再一一赘述。
第四方面,本申请实施例提供一种电子设备,包括:处理器;存储器;以及计算机程序;其中,所述计算机程序存储在所述存储器上,当所述计算机程序被所述处理器执行时,使得所述电子设备实现如第一方面及第一方面的任意一种可能的实现方式中所述的方法。
第五方面,本申请实施例提供一种计算机可读存储介质,所述计算机可读存储介质包括计算机程序,当所述计算机程序在电子设备上运行时,使得所述电子设备实现如第一方面及第一方面的任意一种可能的实现方式中所述的方法。
第六方面,本申请实施例还提供一种计算机程序产品,包括计算机可读代码,当所述计算机可读代码在电子设备中运行时,使得电子设备实现如第一方面及第一方面的任意一种可能的实现方式中所述的方法。
上述第二方面至第六方面所具备的有益效果,可参考第一方面中所述,在此不再赘述。
应当理解的是,本申请中对技术特征、技术方案、有益效果或类似语言的描述并不是暗示在任意的单个实施例中可以实现所有的特点和优点。相反,可以理解的是对于特征或有益效果的描述意味着在至少一个实施例中包括特定的技术特征、技术方案或有益效果。因此,本说明书中对于技术特征、技术方案或有益效果的描述并不一定是指相同的实施例。进而,还可以任何适当的方式组合本实施例中所描述的技术特征、技术方案和有益效果。本领域技术人员将会理解,无需特定实施例的一个或多个特定的技术特征、技术方案或有益效果即可实现实施例。在其他实施例中,还可在没有体现所有实施例的特定实施例中识别出额外的技术特征和有益效果。
附图说明
图1A为本申请实施例提供的终端设备的结构示意图;
图1B为本申请实施例提供的JS开发框架的组成示意图;
图1C为本申请实施例提供的终端设备的操作系统的架构示意图;
图2为本申请实施例提供的显示视图控件的方法的流程示意图;
图3为本申请实施例提供的第一组件200的结构示意图;
图4为本申请实施例提供的显示安卓view控件的渲染效果示意图;
图5为本申请实施例提供的显示安卓view控件的另一渲染效果示意图;
图6为本申请实施例提供的电子设备的结构示意图。
具体实施方式
以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个或两个以上(包含两个)。字符“/”一般表示前后关联对象是一种“或”的关系。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。术语“连接”包括直接连接和间接连接,除非另外说明。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
在本申请实施例中,“示例性地”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性地”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性地”或者“例如”等词旨在以具体方式呈现相关概念。
操作系统(Operation System,OS)支持显示各种类型的视图控件,有利于OS生态的构建。其中,视图控件是指可以负责应用页面中某个显示区域的绘制和事件处理的一个控件。
例如,对操作系统A而言,当操作系统A支持显示安卓(android)的视图(view)控件(以下简称安卓view控件)时,安装操作系统A的终端设备可以支持显示基于安卓view控件开发的应用页面,有利于操作系统A构建OS生态。
示例性地,安卓view控件可以包括网页视图(webview)控件、地图视图(mapview)控件、相机视图(cameraview)控件、文本视图(textview)控件等。
目前,操作系统支持显示安卓view控件的方案主要为:针对每一种安卓view控件,操作系统的开发框架可以提供一个与该安卓view控件对应的定制化控件,操作系统可以通过与该安卓view控件对应的定制化控件显示该安卓view控件。其中,不同 的安卓view控件对应不同的定制化控件。例如,针对webview控件,操作系统可以开发框架可以开发一个webview控件对应的定制化控件,操作系统可以通过webview控件对应的定制化控件显示webview控件。
但是,上述方案中,安卓view控件的接口众多,导致在操作系统的开发框架中实现与安卓view控件对应的定制化控件是非常复杂的,开发工作量极大。另外,一旦操作系统有支持新的安卓view控件的需求,则操作系统的开发框架需要重新定制开发,这将严重影响开发效率,不利于OS生态的发展。
为此,本申请实施例提供一种显示视图控件的方法,可以应用于终端设备,终端设备安装有操作系统。当终端设备安装开发者基于视图控件开发的第一应用时,终端设备可以按照本申请实施例提供的显示视图控件的方法,基于操作系统的开发框架显示第一应用对应的视图控件。该方法可以使得终端设备的操作系统支持显示各种视图控件,对不同的视图控件具有通用性。本申请实施例中,对操作系统的开发框架进行开发时,开发效率可以更高,有利于OS生态的构建。
可选地,本申请实施例中,终端设备可以包括手机、大屏(如智慧屏)、平板电脑、可穿戴设备(例如智能手表、智能手环器等)、电视、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等。在此对终端设备的具体形态不作限制。
示例性地,以终端设备为手机为例,图1A为本申请实施例提供的终端设备的结构示意图。如图1A所示,手机可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
处理器110可以包括多个处理单元,例如:应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是手机的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound, I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,SIM接口,和/或USB接口等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机的各种功能应用以及数据处理。例如,处理器110通过运行存储在内部存储器121的指令,可以使得手机实现本申请实施例所述的显示视图控件的方法。
内部存储器121还可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如本申请实施例中所述的第一应用)等。存储数据区可存储手机使用过程中所创建的数据(比如图像数据,电话本)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
充电管理模块140用于从充电器接收充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为手机供电。电源管理模块141用于连接电池142,充电管理模块140,以及处理器110。电源管理模块141也可接收电池142的输入为手机供电。
手机可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。例如,本申请实施例中,显示屏194可以用于显示JS应用的JS页面。
手机通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
可以理解的是,图1A所示的结构并不构成对手机的具体限定。在一些实施例中,手机也可以包括比图1A所示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置等。又或者,图1A所示的一些部件可以以硬件,软件或 软件和硬件的组合实现。
另外,当终端设备是其他大屏(如智慧屏)、平板电脑、可穿戴设备(例如智能手表、智能手环器等)、电视、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等设备时,这些其他终端设备的具体结构也可以参考图1A所示。示例性地,其他终端设备可以是在图1A给出的结构的基础上增加或减少了组件,在此不再一一赘述。
可选地,本申请实施例中,视图控件可以是安卓view控件、鸿蒙视图控件、或者其他视图控件(如其他用java语言开发的视图控件)等,本申请实施例对该方法中操作系统可以支持显示的视图控件的具体类型不作限制。例如,操作系统可以通过该方法显示不同的安卓view控件。
可选地,本申请实施例中,操作系统可以是
Figure PCTCN2022085167-appb-000005
或者
Figure PCTCN2022085167-appb-000006
Figure PCTCN2022085167-appb-000007
或者
Figure PCTCN2022085167-appb-000008
或者
Figure PCTCN2022085167-appb-000009
又或者
Figure PCTCN2022085167-appb-000010
等,在此对操作系统的具体类型也不作限制。
下面以操作系统为鸿蒙系统为例,结合鸿蒙系统显示安卓view控件的场景,对本申请实施例提供的显示视图控件的方法进行示例性说明。
其中,鸿蒙系统以JavaScript作为开发语言,其核心开发框架为JS开发框架。JS开发框架可以包括JS引擎和渲染引擎。
本申请实施例中,可以在JS开发框架中开发一个第一组件,第一组件可以和JS引擎、以及渲染引擎配合实现该显示视图控件的方法。第一组件可以是能够实现本申请实施例中所述的相应功能的代码。也即,本申请实施例中,终端设备可以包括JS引擎、第一组件、渲染引擎,终端设备可以通过JS引擎、第一组件、以及渲染引擎实现本申请实施例所述的实现该显示视图控件的方法。
下述示例性说明中将涉及以下概念:
1、JS应用
JS应用是指使用JavaScript语言开发的应用。JS应用中可以提供JS页面,JS页面通过调用安卓view控件进行显示,即,该JS页面为安卓view控件的显示页面。
2、JS包(JS bundle)
JS bundle也称JS打包文件。JS bundle是指JS应用中包含超文本标记语言(hypertext markup language,HTML)文件、层叠样式表(cascading style sheets,CSS)文件和JS文件的包。可以使用打包工具对HTML文件、CSS文件和JS文件进行打包得到JS bundle。
例如,假设JS应用中的HTML文件、CSS文件、JS文件分别如下所示:
HTML文件:
Figure PCTCN2022085167-appb-000011
Figure PCTCN2022085167-appb-000012
则,使用打包工具将上述HTML文件、CSS文件、JS文件打包成的JS bundle可以如下所示:
Figure PCTCN2022085167-appb-000013
Figure PCTCN2022085167-appb-000014
应当理解,上述HTML文件、CSS文件、JS文件、以及JS bundle中,仅作为示例给出了部分代码。
3、JS对象
在JavaScript中,除了字符串(string)、数字(number)、布尔值(boolean)、空字符(null)、未定义值(undefined)之外,其他的数据都是JS对象,如JS对象可以是数组、日期甚至函数等。
4、虚拟文档对象模型(virtual document object model,VDOM)节点和文档对象模型(document object model,DOM)节点
VODM节点是虚拟/模拟DOM节点。
DOM节点可以用于对接后端组件(component)节点,实现前端组件的能力。
5、组件(component)节点
component节点是底层UI组件的声明式描述,描述了UI组件的属性和样式。如:component节点描述了每个JS页面中每个前端DOM节点,包含前端DOM节点对应的JS对象的样式和属性。component节点可以用于生成component的实体元素(element)。
6、元素(element)节点
element节点是component节点的实例,表示具体的component节点。
7、绘制(render)节点
render节点也称渲染节点。render节点用于计算每个element节点的渲染数据,如:element节点的渲染数据可以包括element节点在JS页面中的位置、大小/尺寸、绘制命令等。
8、虚显控件(virtual display):一种抓取屏幕上显示内容的控件。
9、显示控件(presentation):一个显示在指定显示屏的特殊的对话框,用于在辅助屏幕上显示内容。
10、画布纹理(surfaceTexture):一种可将纹理数据转为外部纹理的画布(surface),该surface不会直接来显示。
11、surface:类似一个画布,页面上的节点都绘制在这个画布。
12、上下文(context):是指一个应用程序环境的信息。
示例性地,图1B为本申请实施例提供的JS开发框架的组成示意图。如图1B所示,本申请实施例中,JS开发框架可以包括:JS引擎100、第一组件200、渲染引擎 300。
本申请实施例中,当终端设备安装开发者基于安卓view视图控件开发的JS应用时,终端设备可以通过JS开发框架加载JS应用。例如,终端设备可以提供JS应用的图标,用户可以点击JS应用的图标。终端设备可以响应于用户点击JS应用的图标,通过JS开发框架加载JS应用。通过JS开发框架加载JS应用后,JS引擎100可以对JS应用中的JS包(bundle)进行解析,确定JS应用对应的安卓view控件的插件(plugin)属性、以及安卓view控件对应的前端DOM节点。第一组件200可以根据安卓view控件对应的前端DOM节点确定安卓view控件对应的布局信息,以及根据安卓view控件的plugin属性创建安卓view控件对应的第三方插件和纹理数据,并将安卓view控件对应的纹理数据注册到GPU中。第一组件200可以根据安卓view控件对应的第三方插件,显示安卓view控件对应的纹理数据。渲染引擎300可以从GPU获取安卓view控件对应的纹理数据,并根据第一组件200确定出的安卓view控件对应的布局信息,对安卓view控件对应的纹理数据进行渲染。
示例性地,图1C为本申请实施例提供的终端设备的操作系统的架构示意图。如图1C所示,终端设备的操作系统可以包括应用程序层(applications)、应用程序框架层(application framework)、系统运行库层(libraries)、以及内核层。
其中,应用程序层包含了多个应用,如:应用程序层可以包含电子邮件、短信、日历、地图、浏览器和联系人管理等系统应用,以及开发人员利用Java语言设计和编写的第三方应用(如JS应用)。对于应用程序层包含的每个JS应用而言,开发人员在开发该应用时,可以在该JS应用的JS bundle中预配置安卓view控件的plugin属性。
上述图1B中所示的JS引擎100、第一组件200、以及渲染引擎300可以部署在图1C中所示的应用程序框架层或者系统运行库层,在此不作限制。
下面对上述图1B中所示的JS引擎100、第一组件200、以及渲染引擎300配合实现本申请实施例提供的显示视图控件的方法的具体过程进行说明。
以视图控件为安卓view控件为例,图2为本申请实施例提供的显示视图控件的方法的流程示意图。如图2所示,该显示视图控件的方法可以包括S201-S210。
S201、JS引擎100对开发者开发的JS应用中的JS包(bundle)进行解析,得到安卓view控件的插件(plugin)属性、以及安卓view控件对应的JS对象及JS对象的样式和属性。
其中,开发者开发的JS应用是由开发人员基于安卓view控件所开发的应用。开发人员在开发JS应用时,可以在JS应用的HTML文件的代码中预设(预配置)安卓view控件的插件(plugin)属性。如:可以预配置安卓view控件的插件(plugin)属性为webview控件、mapview控件、cameraview控件、textview控件等。
JS引擎100可以将开发者开发的JS应用中的JS bundle加载到JS开发框架中,对JS bundle进行解析,得到安卓view控件对应的JS对象及JS对象的样式和属性、以及安卓view的插件(plugin)属性。
例如,开发者开发的JS应用可以是基于webview控件所开发的应用。JS引擎100对JS应用中的JS bundle进行解析,可以得到安卓view控件的插件(plugin)属性为webview控件,同时,可以得到webview控件对应的JS对象及JS对象的样式和属性。
示例性地,假设开发者设计(或开发)的JS应用中HTML文件的部分代码如下:
Figure PCTCN2022085167-appb-000015
则,JS引擎100可以解析得到该JS应用中安卓view控件的插件(plugin)属性为:webview控件。
假设开发者设计(或开发)的JS应用中HTML文件的部分代码如下:
Figure PCTCN2022085167-appb-000016
则,JS引擎100可以解析得到该JS应用中安卓view控件的插件(plugin)属性为:mapview控件。
S202、JS引擎100根据安卓view控件对应的JS对象及JS对象的样式和属性,生成安卓view控件对应的前端文档对象模型(document object model,DOM)节点。
可选地,JS引擎100根据安卓view控件对应的JS对象及JS对象的样式和属性,生成安卓view控件对应的前端DOM节点的步骤,可以包括:JS引擎100根据安卓view控件对应的JS对象生成VDOM节点的树形结构(简称VDOM树),VODM树包括多个VDOM节点。JS引擎100将JS对象的样式和属性设置给对应的VODM节点,并对VDOM树上的每个VDOM节点进行数据绑定。JS引擎100在生成VDOM节点的同时,同步创建对应的前端DOM节点。
其中,VODM树也即模拟DOM树的JS对象树。数据绑定是指在应用程序UI界面与数据源建立连接的过程,用于当每个VDOM节点的数据发生变化时,对应的UI节点会自动绘制更新。
JS引擎100根据安卓view控件对应的JS对象及JS对象的样式和属性,生成安卓view控件对应的前端DOM节点后,第一组件200可以根据前端DOM节点依次创建对应的后端组件(component)节点、元素(element)节点、以及绘制(render)节点,来完成布局的计算,组件样式和属性的更新等。例如,第一组件200可以执行S203-S205。
S203、第一组件200将安卓view控件对应的前端DOM节点对接到后端组件 (component)节点。
S204、第一组件200根据后端组件节点创建对应的元素(element)节点,并在元素节点中创建对应的绘制(render)节点。
S205、第一组件200在绘制节点计算安卓view控件对应的布局信息。
其中,布局信息可以包括位置和尺寸,尺寸也即布局大小。例如,第一组件200可以在绘制节点计算安卓view控件对应的每个DOM节点的位置和尺寸。
S206、第一组件200根据安卓view控件的插件(plugin)属性,创建安卓view控件对应的第三方插件,并创建安卓view控件对应的纹理数据。
示例性地,如S201的举例中所述,假设开发者设计(或开发)的JS应用中HTML文件的部分代码如下:
Figure PCTCN2022085167-appb-000017
则,JS引擎100可以解析得到该JS应用中安卓view控件的插件(plugin)属性为:webview控件。对此,S206中第一组件200根据安卓view控件的插件(plugin)属性为webview控件,创建的安卓view控件对应的第三方插件可以为:webview插件。
又例如,如S201的举例中所述,假设开发者设计(或开发)的JS应用中HTML文件的部分代码如下:
Figure PCTCN2022085167-appb-000018
则,JS引擎100可以解析得到该JS应用中安卓view控件的插件(plugin)属性为:mapview控件。对此,S206中第一组件200根据安卓view控件的插件(plugin)属性为mapview控件,创建的安卓view控件对应的第三方插件可以为:mapview插件。
S207、第一组件200生成安卓view控件对应的纹理数据的纹理标识,并根据安卓view控件对应的纹理数据的纹理标识,将安卓view控件对应的纹理数据注册到图形处理器(graphics processing unit,GPU)中。
示例性地,安卓view控件对应的纹理数据的纹理标识可以是纹理数据的纹理ID,纹理ID可以表示纹理数据的唯一身份信息。根据安卓view控件对应的纹理数据的纹 理标识,将安卓view控件对应的纹理数据注册到GPU中后,渲染引擎300后续可以根据安卓view控件对应的纹理数据的纹理标识,从GPU索引/获取安卓view控件对应的纹理数据。
需要说明的是,本申请对S205、以及S206-S207的执行顺序不作限制。例如,一些实施方式中,S205与S206-S207可以是同步执行。另外一些实施方式中,S205也可以在S206-S207之前或之后执行。
S208、第一组件200向渲染引擎300发送安卓view控件对应的布局信息、以及安卓view控件对应的纹理数据的纹理标识。
S209、第一组件200根据安卓view控件对应的第三方插件,显示安卓view控件对应的纹理数据。
可选地,第一组件200根据安卓view控件对应的第三方插件,显示安卓view控件对应的纹理数据的步骤,可以包括:第一组件200获取画布纹理(surfaceTexture)对象。第一组件200在第三方插件中创建虚显控制器(virtual display controller),并根据surfaceTexture对象中的画布(surface)、以及安卓view控件对应的布局信息创建对应的虚显控件(virtual display)。第一组件200通过虚显控制器将第三方插件与虚显控件进行合成,并创建对应的显示控件(presentation)。第一组件200通过显示控件将第三方插件添加到显示控件中的视图对应的容器上,并将安卓view控件对应的纹理数据送至surfaceTexture对象中合成到surface上。
示例性地,图3为本申请实施例提供的第一组件200的结构示意图。下面结合图3所示的结构,对上述S203-S207、以及S209所述的过程进行具体说明。如图3所示,第一组件200可以包括:
组件模块210,组件模块210也可以称为NativeTextureComponent模块,可以用于创建安卓view控件对应的component节点,将JS引擎100从JS bundle中解析的JS对象的样式和属性设置给对应的component节点。
元素模块220,元素模块220也可以称为NativeTextureElement模块,可以用于创建component节点对应的element节点,并将element节点挂载在element树上,以便于整颗树的布局计算及相关样式属性的更新。
绘制模块230,绘制模块230也可以称为NativeTextureRender模块,可以用于创建安卓view控件对应显示的render节点,并在render节点计算安卓view控件对应的布局信息。绘制模块230在计算完安卓view控件对应的布局信息后,可以创建一个资源管理模块231。资源管理模块231也可以称为NativeDelegate模块或资源管理器。
第一组件200还包括:资源注册模块240、纹理插件模块250、纹理模块260。资源注册模块也可以称为ResourceRegister模块,纹理插件模块也可以称为TexturePlugin模块,纹理模块也可以称为Texture模块。
资源管理模块231可以创建安卓view控件对应的纹理数据,生成安卓view控件对应的纹理数据的纹理标识,并基于资源注册模块240和纹理插件模块250将安卓view控件对应的纹理数据注册到GPU中。纹理模块260可以创建对应的画布纹理(surfaceTexture)对象。
绘制模块230在创建资源管理模块231的同时,可以根据S201中JS引擎100解 析得到的安卓view控件的插件(plugin)属性,从第一组件200中匹配出与安卓view控件的插件(plugin)属性对应的安卓view控件插件模块和安卓view控件模块,并基于异步回调技术,通过与安卓view控件的插件(plugin)属性对应的安卓view控件插件模块和安卓view控件模块创建安卓view控件对应的第三方插件。
其中,安卓view控件插件模块和安卓view控件模块可以是由操作系统的开发人员预设在第一组件200中的。例如,第一组件200中可以预设有webview控件对应的webview控件插件模块和webview控件模块,当S201中JS引擎100解析得到的安卓view控件的插件(plugin)属性为webview控件时,绘制模块230可以从第一组件200中匹配出与安卓view控件的插件(plugin)属性对应的webview控件插件模块(WebviewPlugin模块)和webview控件模块(Webview模块)。又例如,第一组件200中还可以预设有mapview控件对应的mapview控件插件模块(mapviewPlugin模块)和mapview控件模块(mapview模块),当S201中JS引擎100解析得到的安卓view控件的插件(plugin)属性为mapview控件时,绘制模块230可以从第一组件200中匹配出与安卓view控件的插件(plugin)属性对应的mapview控件插件模块和mapview控件模块等。
绘制模块230在创建安卓view控件对应的第三方插件后,可以在第三方插件中,基于surfaceTexture对象、以及与安卓view控件的插件(plugin)属性对应的安卓view控件插件模块创建虚显控制器(virtual display controller);并根据surfaceTexture对象中的surface、以及安卓view控件对应的布局信息创建对应的虚显控件(virtual display)。虚显控制器将第三方插件与虚显控件进行合成,并创建对应的显示控件(presentation)。显示控件可以将第三方插件添加到显示控件中的视图对应的容器上,并将安卓view控件对应的纹理数据送至到surfaceTexture对象中合成到surface上。
S210、渲染引擎300根据安卓view控件对应的纹理数据的纹理标识从GPU获取安卓view控件对应的纹理数据,并根据安卓view控件对应的布局信息,对安卓view控件对应的纹理数据进行渲染。
可选地,渲染引擎300根据安卓view控件对应的纹理数据的纹理标识从GPU获取安卓view控件对应的纹理数据,并根据安卓view控件对应的布局信息,对安卓view控件对应的纹理数据进行渲染的步骤,可以包括:渲染引擎300从surfaceTexture对象中获取surface,并根据安卓view控件对应的纹理数据的纹理标识从GPU获取安卓view控件对应的纹理数据。渲染引擎300根据安卓view控件对应的布局信息,对安卓view控件对应的纹理数据进行渲染。
下面以开发者在JS应用中设置插件(plugin)属性为webview控件为例,对上述图2所示的本申请实施例提供的显示安卓view控件的方法的具体应用过程进行示例性说明。
本申请实施例中,当开发者在JS应用中设置插件(plugin)属性为webview控件时,JS引擎100对开发者开发的JS应用中的JS包(bundle)进行解析,可以得到安卓view控件的插件(plugin)属性为webview控件,以及webview控件对应的JS对象及JS对象的样式和属性。然后,JS引擎100可以根据webview控件对应的JS对象及JS对象的样式和属性,生成webview控件对应的前端DOM节点。第一组件200可 以将webview控件对应的前端DOM节点对接到后端component节点,根据后端component节点创建对应的element节点,并在element节点中创建对应的render节点。然后,第一组件200可以在render节点计算webview控件对应的布局信息。另外,第一组件200可以创建webview控件对应的webview插件和纹理数据。在创建webview控件对应的纹理数据后,第一组件200可以生成webview控件对应的纹理数据的纹理标识,并根据webview控件对应的纹理数据的纹理标识,将webview控件对应的纹理数据注册到GPU中。
之后,第一组件200可以向渲染引擎300发送webview控件对应的布局信息、以及webview控件对应的纹理数据的纹理标识。第一组件200还可以获取画布纹理(surfaceTexture)对象,在webview插件中创建虚显控制器(virtual display controller),并根据surfaceTexture对象中的画布(surface)、以及webview控件对应的布局信息创建对应的虚显控件(virtual display)。在创建虚显控件后,第一组件200可以通过虚显控制器将webview插件与虚显控件进行合成,并创建对应的显示控件(presentation)。创建对应的显示控件后,第一组件200可以通过显示控件将webview插件添加到显示控件中的视图对应的容器上,并将webview控件对应的纹理数据送至surfaceTexture对象中合成到surface上。渲染引擎300可以根据webview控件对应的纹理数据的纹理标识从GPU获取webview控件对应的纹理数据,并根据webview控件对应的布局信息,对webview控件对应的纹理数据进行渲染。从而,JS框架可以实现支持显示webview控件。
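作为示意，下面给出webview插件内部创建WebView并加载页面的一种可能实现(Java)；其中WebviewPlugin为本示例假设的插件类名，url可来自JS应用中为webview控件配置的属性，均非对本申请实现方式的限定：

    import android.content.Context;
    import android.view.View;
    import android.webkit.WebView;

    // 假设的webview插件类：创建WebView，返回的view将被添加到显示控件(presentation)的容器中
    public class WebviewPlugin {
        private WebView webView;

        public View createView(Context context, String url) {
            webView = new WebView(context);
            webView.getSettings().setJavaScriptEnabled(true);  // 允许页面内JS执行
            webView.loadUrl(url);                               // 加载JS应用指定的网页地址
            return webView;
        }
    }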
示例性地，图4为本申请实施例提供的显示安卓view控件的渲染效果示意图。如图4所示，当开发者设计(或开发)的JS应用的HTML文件的代码中预设的安卓view控件的plugin属性为webview控件时(代码具体可以参见前述实施例中所述)，通过本申请实施例所述的方法可以渲染出类似于如图4所示的网页页面。应当理解，不同的JS应用或JS应用请求不同的网页地址时，图4所示的网页页面中的页面内容不同。图4仅作为示例进行说明。
类似地,当开发者在JS应用中设置插件(plugin)属性为其他安卓view控件(如:mapview控件)时,操作系统的开发框架均可以按照前述实施例所述的方式,实现支持其他安卓view控件的显示。
示例性地,图5为本申请实施例提供的显示安卓view控件的另一渲染效果示意图。如图5所示,当开发者设计(或开发)的JS应用的HTML文件的代码中预设的安卓view控件的plugin属性为mapview控件时(代码具体可以参见前述实施例中所述),通过本申请实施例所述的方法可以渲染出类似于如图5所示的地图页面。图5同样仅作为示例进行说明。
本申请实施例提供的显示安卓view控件的方法中，JS应用的开发人员只需要在JS应用的HTML文件的代码中设置安卓view控件的插件(plugin)属性，即可使得操作系统侧实现支持显示不同的安卓view控件。操作系统侧按照本申请实施例提供的显示安卓view控件的方法来显示不同的安卓view控件时，不需要考虑不同安卓view控件的适配对接问题，整个实现流程可以由操作系统的开发框架自动完成，JS应用的开发人员可以不感知，大大降低了适配操作系统的JS应用的开发难度，有利于吸引更多的开发者参与OS生态的构建。
另外，操作系统侧的开发框架可以实现"一次开发、即可部署各种复杂的安卓view控件"的目标，大大降低开发成本。JS应用的开发人员在JS应用的HTML文件的代码中设置安卓view控件的插件(plugin)属性，操作系统的开发框架自动根据开发人员设置的安卓view控件的插件(plugin)属性匹配对应的第三方插件，还可以避免开发人员针对不同的安卓view控件进行二次开发，能够有效提高开发效率。
进一步地,本申请实施例提供的显示安卓view控件的方法中,通过将安卓view控件对应的纹理数据(也可以简称为安卓侧的纹理数据)直接注册到GPU中,可以避免在CPU中创建纹理缓存。渲染引擎可以直接从GPU中获取安卓view控件对应的纹理数据进行渲染,不需要CPU和GPU之间进行纹理数据的传递,可以节省CPU在不同纹理缓存之间的拷贝时间以及纹理上传到GPU的时间,大大提高渲染性能。同时,还节省了纹理数据占用的内存和GPU显存。
需要说明的是,前述实施例中虽然以安卓view控件为webview控件或mapview控件进行了示例性说明。但应当理解,在其他一些场景中,操作系统通过本申请实施例提供的显示视图控件的方法可以显示的视图控件还可以是cameraview控件、textview控件、直播视图控件、广告视图控件、投屏视图控件等其他安卓view控件,或者,鸿蒙视图控件、其他用java语言开发的视图控件等其他视图控件。当视图控件为前述其他类型时,其实现原理与前述实施例中所述的显示webview控件或mapview控件的原理相同,不再一一赘述。
另外，还应当理解，上述说明中，当操作系统是除鸿蒙™系统之外的其他系统时，JS开发框架可以相应被替换为其他系统对应的其他语言的开发框架，JS引擎可以是对其他语言代码进行解析的引擎。例如，ios™系统对应的开发框架可以是Swift开发框架，Swift开发框架可以包括Swift引擎、第一组件、渲染引擎等。
本申请实施例中，开发者基于视图控件开发的应用可以称为第一应用，第一应用对应的打包文件(如JS应用的JS bundle)可以称为第一打包文件。操作系统的开发框架中对第一应用进行解析的引擎(如鸿蒙™系统中的JS引擎)可以称为第一引擎，渲染引擎可以称为第二引擎。第一引擎对第一应用中的第一打包文件进行解析，得到的对象(如JS对象)可以称为第一对象。
对应于前述实施例中所述的显示视图控件的方法,本申请实施例还提供一种终端设备,终端设备可以包括第一引擎、第一组件、以及第二引擎。
其中,第一引擎用于确定第一应用对应的视图控件的插件属性、以及视图控件对应的前端文档对象模型节点。第一组件用于根据视图控件对应的前端文档对象模型节点,确定视图控件对应的布局信息;根据视图控件的插件属性,创建视图控件对应的第三方插件和纹理数据,并将视图控件对应的纹理数据注册到GPU中。第一组件还用于向第二引擎发送视图控件对应的布局信息;根据视图控件对应的第三方插件,显示视图控件对应的纹理数据。第二引擎用于从GPU获取视图控件对应的纹理数据,并根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染。
其中,第一应用可以为开发人员基于视图控件开发的应用。
可选地，视图控件可以包括网页视图控件、地图视图控件、相机视图控件、视频视图控件、文本视图控件、直播视图控件、广告视图控件、投屏视图控件等中的任意一种。
一些实施例中,视图控件可以是安卓视图控件、或者鸿蒙视图控件、又或者其他java语言开发的视图控件,在此不作限制。
第一应用的第一打包文件包括第一应用对应的视图控件的插件属性,视图控件的插件属性是预配置的。例如,第一应用的开发人员可以在第一应用的第一打包文件中设置视图控件的插件属性为网页视图控件或地图视图控件,对视图控件的插件属性进行预配置。
可选地,第一组件具体可以用于生成视图控件对应的纹理数据的纹理标识,并根据视图控件对应的纹理数据的纹理标识,将视图控件对应的纹理数据注册到GPU中。第一组件还用于向第二引擎发送视图控件对应的纹理数据的纹理标识。第二引擎具体可以用于根据视图控件对应的纹理数据的纹理标识,从GPU获取视图控件对应的纹理数据。
可选地,第一组件具体可以用于获取画布纹理对象;在视图控件对应的第三方插件中创建虚显控制器,并根据画布纹理对象中的画布、以及视图控件对应的布局信息创建对应的虚显控件;通过虚显控制器将视图控件对应的第三方插件与虚显控件进行合成,并创建对应的显示控件;通过显示控件将视图控件对应的第三方插件添加到显示控件中的视图对应的容器上,并将视图控件对应的纹理数据送至画布纹理对象中合成到画布上。
可选地,第二引擎具体可以用于从画布纹理对象中获取画布,并从GPU获取视图控件对应的纹理数据;根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染。
可选地,第一引擎具体可以用于对第一应用中的第一打包文件进行解析,得到第一应用对应的视图控件的插件属性、以及视图控件对应的第一对象及第一对象的样式和属性;根据视图控件对应的第一对象及第一对象的样式和属性,生成视图控件对应的前端文档对象模型节点。
示例性地,第一应用可以为JS应用,第一打包文件可以为JS包。第一对象可以是JS对象。
可选地,第一组件具体可以用于将视图控件对应的前端文档对象模型节点对接到后端组件节点;根据后端组件节点创建对应的元素节点,并在元素节点中创建对应的绘制节点;在绘制节点计算所述视图控件对应的布局信息。
示例性地,视图控件对应的布局信息可以包括视图控件对应的位置和尺寸。
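作为示意，布局信息的一种可能表示如下(Java)，其中LayoutInfo的字段名与computeLayout方法名均为本示例假设的名称，并非对本申请实现方式的限定：

    // 假设的布局信息表示：视图控件在页面中的位置和尺寸
    public class LayoutInfo {
        public float x;       // 左上角横向位置
        public float y;       // 左上角纵向位置
        public float width;   // 宽度
        public float height;  // 高度
    }

    // 用法示意：绘制节点完成布局计算后输出布局信息，
    // 供虚显控件的创建和渲染引擎的渲染使用(computeLayout为假设的方法名)：
    //   LayoutInfo layout = renderNode.computeLayout();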
一些实施例中,第一组件可以包括:组件模块、元素模块、绘制模块、资源注册模块、纹理插件模块、纹理模块。
其中，组件模块可以用于创建视图控件对应的组件节点，将第一引擎解析的第一对象的样式和属性设置给对应的组件节点。元素模块可以用于创建组件节点对应的元素节点，并将元素节点挂载在元素树上，以便于整棵树的布局计算及相关样式属性的更新。绘制模块可以用于创建视图控件对应显示的绘制节点，并在绘制节点计算视图控件对应的布局信息。绘制模块在计算完视图控件对应的布局信息后，可以创建一个资源管理模块。
资源管理模块可以创建视图控件对应的纹理数据，生成视图控件对应的纹理数据的纹理标识，并基于资源注册模块和纹理插件模块将视图控件对应的纹理数据注册到GPU中。纹理模块可以创建对应的画布纹理(surfaceTexture)对象。
绘制模块在创建资源管理模块的同时,可以根据第一引擎解析得到的视图控件的插件(plugin)属性,从第一组件中匹配出与视图控件的插件(plugin)属性对应的视图控件插件模块和视图控件模块,并基于异步回调技术,通过与视图控件的插件(plugin)属性对应的视图控件插件模块和视图控件模块创建视图控件对应的第三方插件。
绘制模块在创建视图控件对应的第三方插件后，可以在第三方插件中，基于surfaceTexture对象、以及与视图控件的插件(plugin)属性对应的视图控件插件模块创建虚显控制器(virtual display controller)；并根据surfaceTexture对象中的surface、以及视图控件对应的布局信息创建对应的虚显控件(virtual display)。虚显控制器可以将第三方插件与虚显控件进行合成，并创建对应的显示控件(presentation)。显示控件可以将第三方插件添加到显示控件中的视图对应的容器上，并将视图控件对应的纹理数据送至surfaceTexture对象中合成到surface上。
示例性地，终端设备的操作系统可以包括安卓系统、鸿蒙系统、ios系统、windows系统、mac系统、EMUI系统等中的任意一种。
示例性地,第一引擎、第一组件、以及第二引擎均为终端设备的操作系统的开发框架中的代码。
示例性地,当操作系统的开发框架为JS开发框架时,第一引擎可以是JS引擎,第二引擎可以是渲染引擎。具体可以参见前述实施例中的图1B中所示。
应当理解,该终端设备可以通过第一引擎、第一组件、以及第二引擎实现本申请前述方法实施例中所述的显示视图控件的方法的全部功能。
可选地,本申请实施例还提供一种显示视图控件的装置,该装置可以用于实现前述方法实施例中所述的显示视图控件的方法。该装置的功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述功能相对应的模块或单元。
示例性地,该显示视图控件的装置可以包括第一引擎、第一组件、以及第二引擎等模块。该显示视图控件的装置可以通过第一引擎、第一组件、以及第二引擎等模块实现本申请前述方法实施例中所述的显示视图控件的方法的全部功能。
例如,第一引擎可以用于确定第一应用对应的视图控件的插件属性、以及视图控件对应的前端文档对象模型节点。第一组件可以用于根据视图控件对应的前端文档对象模型节点,确定视图控件对应的布局信息;根据视图控件的插件属性,创建视图控件对应的第三方插件和纹理数据,并将视图控件对应的纹理数据注册到GPU中。第一组件还可以用于向第二引擎发送视图控件对应的布局信息;根据视图控件对应的第三方插件,显示视图控件对应的纹理数据。第二引擎可以用于从GPU获取视图控件对应的纹理数据,并根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染。
其中,第一应用可以为开发人员基于视图控件开发的应用。
可选地,视图控件可以包括网页视图控件、地图视图控件、相机视图控件、视频视图控件、文本视图控件、直播视图控件、广告视图控件、投屏视图控件等中的任意一种。
一些实施例中,视图控件可以是安卓视图控件、或者鸿蒙视图控件、又或者其他java语言开发的视图控件,在此不作限制。
第一应用的第一打包文件包括第一应用对应的视图控件的插件属性,视图控件的插件属性是预配置的。例如,第一应用的开发人员可以在第一应用的第一打包文件中设置视图控件的插件属性为网页视图控件或地图视图控件,对视图控件的插件属性进行预配置。
可选地,第一组件具体可以用于生成视图控件对应的纹理数据的纹理标识,并根据视图控件对应的纹理数据的纹理标识,将视图控件对应的纹理数据注册到GPU中。第一组件还用于向第二引擎发送视图控件对应的纹理数据的纹理标识。第二引擎具体可以用于根据视图控件对应的纹理数据的纹理标识,从GPU获取视图控件对应的纹理数据。
可选地,第一组件具体可以用于获取画布纹理对象;在视图控件对应的第三方插件中创建虚显控制器,并根据画布纹理对象中的画布、以及视图控件对应的布局信息创建对应的虚显控件;通过虚显控制器将视图控件对应的第三方插件与虚显控件进行合成,并创建对应的显示控件;通过显示控件将视图控件对应的第三方插件添加到显示控件中的视图对应的容器上,并将视图控件对应的纹理数据送至画布纹理对象中合成到画布上。
可选地,第二引擎具体可以用于从画布纹理对象中获取画布,并从GPU获取视图控件对应的纹理数据;根据视图控件对应的布局信息对视图控件对应的纹理数据进行渲染。
可选地,第一引擎具体可以用于对第一应用中的第一打包文件进行解析,得到第一应用对应的视图控件的插件属性、以及视图控件对应的第一对象及第一对象的样式和属性;根据视图控件对应的第一对象及第一对象的样式和属性,生成视图控件对应的前端文档对象模型节点。
可选地,第一组件具体可以用于将视图控件对应的前端文档对象模型节点对接到后端组件节点;根据后端组件节点创建对应的元素节点,并在元素节点中创建对应的绘制节点;在绘制节点计算所述视图控件对应的布局信息。
示例性地,视图控件对应的布局信息可以包括视图控件对应的位置和尺寸。
一些实施例中,第一组件可以包括:组件模块、元素模块、绘制模块、资源注册模块、纹理插件模块、纹理模块。具体可以参考上述图3中所示。
其中，组件模块可以用于创建视图控件对应的组件节点，将第一引擎解析的第一对象的样式和属性设置给对应的组件节点。元素模块可以用于创建组件节点对应的元素节点，并将元素节点挂载在元素树上，以便于整棵树的布局计算及相关样式属性的更新。绘制模块可以用于创建视图控件对应显示的绘制节点，并在绘制节点计算视图控件对应的布局信息。绘制模块在计算完视图控件对应的布局信息后，可以创建一个资源管理模块。
资源管理模块可以创建视图控件对应的纹理数据，生成视图控件对应的纹理数据的纹理标识，并基于资源注册模块和纹理插件模块将视图控件对应的纹理数据注册到GPU中。纹理模块可以创建对应的画布纹理(surfaceTexture)对象。
绘制模块在创建资源管理模块的同时,可以根据第一引擎解析得到的视图控件的插件(plugin)属性,从第一组件中匹配出与视图控件的插件(plugin)属性对应的视图控件插件模块和视图控件模块,并基于异步回调技术,通过与视图控件的插件(plugin)属性对应的视图控件插件模块和视图控件模块创建视图控件对应的第三方插件。
绘制模块在创建视图控件对应的第三方插件后，可以在第三方插件中，基于surfaceTexture对象、以及与视图控件的插件(plugin)属性对应的视图控件插件模块创建虚显控制器(virtual display controller)；并根据surfaceTexture对象中的surface、以及视图控件对应的布局信息创建对应的虚显控件(virtual display)。虚显控制器可以将第三方插件与虚显控件进行合成，并创建对应的显示控件(presentation)。显示控件可以将第三方插件添加到显示控件中的视图对应的容器上，并将视图控件对应的纹理数据送至surfaceTexture对象中合成到surface上。
示例性地，终端设备的操作系统可以包括安卓系统、鸿蒙系统、ios系统、windows系统、mac系统、EMUI系统等中的任意一种。
示例性地,第一引擎、第一组件、以及第二引擎均为终端设备的操作系统的开发框架中的代码。
应理解以上装置中单元(或称为模块)的划分(如将显示视图控件的装置划分为第一引擎、第一组件、以及第二引擎)仅仅是一种逻辑功能的划分,实际实现时可以全部或部分集成到一个物理实体上,也可以物理上分开。且装置中的单元可以全部以软件通过处理元件调用的形式实现;也可以全部以硬件的形式实现;还可以部分单元以软件通过处理元件调用的形式实现,部分单元以硬件的形式实现。
例如，各个单元可以为单独设立的处理元件，也可以集成在装置的某一个芯片中实现，此外，也可以以程序的形式存储于存储器中，由装置的某一个处理元件调用并执行该单元的功能。此外这些单元全部或部分可以集成在一起，也可以独立实现。这里所述的处理元件又可以称为处理器，可以是一种具有信号处理能力的集成电路。在实现过程中，上述方法的各步骤或以上各个单元可以通过处理元件中的硬件的集成逻辑电路实现或者以软件通过处理元件调用的形式实现。
在一个例子中，以上装置中的单元可以是被配置成实施以上方法的一个或多个集成电路，例如：一个或多个专用集成电路(application specific integrated circuit，ASIC)，或，一个或多个数字信号处理器(digital signal processor，DSP)，或，一个或者多个现场可编程逻辑门阵列(field programmable gate array，FPGA)，或这些集成电路形式中至少两种的组合。
再如,当装置中的单元可以通过处理元件调度程序的形式实现时,该处理元件可以是通用处理器,例如中央处理器(central processing unit,CPU)或其它可以调用程序的处理器。再如,这些单元可以集成在一起,以片上系统(system-on-a-chip,SOC)的形式实现。
在一种实现中,以上装置实现以上方法中各个对应步骤的单元可以通过处理元件调度程序的形式实现。例如,该装置可以包括处理元件和存储元件,处理元件调用存储元件存储的程序,以执行以上方法实施例所述的方法。存储元件可以为与处理元件处于同一芯片上的存储元件,即片内存储元件。
在另一种实现中,用于执行以上方法的程序可以在与处理元件处于不同芯片上的存储元件,即片外存储元件。此时,处理元件从片外存储元件调用或加载程序于片内存储元件上,以调用并执行以上方法实施例所述的方法。
例如,本申请实施例还可以提供一种装置,如:电子设备。图6为本申请实施例提供的电子设备的结构示意图。如图6所示,该电子设备可以包括:处理器601;存储器602;以及计算机程序;其中,所述计算机程序存储在所述存储器602上,当所述计算机程序被所述处理器601执行时,使得所述电子设备实现如前述实施例所述的显示视图控件的方法。该存储器602可以位于该电子设备之内,也可以位于该电子设备之外。且该处理器601包括一个或多个。
示例性地，该电子设备可以是手机、平板电脑、可穿戴设备(例如智能手表、智能手环等)、车载设备、增强现实(augmented reality，AR)/虚拟现实(virtual reality，VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer，UMPC)、上网本、个人数字助理(personal digital assistant，PDA)等。
在又一种实现中,该装置实现以上方法中各个步骤的单元可以是被配置成一个或多个处理元件,这里的处理元件可以为集成电路,例如:一个或多个ASIC,或,一个或多个DSP,或,一个或者多个FPGA,或者这些类集成电路的组合。这些集成电路可以集成在一起,构成芯片。
例如,本申请实施例还提供一种芯片,该芯片可以应用于上述电子设备。芯片包括一个或多个接口电路和一个或多个处理器;接口电路和处理器通过线路互联;处理器通过接口电路从电子设备的存储器接收并执行计算机指令,以实现如前述实施例所述的显示视图控件的方法。
本申请实施例还提供一种计算机程序产品,包括计算机可读代码,当计算机可读代码在电子设备中运行时,使得电子设备实现如前述实施例所述的显示视图控件的方法。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。
基于这样的理解，本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来，如：程序。该软件产品存储在一个程序产品(如计算机可读存储介质)中，包括若干指令用以使得一个设备(可以是单片机，芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括：U盘、移动硬盘、ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
例如,本申请实施例还可以提供一种计算机可读存储介质,所述计算机可读存储介质包括计算机程序,当所述计算机程序在电子设备上运行时,使得所述电子设备实现如前述实施例所述的显示视图控件的方法。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (14)

  1. 一种显示视图控件的方法,其特征在于,所述方法应用于终端设备,所述终端设备包括:第一引擎、第一组件、以及第二引擎;所述方法包括:
    所述第一引擎确定第一应用对应的视图控件的插件属性、以及所述视图控件对应的前端文档对象模型节点;
    所述第一组件根据所述视图控件对应的前端文档对象模型节点,确定所述视图控件对应的布局信息;
    所述第一组件根据所述视图控件的插件属性,创建所述视图控件对应的第三方插件和纹理数据,并将所述视图控件对应的纹理数据注册到图形处理器GPU中;
    所述第一组件向所述第二引擎发送所述视图控件对应的布局信息;
    所述第一组件根据所述视图控件对应的第三方插件,显示所述视图控件对应的纹理数据;
    所述第二引擎从GPU获取所述视图控件对应的纹理数据,并根据所述视图控件对应的布局信息对所述视图控件对应的纹理数据进行渲染。
  2. 根据权利要求1所述的方法,其特征在于,所述第一组件将所述视图控件对应的纹理数据注册到GPU中,包括:
    所述第一组件生成所述视图控件对应的纹理数据的纹理标识,并根据所述视图控件对应的纹理数据的纹理标识,将所述视图控件对应的纹理数据注册到GPU中;
    所述方法还包括:
    所述第一组件向所述第二引擎发送所述视图控件对应的纹理数据的纹理标识;
    所述第二引擎从GPU获取所述视图控件对应的纹理数据,包括:
    所述第二引擎根据所述视图控件对应的纹理数据的纹理标识,从GPU获取所述视图控件对应的纹理数据。
  3. 根据权利要求1或2所述的方法,其特征在于,所述第一组件根据所述视图控件对应的第三方插件,显示所述视图控件对应的纹理数据,包括:
    所述第一组件获取画布纹理对象;
    所述第一组件在所述视图控件对应的第三方插件中创建虚显控制器,并根据所述画布纹理对象中的画布、以及所述视图控件对应的布局信息创建对应的虚显控件;
    所述第一组件通过所述虚显控制器将所述视图控件对应的第三方插件与所述虚显控件进行合成,并创建对应的显示控件;
    所述第一组件通过所述显示控件将所述视图控件对应的第三方插件添加到所述显示控件中的视图对应的容器上,并将所述视图控件对应的纹理数据送至所述画布纹理对象中合成到画布上。
  4. 根据权利要求3所述的方法,其特征在于,所述第二引擎从GPU获取所述视图控件对应的纹理数据,并根据所述视图控件对应的布局信息对所述视图控件对应的纹理数据进行渲染,包括:
    所述第二引擎从所述画布纹理对象中获取画布,并从GPU获取所述视图控件对应的纹理数据;
    所述第二引擎根据所述视图控件对应的布局信息对所述视图控件对应的纹理数据进行渲染。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述第一引擎确定第一应用对应的视图控件的插件属性、以及所述视图控件对应的前端文档对象模型节点,包括:
    所述第一引擎对第一应用中的第一打包文件进行解析,得到所述第一应用对应的视图控件的插件属性、以及所述视图控件对应的第一对象及第一对象的样式和属性;
    所述第一引擎根据所述视图控件对应的第一对象及第一对象的样式和属性,生成所述视图控件对应的前端文档对象模型节点。
  6. 根据权利要求5所述的方法,其特征在于,所述第一应用的第一打包文件包括所述预配置的所述第一应用对应的视图控件的插件属性。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述第一组件根据所述视图控件对应的前端文档对象模型节点,确定所述视图控件对应的布局信息,包括:
    所述第一组件将所述视图控件对应的前端文档对象模型节点对接到后端组件节点;
    所述第一组件根据所述后端组件节点创建对应的元素节点,并在所述元素节点中创建对应的绘制节点;
    所述第一组件在所述绘制节点计算所述视图控件对应的布局信息。
  8. 根据权利要求7所述的方法,其特征在于,所述视图控件对应的布局信息包括所述视图控件对应的位置和尺寸。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,所述视图控件包括以下任意一种:网页视图控件、地图视图控件、相机视图控件、视频视图控件、文本视图控件、直播视图控件、广告视图控件、投屏视图控件。
  10. 根据权利要求1-9任一项所述的方法,其特征在于,所述终端设备的操作系统包括以下任意一种:安卓系统、鸿蒙系统、ios系统、windows系统、mac系统、EMUI系统。
  11. 根据权利要求1-10任一项所述的方法,其特征在于,所述第一引擎为JS引擎,所述第二引擎为渲染引擎。
  12. 一种终端设备,其特征在于,包括:第一引擎、第一组件、以及第二引擎;
    所述第一引擎用于确定第一应用对应的视图控件的插件属性、以及所述视图控件对应的前端文档对象模型节点;
    所述第一组件用于根据所述视图控件对应的前端文档对象模型节点,确定所述视图控件对应的布局信息;根据所述视图控件的插件属性,创建所述视图控件对应的第三方插件和纹理数据,并将所述视图控件对应的纹理数据注册到图形处理器GPU中;
    所述第一组件还用于向所述第二引擎发送所述视图控件对应的布局信息;根据所述视图控件对应的第三方插件,显示所述视图控件对应的纹理数据;
    所述第二引擎用于从GPU获取所述视图控件对应的纹理数据,并根据所述视图控件对应的布局信息对所述视图控件对应的纹理数据进行渲染。
  13. 一种电子设备，其特征在于，包括：处理器；存储器；以及计算机程序；其中，所述计算机程序存储在所述存储器上，当所述计算机程序被所述处理器执行时，使得所述电子设备实现如权利要求1-11任一项所述的方法。
  14. 一种计算机可读存储介质,所述计算机可读存储介质包括计算机程序,其特征在于,当所述计算机程序在电子设备上运行时,使得所述电子设备实现如权利要求1-11任一项所述的方法。
PCT/CN2022/085167 2021-05-31 2022-04-02 显示视图控件的方法及装置 WO2022252804A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110602158.6 2021-05-31
CN202110602158.6A CN115480762A (zh) 2021-05-31 2021-05-31 显示视图控件的方法及装置

Publications (1)

Publication Number Publication Date
WO2022252804A1 true WO2022252804A1 (zh) 2022-12-08

Family

ID=84322606

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/085167 WO2022252804A1 (zh) 2021-05-31 2022-04-02 显示视图控件的方法及装置

Country Status (2)

Country Link
CN (1) CN115480762A (zh)
WO (1) WO2022252804A1 (zh)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9508108B1 (en) * 2008-11-07 2016-11-29 Google Inc. Hardware-accelerated graphics for user interface elements in web applications
US9070211B1 (en) * 2012-10-18 2015-06-30 Google Inc. Webview tag for a sandboxed multiprocess browser
CN111966354A (zh) * 2020-08-17 2020-11-20 Oppo(重庆)智能科技有限公司 一种页面显示方法、装置及计算机可读存储介质

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116860215A (zh) * 2023-08-29 2023-10-10 中国兵器装备集团兵器装备研究所 一种基于开源鸿蒙系统的地图呈现方法、装置和存储介质
CN116860215B (zh) * 2023-08-29 2023-12-08 中国兵器装备集团兵器装备研究所 一种基于开源鸿蒙系统的地图呈现方法、装置和存储介质

Also Published As

Publication number Publication date
CN115480762A (zh) 2022-12-16

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22814851; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 22814851; Country of ref document: EP; Kind code of ref document: A1)