CN113590238A - Display control method, cloud service method, device, electronic equipment and storage medium

Info

Publication number
CN113590238A
Authority
CN
China
Prior art keywords
interface
script file
display interface
target scene
scene display
Prior art date
Legal status
Pending
Application number
CN202010368264.8A
Other languages
Chinese (zh)
Inventor
池亮
郭万永
郑剑杰
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN202010368264.8A
Publication of CN113590238A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention provide a display control method, a cloud service method, an apparatus, an electronic device, and a storage medium. The display control method includes: in response to a target scene trigger event, acquiring a drawing script file for a target scene display interface; and executing the drawing script file to dynamically display the target scene display interface through at least one interface element dynamic drawing component. Because different drawing script files can dynamically draw their corresponding target scene display interfaces, the display diversity of the scene display interface is improved.

Description

Display control method, cloud service method, device, electronic equipment and storage medium
Technical Field
The embodiments of the invention relate to the field of computer technology, and in particular to a display control method, a cloud service method, an apparatus, an electronic device, and a storage medium.
Background
The system of an embedded device is usually a customized system, and different types of embedded devices differ greatly in hardware characteristics such as CPU (central processing unit) computing power, storage space, and memory size. Furthermore, the GUI (graphical user interface) requirements of embedded devices vary with different service requirements.
In general, the graphical user interface of an embedded device cannot meet users' increasing display requirements.
Disclosure of Invention
Embodiments of the present invention provide a display control method, a cloud service method, an apparatus, an electronic device, and a storage medium to solve, or at least alleviate, the above problem.
According to a first aspect of the embodiments of the present invention, there is provided a display control method, including: in response to a target scene trigger event, acquiring a drawing script file for a target scene display interface; and executing the drawing script file to dynamically display the target scene display interface through at least one interface element dynamic drawing component.
According to a second aspect of the embodiments of the present invention, there is provided a display control method, including: in response to a target scene trigger event, acquiring a drawing script file for a target scene display interface; and executing the drawing script file to display the target scene display interface through a target interface rendering engine among multiple interface rendering engines.
According to a third aspect of the embodiments of the present invention, there is provided a display control method, including: in response to a target scene trigger event, determining an access path of drawing data of a target scene display interface; and acquiring the drawing data based on the access path so as to dynamically display the target scene display interface.
According to a fourth aspect of the embodiments of the present invention, there is provided a cloud service method, including: in response to a target scene service request, determining a drawing script file for a target scene display interface; and returning the drawing script file so that the target scene display interface is dynamically displayed by executing the drawing script file.
According to a fifth aspect of the embodiments of the present invention, there is provided a cloud service method, including: receiving target scene display effect data uploaded based on a scene display effect template; generating, based on the target scene display effect data, a drawing script file for dynamically displaying a target scene display interface; and storing the drawing script file.
According to a sixth aspect of the embodiments of the present invention, there is provided a display control apparatus, including: a determining module configured to determine, in response to a target scene trigger event, an access path of drawing data of a target scene display interface; and an acquisition module configured to acquire the drawing data based on the access path so as to dynamically display the target scene display interface.
According to a seventh aspect of the embodiments of the present invention, there is provided a cloud service apparatus, including: a determining module configured to determine, in response to a target scene service request, a drawing script file for a target scene display interface; and a return module configured to return the drawing script file so that the target scene display interface is dynamically displayed by executing the drawing script file.
According to an eighth aspect of the embodiments of the present invention, there is provided a cloud service apparatus, including: a receiving module configured to receive target scene display effect data uploaded based on a scene display effect template; a generation module configured to generate, based on the target scene display effect data, a drawing script file for dynamically displaying a target scene display interface; and a storage module configured to store the drawing script file.
According to a ninth aspect of the embodiments of the present invention, there is provided a display control apparatus, including: an acquisition module configured to acquire, in response to a target scene trigger event, a drawing script file for a target scene display interface; and an execution module configured to execute the drawing script file and dynamically display the target scene display interface through at least one interface element dynamic drawing component.
According to a tenth aspect of the embodiments of the present invention, there is provided a display control apparatus, including: an acquisition module configured to acquire, in response to a target scene trigger event, a drawing script file for a target scene display interface; and an execution module configured to execute the drawing script file and display the target scene display interface through a target interface rendering engine among multiple interface rendering engines.
According to an eleventh aspect of the embodiments of the present invention, there is provided an electronic device, including: one or more processors; and a computer-readable medium configured to store one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method according to the first or second aspect.
According to a twelfth aspect of the embodiments of the present invention, there is provided a computer-readable medium on which a computer program is stored which, when executed by a processor, implements the method according to the first or second aspect.
According to the solutions of the embodiments of the present invention, a drawing script file for a target scene display interface is acquired in response to a target scene trigger event, and the drawing script file is executed to dynamically display the target scene display interface through at least one interface element dynamic drawing component. Because different drawing script files can dynamically draw their corresponding target scene display interfaces, the display diversity of the scene display interface is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description cover only some of the embodiments of the present invention; a person skilled in the art can obtain other drawings based on these drawings.
FIG. 1 is a schematic diagram of a network architecture to which a display control method according to an embodiment of the present invention is applied;
FIG. 2A is a schematic flow chart of a display control method according to another embodiment of the invention;
FIG. 2B is a schematic block diagram of a display control method according to another embodiment of the present invention;
FIG. 3A is a schematic block diagram of a display control method according to another embodiment of the present invention;
FIG. 3B is a schematic block diagram of a display control method according to another embodiment of the invention;
FIG. 4A is a schematic flow chart of a display control method according to another embodiment of the invention;
FIG. 4B is a schematic flow chart of a display control method according to another embodiment of the invention;
FIG. 5 is a schematic block diagram of a display control apparatus according to another embodiment of the present invention;
FIG. 6A is a schematic block diagram of a display control apparatus according to another embodiment of the present invention;
FIG. 6B is a schematic block diagram of a display control apparatus according to another embodiment of the present invention;
FIG. 7A is a schematic flow chart diagram of a cloud service method according to another embodiment of the present invention;
FIG. 7B is a schematic flow chart diagram of a cloud service method according to another embodiment of the present invention;
FIG. 8A is a schematic block diagram of a cloud service apparatus according to another embodiment of the present invention;
FIG. 8B is a schematic block diagram of a cloud service apparatus according to another embodiment of the present invention;
FIG. 9 is a schematic block diagram of an electronic device of another embodiment of the present invention;
FIG. 10 is a hardware configuration of an electronic device according to another embodiment of the present invention.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions in the embodiments of the present invention, these solutions are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention shall fall within the protection scope of the embodiments of the present invention.
Specific implementations of the embodiments of the present invention are further described below with reference to the drawings.
FIG. 1 is a schematic diagram of a network architecture to which a display control method according to an embodiment of the present invention is applied. As shown, a user 20 controls an electronic device 10 through a human-machine interface 11. The electronic device 10 may be an embedded device, an internet of things device, or the like, and has a display module 13, a rendering module 12, and a network interface 14. The network interface 14 may send the user's 20 voice control information for the electronic device 10 to a server 60 for voice recognition. Voice information is merely an example; the user's 20 operation instruction for the electronic device 10 may take other forms, such as biometric instructions based on fingerprint recognition or face recognition. Likewise, the voice recognition function of the server 60 is merely an example; the server 60 may, for instance, further include a recognition server for recognizing biometric information. Correspondingly, the server 60 further includes a storage device 61, which may store, for example, voice recognition samples, biometric recognition samples, and the like. It should also be understood that operation instructions entered by the user 20 at the human-machine interface may also be handled through recognition performed locally by the electronic device 10. Such operation instructions may include, but are not limited to, gesture instructions, touch instructions, streaming-device input instructions, remote control instructions, and the like.
The server 60 may communicate with the electronic device 10 through a network 30, such as the internet, to transmit control messages or data. In addition, a server 50 is a drawing script file server that includes a drawing script file storage device 51, which stores drawing script files for various scenes. The script files stored in the storage device 51 may be updated on the server 50 side, updated through messages sent by the electronic device 10, or updated by a third-party device via the network 30; embodiments of the present invention are not limited in this respect. The internet is merely an example of the network 30; alternatively, the network 30 may be a mobile network or another heterogeneous network. In some embodiments, the server 50 and the server 60 may communicate with the electronic device end-to-end, bypassing the network 30, or via other networks.
The network interface 14 may be an integrated network interface that communicates with both the server 50 and the server 60. The network interface 14 may also be configured as separate interfaces, i.e., as two network interface modules, where a first network interface module communicates with the server 50 and a second network interface module communicates with the server 60. The display module 13 may include a rendering engine, and may also include a graphics engine or the like. The rendering module 12 may include a script engine, drawing components, and the like. It should be understood that the above configuration is merely an example; for instance, a display management module may be configured to implement part of the functions of at least one of the network interface 14, the display module 13, or the rendering module 12. Embodiments of the present invention are not limited in this respect. Various implementations of the embodiments of the present invention are illustrated and described below. It should be understood that the display control method of the embodiments of the present invention may be applied to the network architecture described above, as well as to other network architectures.
FIG. 2A is a schematic flow chart of a display control method according to another embodiment of the invention. The display control method of FIG. 2A includes:
210: In response to a target scene trigger event, acquire a drawing script file for a target scene display interface.
It should be understood that the display control method of the embodiments of the present invention can be applied to any electronic device, for example, an embedded device. Such electronic devices include, but are not limited to, network devices having a communication function, such as internet of things devices and internet devices, as well as stand-alone devices that do not access a network. For example, the display control method may be applied to internet of things devices such as smart devices, including, but not limited to, smart cameras, smart windows and doors, smart doorbells, smart speakers, smart security devices, smart media devices, smart display screens, and the like.
For example, the internet of things device or the embedded device may run an embedded operating system such as a Linux operating system, or a real-time operating system (RTOS) such as uC/OS, FreeRTOS, TI DSP/BIOS, or RT-Thread. Alternatively, the device may be a single-chip microcomputer. It should be understood that the display control method of the embodiments of the present invention may be applied to networks such as the internet of things and the internet, including, but not limited to, local area networks, metropolitan area networks, wide area networks, and the like. The display control method may also be applied to a local electronic device without a network; in other words, the method may be performed online or offline. The electronic device performing the display control method may include a display component such as a display module, a network component such as a network module, and the like. The display component may be any pixel-based or non-pixel-based LCD screen, LED screen, or low-cost dot matrix screen. For example, each pixel or dot in a dot matrix screen may support on and off modes, light and dark modes, or a plurality of different brightness levels.
It should also be understood that the target scenario may be a usage scenario or an application scenario. The target scene may also be any part of the usage scene, or any part of the application scene. Such as a voice recognition scenario, a device wake scenario, a music play scenario, etc. For example, any of the scenarios described above. For example, a scene triggered by a hit event of any speech information in speech recognition, a musical segment of a paragraph in music playing, etc.
In addition, the drawing script file may include a call command for at least one interface element dynamic drawing component. For example, the at least one interface element dynamic drawing component may include at least one of a point dynamic drawing component, a line dynamic drawing component, and a polygon dynamic drawing component, and there may be one or more components of each kind. The call command may call the at least one interface element dynamic drawing component either directly or indirectly. For example, the drawing script file may include a call command for an adaptation interface configured for a variety of interface rendering engines, thereby decoupling each interface element dynamic drawing component from the drawing script file; the interface rendering engine runs the at least one interface element dynamic drawing component. For example, the at least one interface element dynamic drawing component may comprise multiple types of interface element dynamic drawing components, each type comprising a plurality of components. It should be appreciated that the various interface element dynamic drawing components respectively target various interface elements, for example, basic interface elements such as points, lines, and faces, or graphical interface elements such as circles, rectangles, and polygons. For example, the electronic device includes multiple interface element rendering engines that respectively correspond to the multiple types of interface element dynamic drawing components, and the drawing script file includes call commands that call these interface element rendering engines. For example, each interface element rendering engine is configured to execute at least one dynamic drawing component belonging to the same type of interface element, and may be configured with a calling interface for that type, thereby decoupling the dynamic drawing components for interface elements of the same type from the drawing script file.
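By way of illustration, the following C sketch shows one way such an adaptation interface could look; every name in it is hypothetical, since the patent text does not publish an API. The drawing script's call commands reach the point, line, and polygon components only through a generic function table, so concrete components can be swapped without touching the script:

```c
#include <stddef.h>

typedef struct { int x, y; } pt_t;

/* One entry per interface element type; filled in by whichever
 * dynamic drawing components are present on the device. */
typedef struct {
    void (*draw_point)(pt_t p, int level);            /* level: brightness */
    void (*draw_line)(pt_t a, pt_t b, int level);
    void (*draw_polygon)(const pt_t *v, size_t n, int level);
} draw_iface_t;

static draw_iface_t g_draw;

/* Bind concrete components to the adaptation interface at startup. */
void draw_iface_bind(const draw_iface_t *impl) { g_draw = *impl; }

/* Invoked by the script runtime when the drawing script issues an
 * (indirect) call command such as "line 0 0 10 10 255". */
void script_cmd_line(int x0, int y0, int x1, int y1, int level) {
    pt_t a = { x0, y0 }, b = { x1, y1 };
    g_draw.draw_line(a, b, level);
}
```

Because the script only names the generic table, replacing a line component (or an entire rendering engine behind it) leaves existing drawing script files untouched.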
It should also be appreciated that acquiring a drawing script file for a target scene display interface in response to a target scene trigger event includes: in response to a trigger event for a first scene, acquiring a drawing script file of a display interface for a second scene, where the first scene and the second scene may be the same or different.
220: Execute the drawing script file, and dynamically display the target scene display interface through the at least one interface element dynamic drawing component.
For example, the at least one interface element dynamic drawing component dynamically displays the target scene display interface by controlling a brightness level. The brightness level may have two levels (e.g., on and off) or a plurality of different levels.
Because different drawing script files can dynamically draw their corresponding target scene display interfaces, the display diversity of the scene display interface is improved.
FIG. 2B is a schematic block diagram of a display control method according to another embodiment of the invention. As shown, the electronic device receives a drawing script file through a network interface, and a script engine runs the received file. Dynamic drawing component 1, dynamic drawing component 2, and dynamic drawing component 3 may be independent components; in other words, there is no call relation between them, which improves the operating efficiency of the display control program. Each dynamic drawing component may have a basic drawing function, thereby decoupling the dynamic drawing components from the script engine. It should be appreciated that, to achieve deep decoupling, the script engine may avoid directly invoking either the dynamic drawing components or the UI engine; alternatively, the script engine may invoke the UI engine to improve script rendering efficiency. It should also be understood that the script engine runs and parses the drawing script file. The drawing script file may directly invoke the UI engine and the dynamic drawing components, or it may call only the dynamic drawing components, with each dynamic drawing component configured with a call function of the UI engine.
In another implementation of the present invention, the drawing script file for the target scene display interface may be obtained from a local storage space, or received from another device, for example, imported into the electronic device over a wired connection. For example, the drawing script file is copied to a local storage medium through a communication connection with a secondary storage medium such as a flash memory or a hard disk, stored there, and then loaded from the local storage medium into a script engine running in memory. For example, the script engine runs directly on the operating system of the electronic device (e.g., an embedded system such as a real-time operating system), or directly in the underlying operating environment of the electronic device; that is, it is installed in the memory of the electronic device and adapted directly to the underlying hardware. For example, the script engine calls a driver abstraction interface of the display component, which is used for drawing and display. Alternatively, the script engine calls an interface rendering engine that in turn calls the driver abstraction interface for display. The script engine may be configured as a portable script engine.
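A minimal sketch of this local-storage path, assuming a hypothetical script_engine_eval() entry point that stands in for whatever engine the device embeds:

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical engine entry point; not a real library call. */
extern int script_engine_eval(const char *src, size_t len);

int run_local_draw_script(const char *path) {
    FILE *f = fopen(path, "rb");
    if (!f) return -1;                 /* caller may fall back to a static file */
    fseek(f, 0, SEEK_END);
    long n = ftell(f);
    rewind(f);
    char *buf = malloc((size_t)n + 1);
    if (!buf) { fclose(f); return -1; }
    if (fread(buf, 1, (size_t)n, f) != (size_t)n) {
        free(buf); fclose(f); return -1;
    }
    buf[n] = '\0';
    fclose(f);
    /* Load the drawing script into the in-memory engine and run it. */
    int rc = script_engine_eval(buf, (size_t)n);
    free(buf);
    return rc;
}
```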
It should also be understood that an interface rendering engine may be used to build the embedded graphical user interface. The interface rendering engine may be decoupled from the underlying operating environment, such as the operating system or hardware, or configured as a portable interface rendering engine. It may provide a variety of basic graphical elements available for invocation, including, but not limited to, buttons, charts, images, lists, sliders, switches, keyboards, and the like, and may implement a variety of basic graphical effects, including, but not limited to, animation, antialiasing, opacity, smooth scrolling, and the like.
It should also be understood that the interface rendering engine may have interfaces for communicating with a variety of I/O devices, including, but not limited to, a touchpad, mouse, keyboard, encoder, and the like. The interface rendering engine may be configured to display content in multiple languages, may include adaptation interfaces configured for a variety of processors, microcontrollers, and displays, and may support single frame buffering and multiple frame buffering.
It should also be understood that the drawing script file for the target scene display interface may be obtained from a server, for example, based on a resource location of the drawing script file. The resource location may be an internet-based uniform resource locator (URL), or another positioning mechanism, for example, a positioning identifier defined by at least one of the network layer, transport layer, session layer, or application layer of a network. For example, a prefix or suffix may be added to the network address field in the data packet so that the positioning mechanism remains compatible with the current network transmission protocol. For example, the resource location identifier may be encrypted with a symmetric or asymmetric key to secure the transmission of data and thus the access to the resource; a ciphertext may be generated using a hash function and a generated random number.
It should also be understood that the drawing script file may be obtained from the server either directly, by sending a request for the file itself, or indirectly, by first requesting the resource location of the drawing script file. For example, a resource location request is sent to a first server, which returns the resource location information of the target resource (e.g., the drawing script file) in a second server; a resource request is then sent to the second server, which returns the drawing script file. The first server and the second server may be the same server or different servers. Alternatively, a first resource request is sent to the first server, which stores the resource location information of the target resource in the second server; in response, the first server sends a second resource request to the second server based on that resource location information, receives the target resource from the second server, and passes it on.
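The two-step flow (locate at a first server, fetch from a second) might look like the following sketch; http_get() and the URL layout are assumptions for illustration only:

```c
#include <stdio.h>
#include <stddef.h>

typedef struct { char url[256]; } resource_loc_t;

/* Hypothetical transport helper; any HTTP/CoAP client would do. */
extern int http_get(const char *url, void *out, size_t cap);

int fetch_draw_script(const char *loc_server, const char *scene_id,
                      void *script_buf, size_t cap) {
    char req[320];
    resource_loc_t loc;

    /* Step 1: ask the first server where the target resource lives. */
    snprintf(req, sizeof req, "%s/locate?scene=%s", loc_server, scene_id);
    if (http_get(req, &loc, sizeof loc) != 0)
        return -1;

    /* Step 2: fetch the drawing script itself from the second server. */
    return http_get(loc.url, script_buf, cap);
}
```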
It should also be appreciated that if neither the drawing script file nor its resource location is obtained after sending a resource request in response to the target scene trigger event, a local static display file may be used instead, rendered by the interface rendering engine.
As an example, acquiring a drawing script file for a target scene display interface in response to a target scene trigger event includes: in response to the target scene trigger event, determining whether information indicating the resource location of the drawing script file at a resource server exists; and if such information exists, accessing that resource location to acquire the drawing script file from the resource server.
In another implementation of the present invention, if no information indicating the resource location of the drawing script file at the resource server exists, a resource request may be sent to the resource server, and the drawing script file returned by the resource server may be received directly.
As an example, acquiring a drawing script file for a target scene display interface in response to a target scene trigger event further includes: if no information indicating the resource location of the drawing script file at the resource server exists, sending a resource acquisition request to the resource server to obtain that information.
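Putting the example and its fallbacks together, a sketch of the decision logic could read as follows; every helper named here is hypothetical:

```c
#include <string.h>

/* Hypothetical stand-ins for device-specific code; the 0 return code
 * means success, and http_get is assumed to NUL-terminate text payloads. */
extern int cached_location_lookup(const char *scene_id, char *loc, size_t cap);
extern int request_location(const char *scene_id, char *loc, size_t cap);
extern int http_get(const char *url, void *out, size_t cap);
extern int script_engine_eval(const char *src, size_t len);
extern int render_static_file(const char *scene_id);

int show_target_scene(const char *scene_id) {
    char loc[256];
    char script[8192];

    /* Use a stored resource location when one exists... */
    if (cached_location_lookup(scene_id, loc, sizeof loc) != 0) {
        /* ...otherwise ask the resource server for one. */
        if (request_location(scene_id, loc, sizeof loc) != 0)
            return render_static_file(scene_id);  /* nothing obtained: static fallback */
    }
    if (http_get(loc, script, sizeof script) != 0)
        return render_static_file(scene_id);
    return script_engine_eval(script, strlen(script));
}
```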
In another implementation of the invention, the electronic device may include at least one preset display area (e.g., a sub-window). Different preset display areas may display different content in parallel, e.g., different scene display interfaces; in other words, content belonging to different scenes may be displayed side by side. Alternatively, different display areas may display different portions of the same scene display interface: the scene display interface may be divided into a plurality of portions, with a plurality of display areas respectively displaying those portions. For example, the scene display interface may be presented in scene display interface units, with each of a plurality of display areas displaying a subset of those units, such that the display areas together render the scene display interface.
For example, the different display areas may be displayed independently, with the plurality of display areas respectively corresponding to a plurality of scene display interface queues, each of which may include its own timer logic. For example, the plurality of display areas includes a first display area and a second display area: the first display area displays a first scene interface display queue in order, and the second display area displays a second scene interface display queue in order. The scene display interface queues may be controlled by a scene display interface queue control module. The first scene display queue displays a plurality of first scene display interfaces in the first scene interface display queue; the second scene display queue displays a plurality of second scene display interfaces in the second scene interface display queue. A first switching occasion exists between different first scene display interfaces, and a second switching occasion exists between different second scene display interfaces; the two may be the same or different. When the first switching occasion coincides with the second switching occasion, the first scene display interface and the second scene display interface that are switched to may be merged into the same scene display interface, or may be parts of the same scene display interface.
For example, multiple scenes may have respective priorities. The priorities may be set by the user, determined by the amount of information in the display interface, or determined by the complexity of the display interface. For example, scenes with more information have a higher priority and are therefore displayed in a larger display area; likewise, a scene interface with higher clarity requirements has a higher priority and a larger display area. For example, a news playing scene includes text information, while a music playing scene presents rhythm information; an ordinary user may assign the music playing scene a lower priority and the news playing scene a higher priority, whereas a professional musician may assign a higher priority to music playing scenes. When switching to a high-priority scene, the multiple display areas may be switched to simultaneously display that scene's display interface; optionally, two or more display areas are used for the scene display interface, presenting more detail to the user by exploiting the display space of the electronic device. Each display area may serve as a display unit, and each display unit may comprise a plurality of pixels or dots. Multiple display areas may display different scene display interfaces based on the same switching occasion, or based on different switching occasions, so that multiple scenes are presented on the same electronic device.
For example, the scene display interfaces in a scene display interface queue may be displayed in sequence in the same display area based on pre-arranged switching occasions; in other words, different switching occasions may be queue-managed. For example, while the current scene is being displayed, when a next scene trigger event is detected, the display can switch immediately to the next scene display interface; alternatively, a switching occasion may be set for the next scene, and the display switches from the current scene display interface to the next one when that occasion arrives. For example, while the current scene is being displayed, detection of a next scene trigger event may also trigger split-screen display: after split-screen processing, the current scene display interface continues in one split screen while the next scene display interface is displayed in another. Split-screen display processing includes, but is not limited to, even splits such as two-way and four-way splits, odd splits, irregular splits, and the like. For example, a display control module determines the priority of a trigger event in response to that event, and determines at least one of the number of display areas and the locations of the display areas based on that priority.
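A sketch of per-region queues with priorities and switching occasions, using invented structures rather than anything specified by the patent:

```c
#include <stdint.h>

#define QLEN 8

typedef struct {
    int      scene_id;
    int      priority;        /* higher priority -> larger display area */
    uint32_t switch_at_ms;    /* this entry's switching occasion */
} scene_entry_t;

typedef struct {
    scene_entry_t q[QLEN];    /* ring buffer of pending interfaces */
    int head, tail;
} region_queue_t;

/* Each display region owns a queue with its own timer logic. */
static region_queue_t g_region[2];    /* e.g. first and second display areas */

extern void start_drawing(int region, int scene_id);   /* hypothetical */

void region_tick(int region, uint32_t now_ms) {
    region_queue_t *rq = &g_region[region];
    if (rq->head == rq->tail)
        return;                                /* queue empty */
    if (now_ms >= rq->q[rq->head].switch_at_ms) {
        rq->head = (rq->head + 1) % QLEN;      /* switching occasion reached */
        if (rq->head != rq->tail)
            start_drawing(region, rq->q[rq->head].scene_id);
    }
}
```

Calling region_tick() for each region from a periodic timer keeps the two regions' switching occasions independent, which matches the per-queue timer logic described above.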
As an example, acquiring a drawing script file for a target scene display interface in response to a target scene trigger event includes: in response to a current scene trigger event, switching from a previous scene display interface to a current scene display interface, and acquiring a drawing script file for the current scene display interface, where dynamically displaying the target scene display interface includes dynamically displaying the current scene display interface.
For example, the process of switching from the previous scene display interface to the current scene display interface and the process of acquiring the drawing script file for the current scene display interface may be executed in parallel or in series.
As an example, switching from a previous scene display interface to a current scene display interface and acquiring a drawing script file for the current scene display interface in response to a current scene trigger event includes: in response to the current scene trigger event, adding the current scene display interface into a scene display interface queue, and acquiring a drawing script file for the current scene display interface; and when the drawing script file of the current scene display interface is acquired, updating the scene display interface queue so as to switch from the previous scene display interface to the current scene display interface.
For example, the process of adding the current scene display interface into the scene display interface queue and the process of acquiring the drawing script file for the current scene display interface may be executed in parallel or in series.
As another example, it includes: in response to the current scene trigger event, adding the current scene display interface into the scene display interface queue; switching from the previous scene display interface to the current scene display interface by updating the scene display interface queue, thereby triggering an acquisition request for the drawing script file of the current scene display interface; and acquiring the drawing script file in response to the acquisition request.
As an example, updating the scene display interface queue to switch from the previous scene display interface to the current scene display interface includes: acquiring the priority of the current scene display interface, and updating the scene display interface queue based on that priority. Where the previous scene display interface comprises a first display interface and a second display interface, corresponding to a first and a second display interface queue respectively, the method further includes: in response to the trigger operation of the current scene, switching the first display interface, or the second display interface, or both, to the current display interface.
In another implementation of the invention, the user may send any instruction to the electronic device, for example, a touch instruction, a gesture instruction, or a voice instruction. The electronic device may include a human-machine interface, and may include any of a light sensor, a sound sensor, a gas sensor, and a chemical sensor, as well as at least one of a position sensor, a proximity sensor, a pressure sensor, an optical sensor, a capacitive sensor, and an electromagnetic sensor. The user may thus input expressions, voice, gestures, or biometric image information such as fingerprints and faces to the electronic device. In the electronic device, an analog-to-digital converter may be connected to any of the sensors described above; alternatively, the sensor itself may provide analog-to-digital or digital-to-analog conversion. The converter is connected to a microprocessor (e.g., a DSP) or a microcontroller (MCU), which in turn is connected to a processor (e.g., a CPU such as an ARM or x86 processor), which is connected to a network module of the electronic device. The network module can communicate with a server for user information recognition, sending the information captured by the sensor, or information processed by the microprocessor, to the server and obtaining a recognition result from it. It will be appreciated that the DSP (microprocessor, or digital signal processor) may also be coupled directly to a network module (e.g., a network unit or network interface). The microprocessor may transmit the sensing information, or processed sensing information (e.g., after noise reduction or compression), to the network module via the processor, or directly; and it may receive the recognition result directly from the network module or via the processor.
It should also be understood that the server may configure different computing resources or interfaces for different types of recognition, for example, multiple servers configured to recognize different types of sensing information, with the network interface transmitting each kind of sensing information to the corresponding server. For example, the server stores a neural network recognition model and may update it based on sensing information sent from the network interface. In one example, the user inputs a voice instruction; the server derives recognized text from the voice instruction and returns it to the electronic device. The electronic device locally stores a mapping between text or keyword information and scenes: if the recognized text contains a target keyword, that keyword is treated as a scene trigger event. If the recognized text contains both a first keyword and a second keyword, both may be treated as target scene trigger events and added to two scene display interface queues accordingly. Alternatively, if the first keyword has a higher priority than the second, the first keyword may be treated as the target scene trigger event and the second as the next scene trigger event, or the second keyword may simply be ignored. In another example, the server performs keyword extraction on the recognized text and returns the extracted keywords, or information indicating them such as keyword identifiers, to the electronic device; the server may instead return scene information, e.g., a scene identifier, in which case it also stores the correspondence between keyword information and scene information. For example, the electronic device may send a feedback message to the server based on the user's usage (e.g., whether the user changed or repeated the entered instruction, and how many times). The server then updates the mapping or the neural network model based on the feedback: parameters associated with changed or repeated input instructions may be given a lower weight or priority, while parameters associated with unchanged, unrepeated input instructions are given a higher weight or priority. The server may likewise update the keyword-to-scene correspondence based on the feedback message, and the local electronic device may update its own keyword-to-scene mapping in the same way.
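The locally stored keyword-to-scene mapping could be as simple as the table lookup sketched below; the table contents are made-up examples:

```c
#include <string.h>

typedef struct {
    const char *keyword;
    int         scene_id;
    int         priority;   /* higher wins when several keywords match */
} kw_map_t;

static const kw_map_t g_kw_map[] = {
    { "play music", 1, 10 },
    { "news",       2, 20 },
    { "weather",    3,  5 },
};

/* Return the highest-priority scene whose keyword occurs in the
 * recognized text, or -1 when nothing matches (ignore/no trigger). */
int scene_for_text(const char *text) {
    int best = -1, best_prio = -1;
    for (size_t i = 0; i < sizeof g_kw_map / sizeof g_kw_map[0]; i++) {
        if (strstr(text, g_kw_map[i].keyword) &&
            g_kw_map[i].priority > best_prio) {
            best = g_kw_map[i].scene_id;
            best_prio = g_kw_map[i].priority;
        }
    }
    return best;
}
```

Lower-priority matches could instead be queued as "next scene" trigger events rather than discarded, matching the alternative described above.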
As an example, the method further comprises: in response to a target voice instruction, sending the voice segment in the target voice instruction to a voice recognition server so that the voice recognition server returns a target scene trigger event, where the target scene trigger event triggers the target scene indicated by the recognition result of the voice segment.
For example, the voice recognition server may be the same server as the resource server, or a different one. If they are the same, the voice recognition server recognizes the voice segment and returns to the electronic device either the drawing script file itself or the resource location information of the drawing script file on that server. If they are different servers, the voice recognition server, upon obtaining the recognition result, returns the resource location information to the electronic device, and the electronic device obtains the drawing script file by sending a drawing script file request to the resource server. Alternatively, the voice recognition server may send the resource location information directly to the resource server, which then forwards the drawing script file to the electronic device.
In another implementation of the present invention, the drawing script file may include a call command for at least one interface display graphical component. For example, the drawing script file may include a command by which at least one interface element dynamic drawing component calls at least one interface display graphical component, improving the rendering efficiency of the target scene display interface; or a command by which at least one interface display graphical component calls at least one interface element dynamic drawing component, improving the dynamic drawing efficiency of the target scene display interface.
It should also be appreciated that the at least one interface element dynamic drawing component may be configured with an interface element dynamic drawing component interface, and the drawing script file may include a call command for that interface. Likewise, at least one interface display graphical component may be configured with an interface display graphical component interface, and the drawing script file may include a call command for that interface.
For example, the drawing script file may include a command by which the at least one interface element dynamic drawing component calls the interface display graphical component interface, or a command by which the interface display graphical component interface calls the at least one interface element dynamic drawing component.
It should also be appreciated that the at least one interface element dynamic drawing component, or its interface, may be executed by a script engine, while the at least one interface display graphical component, or its interface, may be executed by an interface rendering engine. Both kinds of components may be referenced in a drawing script file and executed by their respective engines. In addition, calls to the interface display graphical components can be realized through the interface display graphical component interface, in which case the drawing script file need only include a call command for that interface.
For example, the various interface element dynamic drawing components respectively target various interface elements, e.g., basic interface elements such as points, lines, and faces, or graphical interface elements such as circles, rectangles, and polygons; graphical elements include, but are not limited to, buttons, charts, images, lists, sliders, switches, keyboards, and the like.
In other words, when the script file is executed, an interface element may be drawn dynamically based on a graphical element (e.g., the interface element is a portion of the graphical element): an image is determined, and dynamic drawing is performed based on points, lines, and the like within it. Alternatively, dynamic drawing may be performed directly based on at least one interface element.
As an example, executing the drawing script file to dynamically display the target scene display interface through at least one interface element dynamic drawing component includes: executing the drawing script file to call at least one interface display graphical component through the at least one interface element dynamic drawing component; and executing the at least one interface display graphical component to dynamically display the target scene display interface.
In another implementation of the present invention, after the current execution of the drawing script file is completed, it may be determined whether the next scene display interface is the same as the current one; if so, the drawing script file is executed again, i.e., the current scene display interface is implemented by looping over the same target drawing script file. For example, after the current execution completes, the next execution begins, or a static display interface is shown. For example, the current drawing script file is deleted when the next scene trigger event is detected, or when switching from the current scene display interface to the next. For example, the scene display interface queue and the drawing script file queue are managed in a unified manner: a first pointer variable and a second pointer variable are defined with the same direction of movement over a target task queue, where the queue entries on one side of the first pointer's position form the scene display interface queue and the queue entries on the other side of the second pointer's position form the drawing script file queue. Optionally, the two pointer variables move in opposite directions, with the entries on one side of the first pointer forming the scene display interface queue and the entries on the same side of the second pointer forming the drawing script file queue. For example, there are N queue-entry intervals between the first pointer variable and the second pointer variable; with an interval of one queue entry, the drawing script file of a scene display interface is deleted one update after the next scene display interface is shown. Because a single task queue manages both queues in a unified manner, the computational overhead of the electronic device is reduced.
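A sketch of this unified task queue with two pointer variables moving in the same direction, one queue-entry interval apart (the names and the ring-buffer layout are illustrative assumptions):

```c
#include <stdlib.h>

#define TASKS 16
#define GAP   1     /* N = 1: a script is deleted one interface update later */

typedef struct { int scene_id; char *script; } task_t;

static task_t tasks[TASKS];
static int display_ptr = 0;   /* first pointer: next interface to display */
static int delete_ptr  = 0;   /* second pointer: trails GAP entries behind */

extern void show_scene(int scene_id);   /* hypothetical display routine */

void advance_display(void) {
    show_scene(tasks[display_ptr].scene_id);
    display_ptr = (display_ptr + 1) % TASKS;   /* both pointers move the same way */

    /* Entries the second pointer has left behind form the deletion queue. */
    while ((display_ptr - delete_ptr + TASKS) % TASKS > GAP) {
        free(tasks[delete_ptr].script);        /* drop the spent drawing script */
        tasks[delete_ptr].script = NULL;
        delete_ptr = (delete_ptr + 1) % TASKS;
    }
}
```

One array and two indices serve both roles, which is the saving in bookkeeping overhead that the unified management is credited with.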
As an example, the method further comprises: deleting the drawing script file after the dynamic display of the target scene display interface is completed.
In another implementation of the present invention, the method may further comprise determining brightness information of the ambient light and adjusting the display brightness of the target scene display interface accordingly. For example, the electronic device is provided with at least one photosensitive device, or with a plurality of photosensitive devices at multiple positions. The brightness information may include brightness distribution information: based on the sensing values of the plurality of photosensitive devices, the brightness distribution of the ambient light is determined. For example, an array of photosensitive devices, one-dimensional or multi-dimensional, is arranged around the display component, each device sensing the brightness of the ambient light around it. The display component contains a plurality of display brightness control units corresponding to the photosensitive devices, e.g., in one-to-one correspondence; these units control the backlight of the display component, or the brightness of individual pixels or dots. Different devices in the array may observe different brightness along a target dimension; for example, the sensed brightness across the array may become gradually brighter, gradually darker, brighter then darker, or darker then brighter, following the brightness distribution of the ambient light.
For example, a first photosensitive member on one side of the electronic device detects a first sensing value, and a second photosensitive member on the other side detects a second sensing value. For example, the target scene display interface is dynamically displayed based on the first sensing value and the second sensing value: on the display member, a first area closer to the first photosensitive member is displayed at a brightness based on the first sensing value, and correspondingly, a second area closer to the second photosensitive member is displayed at a brightness based on the second sensing value. For example, if the first sensing value is larger than the second sensing value, the brightness of the first area is accordingly larger than that of the second area. With this display brightness control, the brightness of the target scene display interface is consistent with that of the ambient light, so that the target scene display interface blends into its surroundings.
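For illustration only, a TypeScript sketch of the region-wise brightness control described above, assuming a row of display regions between the two photosensitive members and a simple linear interpolation; the interpolation choice, the 0..1 brightness scale, and the function name are assumptions of this sketch.

    function regionBrightness(
      firstSensing: number,   // sensed ambient brightness near the first member, 0..1
      secondSensing: number,  // sensed ambient brightness near the second member, 0..1
      regionCount: number     // number of display regions between the two members
    ): number[] {
      const levels: number[] = [];
      for (let i = 0; i < regionCount; i++) {
        const t = regionCount === 1 ? 0 : i / (regionCount - 1);
        // Regions nearer the first member track the first sensing value;
        // regions nearer the second member track the second sensing value.
        levels.push(firstSensing * (1 - t) + secondSensing * t);
      }
      return levels;
    }

    // Brighter ambient light on the first side yields a brighter first area:
    console.log(regionBrightness(0.9, 0.3, 5)); // [0.9, 0.75, 0.6, 0.45, 0.3]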
It should also be understood that the above-described luminance sensing may be performed periodically or non-periodically. For example, different brightness sensing periods may be determined based on different environments. For example, different periods are set for different photosensitive devices.
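For illustration only, a TypeScript sketch of per-device sensing periods; the sensor descriptor shape and the callback are assumptions of this sketch.

    interface LightSensor {
      id: string;
      periodMs: number;    // each photosensitive device may have its own period
      read: () => number;  // returns the current ambient brightness
    }

    function startSensing(
      sensors: LightSensor[],
      onSample: (id: string, value: number) => void
    ): void {
      for (const s of sensors) {
        // Poll each photosensitive device on its own period.
        setInterval(() => onSample(s.id, s.read()), s.periodMs);
      }
    }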
As an example, the method further comprises: determining a mapping relation between the ambient light brightness and the display brightness of the target scene display interface; determining the current display brightness corresponding to the detected current environment light brightness based on the mapping relation, wherein the dynamic display of the target scene display interface comprises: and dynamically displaying the target scene display interface at the current display brightness.
For example, the mapping relationship may be stored in a local storage space, or obtained from the server. For example, mapping relationships for different scene topics can be stored at the server, with different mappings corresponding to different scene topics, e.g., party, work, study, and the like. The mapping relationship may also be updated: for example, it may be set by the user at a mobile phone terminal, updated locally by the user in the settings component of the electronic device, or updated from the server side.
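For illustration only, a TypeScript sketch of a per-topic mapping between ambient light brightness and display brightness, resolved by piecewise-linear interpolation; the table values, topic names, and interpolation choice are assumptions of this sketch.

    type BrightnessMapping = Array<{ ambient: number; display: number }>;

    // One mapping per scene topic; entries are sorted by ambient brightness.
    const mappings: Record<string, BrightnessMapping> = {
      party: [{ ambient: 0.0, display: 0.6 }, { ambient: 1.0, display: 1.0 }],
      work:  [{ ambient: 0.0, display: 0.3 }, { ambient: 1.0, display: 0.9 }],
    };

    // Resolve the current display brightness for the detected ambient brightness.
    function displayBrightness(topic: string, ambient: number): number {
      const m = mappings[topic] ?? mappings["work"];
      let lo = m[0];
      let hi = m[m.length - 1];
      for (const p of m) {
        if (p.ambient <= ambient && p.ambient >= lo.ambient) lo = p;
        if (p.ambient >= ambient && p.ambient <= hi.ambient) hi = p;
      }
      if (hi.ambient === lo.ambient) return lo.display;
      const t = (ambient - lo.ambient) / (hi.ambient - lo.ambient);
      return lo.display + t * (hi.display - lo.display);
    }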
In another implementation manner of the present invention, there may be one or more interface rendering engines and one or more script engines, which is not limited by the embodiment of the present invention. For example, a first adaptation framework may be configured for a variety of script engines, and a second adaptation framework may be configured for a variety of interface rendering engines; at least one of the first adaptation framework and the second adaptation framework may be configured. Thus, the commands in the drawing script file need not depend on the specific kind of interface rendering engine or script engine. In other words, because the first adaptation framework adapts to various script engines and the second adaptation framework adapts to various interface rendering engines, compatibility with different script engines or different interface rendering engines can be achieved without reconfiguring the drawing script file. For example, existing drawing script file code or drawing script library functions may be kept on the server side while the script engine or interface rendering engine is replaced on the electronic device side. For example, the script engines before or after replacement include, but are not limited to, a JS engine, a QuickJS engine, a V8 engine, a JerryScript engine, and the like. The interface rendering engines before or after replacement include, but are not limited to, a UI engine, a UI rendering engine, a LittlevGL engine, and the like.
As an example, executing a draw script file to dynamically display a target scene display interface via at least one interface element dynamic drawing component includes: calling a target script engine in the script engines via an adaptation framework configured for the script engines; and interpreting and executing the drawing script file through the target script engine, and calling the interface rendering engine to dynamically display the target scene display interface, wherein the interface rendering engine is used for executing at least one interface element dynamic drawing component.
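For illustration only, a TypeScript sketch of the adaptation framework over multiple script engines; the adapter interface is an assumption of this sketch, and real QuickJS or JerryScript embeddings expose different native APIs.

    interface ScriptEngine {
      name: string;
      evaluate(script: string): void;  // interpret and execute a drawing script file
    }

    class QuickJsAdapter implements ScriptEngine {
      name = "quickjs";
      evaluate(script: string): void {
        // Delegate to the embedded QuickJS runtime (binding not shown).
      }
    }

    class JerryScriptAdapter implements ScriptEngine {
      name = "jerryscript";
      evaluate(script: string): void {
        // Delegate to the embedded JerryScript runtime (binding not shown).
      }
    }

    // The adaptation framework selects the target script engine at run time;
    // the same drawing script file is passed in unchanged either way.
    function runDrawScript(engines: ScriptEngine[], target: string, script: string): void {
      const engine = engines.find(e => e.name === target);
      if (!engine) throw new Error(`no script engine named ${target}`);
      engine.evaluate(script);
    }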
In another implementation of the present invention, the initialization of the interface rendering engine, the initialization of the script engine, the driving of the photosensitive member, the driving of the display member, and the like described above may be performed individually. Unified initialization may also be performed by the operating system or the underlying runtime environment. For example, a power-on self-test after power-on may be performed to initialize hardware and start the operating system.
As an example, the method further comprises: and responding to the starting triggering event, and executing power-on self-test to start the target script engine and the interface rendering engine.
In another implementation manner of the present invention, the interface rendering engine described above can also be used for static display. For example, a display file such as a power-on display screen may be configured in advance in a local storage medium (e.g., ROM). The display file may also be implemented as a script file, and stored in a local storage medium, or the script file may be acquired from a server. For example, the power-on display file may also be a dynamic file stored in a local storage medium. For example, an interface rendering engine may be used to directly perform rendering on the boot display screen, or the interface element drawing component may be called to perform rendering.
As an example, the method further comprises: responding to a startup trigger event, and determining a startup display screen script file; and drawing the startup display screen file through the interface rendering engine.
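For illustration only, a TypeScript sketch of resolving the startup display screen script, preferring the copy pre-configured in local storage and falling back to the server; the helper functions, the ROM path, and the URL are hypothetical names introduced for this sketch.

    async function loadBootScreenScript(
      readLocalFile: (path: string) => Promise<string | null>,  // hypothetical helper
      fetchFromServer: (url: string) => Promise<string>         // hypothetical helper
    ): Promise<string> {
      // Prefer the startup display screen script pre-configured in local ROM.
      const local = await readLocalFile("/rom/boot_screen.js");
      if (local !== null) return local;
      // Otherwise acquire the script file from the server.
      return fetchFromServer("https://example.com/boot_screen.js");
    }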
Fig. 3A is a schematic block diagram of a display control method according to another embodiment of the invention. As shown, the rendering engine adaptation layer may be an adaptation interface configured for a variety of script engines. In addition, the rendering engine adaptation layer may also be an adaptation interface configured for a variety of UI (user interface) engines. In addition, the extension module may include components configured for dynamic drawing, so that drawing script files can call them. Additionally, the UI engine may be used for static display. For example, the rendering engine adaptation layer may transmit a drawing script file to the script engine, and may also transmit a display file to the UI engine. In addition, the UI engine may also be invoked by methods or functions in the extension module (e.g., the dynamic drawing components).
Fig. 3B is a schematic block diagram of a display control method according to another embodiment of the invention. As shown, the rendering engine adaptation layer may be connected with the display management layer. The display management layer may include a network interface for receiving the drawing script file and a window management function. For example, there may be a one-to-one, one-to-many, many-to-one, or many-to-many correspondence between drawing script files and windows. For example, window management includes management of the scene display interface queue and/or management of the interface display area. For example, the extension component may include interface element components and may also include graphic components. It should also be understood that, although not shown, in this example the display management layer may also include other functionality. In other examples, the network interface and window management functions described above may also be configured separately, i.e., outside the display management layer.
Fig. 4A is a schematic flowchart of a display control method according to another embodiment of the invention. The display control method of fig. 4A may be applied to any suitable electronic device having data processing capabilities, including but not limited to: Internet of things devices, embedded devices, intelligent devices, servers, mobile terminals (such as mobile phones, PADs and the like), PCs and the like. The intelligent devices include but are not limited to intelligent transportation devices, intelligent household devices, public safety devices and the like. The intelligent household devices include but are not limited to intelligent air conditioners, intelligent bulbs, intelligent desks and chairs, intelligent televisions, intelligent sound boxes, intelligent instruments, intelligent cameras, intelligent window sensors, intelligent doorbells, intelligent detectors, other intelligent safety devices and the like. The embodiment of the present invention is not limited thereto.
The method comprises the following steps:
410: responding to a target scene trigger event, and acquiring a drawing script file for a target scene display interface;
420: and executing the drawing script file, and displaying a target scene display interface through a target interface rendering engine in the multiple interface rendering engines.
Because the drawing script file comprises call commands of the adaptation interface configured for various interface rendering engines, display diversity of the drawing script file is achieved on the one hand, and decoupling between the drawing script file and the interface rendering engine is achieved on the other hand.
In another implementation manner of the present invention, executing the drawing script file and displaying the target scene display interface via a target interface rendering engine among multiple interface rendering engines includes: executing the drawing script file, and calling a target interface rendering engine among the multiple interface rendering engines; and executing at least one interface element dynamic drawing component through the target interface rendering engine so as to dynamically display the target scene display interface. In addition, the drawing script file can comprise call commands of the adaptation interface configured for various interface rendering engines.
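For illustration only, a TypeScript sketch of what a drawing script file might contain: call commands against the adaptation interface rather than against any concrete engine, so the same script runs whether the second adaptation framework routes the calls to LittlevGL or to another interface rendering engine. The ui object and its method names are assumptions of this sketch.

    // The host (the adaptation interface) is assumed to inject `ui`.
    declare const ui: {
      drawLine(x1: number, y1: number, x2: number, y2: number): void;
      drawPolygon(points: Array<[number, number]>): void;
    };

    // The drawing script file itself: engine-independent call commands.
    ui.drawLine(0, 0, 120, 40);
    ui.drawPolygon([[10, 10], [60, 10], [35, 50]]);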
Fig. 4B is a schematic flowchart of a display control method according to another embodiment of the invention. The display control method of fig. 4B may be applied to any suitable electronic device having data processing capabilities, including but not limited to: Internet of things devices, embedded devices, intelligent devices, servers, mobile terminals (such as mobile phones, PADs and the like), PCs and the like. The intelligent devices include but are not limited to intelligent transportation devices, intelligent household devices, public safety devices and the like. The intelligent household devices include but are not limited to intelligent air conditioners, intelligent bulbs, intelligent desks and chairs, intelligent televisions, intelligent sound boxes, intelligent instruments, intelligent cameras, intelligent window sensors, intelligent doorbells, intelligent detectors, other intelligent safety devices and the like. The embodiment of the present invention is not limited thereto. The method comprises the following steps:
430: responding to a target scene trigger event, and determining an access path of drawing data of a target scene display interface;
440: and acquiring drawing data based on the access path so as to dynamically display the target scene display interface.
In one example, the access path may indicate access to the resource server to obtain rendering data. The resource server may be a cloud server such as a public cloud, a private cloud, a proprietary cloud, or a hybrid cloud. In another example, the access path may indicate access to a local memory space to retrieve rendering data.
The access path of the drawing data of the target scene display interface is determined in response to the target scene trigger event, so that flexible access and acquisition of the drawing data are realized. Furthermore, dynamic display of the target scene display interface is facilitated.
In a time display scene, a weather display scene, an environment temperature display scene, an alarm clock display scene and a music rhythm display scene, local storage space can be accessed to obtain drawing data; the resource server can also be accessed to obtain drawing data.
In addition, in response to a target scene trigger event, determining an access path to rendering data of a target scene display interface may include: determining the access path of the drawing data of the target scene display interface according to the type of the target scene trigger event. For example, the scenes may include a first scene and a second scene: for a trigger event of the first scene, a local storage space may be accessed to obtain the drawing data, and for a trigger event of the second scene, the resource server may be accessed to obtain the drawing data. For example, the size of the drawing file required for the second scene is larger than the size of the drawing file required for the first scene. In addition, part of the drawing data may be acquired from the local storage space while the resource server is accessed for the remaining drawing data. For example, a weather display scene or an environmental temperature display scene, which has a high requirement for real-time information, may access the resource server to obtain the partial drawing data indicating the real-time information; meanwhile, because such a scene has a specific pattern for the dynamically drawn interface elements, the data indicating the interface elements can be acquired locally. In addition, the data indicating the local interface elements may be updated from the cloud server, updated based on an external device, or embedded in the interface drawing module.
In addition, the drawing files of a music rhythm display scene and the like are large, and the resource server may be accessed to obtain the drawing data. In one example, for music rhythm display scenes and the like, drawing data may also be retrieved from the local storage space. In another example, the resource server may be accessed preferentially, and when the drawing data cannot be acquired within a preset time due to network delay or the like, the drawing data may be acquired from the local storage space instead. In another example, drawing data may be acquired preferentially from the local storage space, and in case the local storage space does not store the drawing data, the resource server may be accessed to acquire it.
In addition, the drawing file of an alarm clock display scene and the like is small, and the drawing data may be acquired from the local storage space. In another example, the resource server may be accessed preferentially, and when the drawing data cannot be acquired within a preset time due to network delay or the like, the drawing data may be acquired from the local storage space instead. In another example, drawing data may be acquired preferentially from the local storage space, and in case the local storage space does not store the drawing data, the resource server may be accessed to acquire it.
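For illustration only, a TypeScript sketch of the "server first, local fallback on timeout" policy described above; the two data sources are injected, and the 2000 ms budget is an assumption of this sketch.

    async function getDrawingData(
      fetchFromServer: () => Promise<string>,  // e.g. large music-rhythm drawing files
      readLocalStore: () => Promise<string>,
      timeoutMs: number = 2000
    ): Promise<string> {
      const timeout = new Promise<never>((_, reject) =>
        setTimeout(() => reject(new Error("server fetch timed out")), timeoutMs)
      );
      try {
        // Prefer the resource server.
        return await Promise.race([fetchFromServer(), timeout]);
      } catch {
        // Network delay or failure within the preset time: use local storage space.
        return readLocalStore();
      }
    }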
In addition, when the resource server is accessed to obtain drawing data, the drawing data obtained each time may be the same or different for the same scene trigger event. For example, in a musical rhythm display scene or the like, different drawing data may be displayed for a plurality of scene trigger events.
Further, triggers of the same scene at different times may correspond to different trigger events; triggers of the same scene at different locations may correspond to different trigger events; and different trigger conditions for the same scene may correspond to different trigger events. For example, a first voice trigger and a second voice trigger for a music scene may correspond to different trigger events even when they indicate the same content, e.g., playing the target track: the first and second voice triggers may have different durations, indicating, for example, different user emotional information. Different user emotional information may correspond to different trigger events, and different trigger events may correspond to different drawing files.
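For illustration only, a TypeScript sketch that derives distinct trigger events for the same scene from trigger time, location, and inferred user emotion; the keying scheme and field names are assumptions of this sketch.

    interface Trigger {
      scene: string;      // e.g. "music"
      hour: number;       // local hour at which the trigger occurred
      location: string;
      emotion?: string;   // e.g. inferred from the duration of a voice trigger
    }

    // Different times, locations, or emotions for the same scene yield different
    // trigger event keys, which may in turn select different drawing files.
    function triggerEventKey(t: Trigger): string {
      const daypart = t.hour < 18 ? "day" : "night";
      return `${t.scene}:${daypart}:${t.location}:${t.emotion ?? "neutral"}`;
    }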
In addition, in response to a target scene trigger event, determining an access path to rendering data of a target scene display interface may include: determining the access path of the drawing data of the target scene display interface according to the timing of the target scene trigger event.
In addition, in response to a target scene trigger event, determining an access path to rendering data of a target scene display interface may include: and determining an access path of the drawing data of the target scene display interface according to the recognition result of the target scene trigger event.
In another implementation manner of the present invention, the obtaining the drawing data based on the access path to dynamically display the target scene display interface includes: based on the access path, acquiring a drawing script file including the drawing data from the resource server; and executing the drawing script file through a script engine, and dynamically displaying the target scene display interface.
In another implementation manner of the present invention, the obtaining the drawing data based on the access path to dynamically display the target scene display interface includes: obtaining the drawing data from a local storage space; and dynamically displaying the target scene display interface based on the drawing data through a user interface engine.
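For illustration only, a TypeScript sketch of dispatching on the access path, per the two implementations above: a server path yields a drawing script file run through the script engine, while a local path yields drawing data rendered through the user interface engine. All injected functions are assumptions of this sketch.

    type AccessPath =
      | { kind: "server"; url: string }
      | { kind: "local"; path: string };

    async function displayByAccessPath(
      access: AccessPath,
      deps: {
        fetchScript: (url: string) => Promise<string>;
        runScript: (script: string) => void;       // the script engine
        readLocal: (path: string) => Promise<Uint8Array>;
        renderData: (data: Uint8Array) => void;    // the user interface engine
      }
    ): Promise<void> {
      if (access.kind === "server") {
        deps.runScript(await deps.fetchScript(access.url));
      } else {
        deps.renderData(await deps.readLocal(access.path));
      }
    }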
Fig. 5 is a schematic block diagram of a display control apparatus according to another embodiment of the present invention. The display control apparatus of fig. 5 may be any suitable electronic device having data processing capabilities, including but not limited to: Internet of things devices, embedded devices, intelligent devices, servers, mobile terminals (such as mobile phones, PADs and the like), PCs and the like. The intelligent devices include but are not limited to intelligent transportation devices, intelligent household devices, public safety devices and the like. The intelligent household devices include but are not limited to intelligent air conditioners, intelligent bulbs, intelligent desks and chairs, intelligent televisions, intelligent sound boxes, intelligent instruments, intelligent cameras, intelligent window sensors, intelligent doorbells, intelligent detectors, other intelligent safety devices and the like. The embodiment of the present invention is not limited thereto. The device includes:
the obtaining module 510 is used for responding to a target scene trigger event, and obtaining a drawing script file for a target scene display interface;
And the execution module 520 executes the drawing script file and dynamically displays the target scene display interface through the at least one interface element dynamic drawing component.
Due to the fact that different drawing script files can achieve dynamic drawing of the corresponding target scene display interface, display diversity of the scene display interface is improved.
In another implementation manner of the present invention, the obtaining module is specifically configured to: responding to a target scene trigger event, and judging whether information indicating the resource position of the drawing script file at the resource server exists; and if the information indicating the resource position of the drawing script file at the resource server exists, accessing the resource position of the drawing script file at the resource server to acquire the drawing script file from the resource server.
In another implementation manner of the present invention, the obtaining module is further configured to: and if the information indicating the resource position of the drawing script file at the resource server does not exist, sending a resource acquisition request to the resource server so as to acquire the information.
In another implementation manner of the present invention, the obtaining module is specifically configured to: responding to a current scene trigger event, switching from a previous scene display interface to a current scene display interface, and acquiring a drawing script file for the current scene display interface, wherein the execution module is specifically configured to: and dynamically displaying the current scene display interface.
In another implementation manner of the present invention, the obtaining module is specifically configured to: responding to a current scene trigger event, adding a current scene display interface into a scene display interface queue, and acquiring a drawing script file for the current scene display interface; and when the drawing script file of the current scene display interface is acquired, updating the scene display interface queue so as to switch from the last scene display interface to the current scene display interface.
In another implementation manner of the present invention, the obtaining module is specifically configured to: responding to a current scene trigger event, and adding a current scene display interface into a scene display interface queue; switching from the last scene display interface to the current scene display interface by updating the scene display interface queue so as to trigger an acquisition request of a drawing script file of the current scene display interface; and responding to the acquisition request to acquire the drawing script file.
In another implementation manner of the present invention, the apparatus further includes a transceiver module: and responding to the target voice instruction, and sending the voice fragment in the target voice instruction to the voice recognition server so that the voice recognition server returns a target scene trigger event, wherein the target scene trigger event is used for triggering a target scene indicated by the recognition result of the voice fragment.
In another implementation manner of the present invention, the execution module is specifically configured to: executing the drawing script file to call at least one interface display graphic component through at least one interface element dynamic drawing component; and executing at least one interface display graphic component to dynamically display the target scene display interface.
In another implementation manner of the present invention, the execution module is further configured to: and deleting the drawing script file after finishing the dynamic display of the target scene display interface.
In another implementation of the present invention, the apparatus further includes a determining module: determining a mapping relation between the ambient light brightness and the display brightness of the target scene display interface; determining the current display brightness corresponding to the detected current ambient light brightness based on the mapping relationship, wherein the execution module is specifically configured to: and dynamically displaying the target scene display interface at the current display brightness.
In another implementation manner of the present invention, the execution module is specifically configured to: calling a target script engine in the script engines via an adaptation framework configured for the script engines; and interpreting and executing the drawing script file through the target script engine, and calling the interface rendering engine to dynamically display the target scene display interface, wherein the interface rendering engine is used for executing at least one interface element dynamic drawing component.
In another implementation manner of the present invention, the apparatus further includes a starting module: and responding to the starting triggering event, and executing power-on self-test to start the target script engine and the interface rendering engine.
In another implementation manner of the present invention, the obtaining module is further configured to: responding to a startup trigger event, and determining a startup display picture script file; the execution module is further to: and drawing the display picture file of the opening machine through the interface rendering engine.
In another implementation of the invention, the at least one interface element dynamic drawing component comprises at least one of a point dynamic drawing component, a line dynamic drawing component, and a polygon dynamic drawing component.
The method of the present embodiment may be performed by any suitable electronic device having data processing capabilities, including but not limited to: server, mobile terminal (such as mobile phone, PAD, etc.), PC, etc.
The apparatus of this embodiment is used to implement the corresponding method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment can refer to the description of the corresponding part in the foregoing method embodiment, and is not described herein again.
Fig. 6A is a schematic block diagram of a display control apparatus according to another embodiment of the present invention. The display control apparatus of fig. 6A may be any suitable electronic device having data processing capabilities, including but not limited to: Internet of things devices, embedded devices, intelligent devices, servers, mobile terminals (such as mobile phones, PADs and the like), PCs and the like. The intelligent devices include but are not limited to intelligent transportation devices, intelligent household devices, public safety devices and the like. The intelligent household devices include but are not limited to intelligent air conditioners, intelligent bulbs, intelligent desks and chairs, intelligent televisions, intelligent sound boxes, intelligent instruments, intelligent cameras, intelligent window sensors, intelligent doorbells, intelligent detectors, other intelligent safety devices and the like. The embodiment of the present invention is not limited thereto. The device includes:
the obtaining module 610 is used for responding to a target scene trigger event and obtaining a drawing script file for a target scene display interface;
and the execution module 620 executes the drawing script file, and displays a target scene display interface through a target interface rendering engine in the multiple interface rendering engines.
It should be understood that the draw script file may include call commands for an adaptation interface configured for a variety of interface rendering engines. Because the drawing script file comprises the calling command of the adaptive interface configured for various interface rendering engines, on one hand, the display diversity of the drawing script file is realized; on the other hand, the decoupling between the drawing script file and the interface rendering engine is realized.
In another implementation manner of the present invention, the execution module is specifically configured to: executing the drawing script file, and calling a target interface rendering engine in the multiple interface rendering engines; and executing at least one interface element dynamic drawing component through the target interface rendering engine so as to dynamically display the target scene display interface.
The apparatus of this embodiment is used to implement the corresponding method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment can refer to the description of the corresponding part in the foregoing method embodiment, and is not described herein again.
Fig. 6B is a schematic block diagram of a display control apparatus according to another embodiment of the present invention. The display control apparatus of fig. 6B may be any suitable electronic device having data processing capabilities, including but not limited to: Internet of things devices, embedded devices, intelligent devices, servers, mobile terminals (such as mobile phones, PADs and the like), PCs and the like. The intelligent devices include but are not limited to intelligent transportation devices, intelligent household devices, public safety devices and the like. The intelligent household devices include but are not limited to intelligent air conditioners, intelligent bulbs, intelligent desks and chairs, intelligent televisions, intelligent sound boxes, intelligent instruments, intelligent cameras, intelligent window sensors, intelligent doorbells, intelligent detectors, other intelligent safety devices and the like. The embodiment of the present invention is not limited thereto. The device includes:
the determining module 630, in response to the target scene trigger event, determines an access path to the drawing data of the target scene display interface;
the obtaining module 640 obtains the drawing data based on the access path to dynamically display the target scene display interface.
In another implementation manner of the present invention, the obtaining module is specifically configured to: based on the access path, acquiring a drawing script file including the drawing data from the resource server; and executing the drawing script file through a script engine, and dynamically displaying the target scene display interface.
In another implementation manner of the present invention, the obtaining module is specifically configured to: obtaining the drawing data from a local storage space; and dynamically displaying the target scene display interface based on the drawing data through a user interface engine.
The apparatus of this embodiment is used to implement the corresponding method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment can refer to the description of the corresponding part in the foregoing method embodiment, and is not described herein again.
Fig. 7A is a schematic flowchart of a cloud service method according to another embodiment of the present invention. The cloud service method of fig. 7A may be applied to a cloud server such as a public cloud, a private cloud, or a hybrid cloud, and may also be applied to other terminal devices. The method comprises the following steps:
710: and responding to the target scene service request, and determining a drawing script file for the target scene display interface.
720: and returning the drawing script file so as to dynamically display the target scene display interface by executing the drawing script file.
In the scheme of the embodiment of the invention, the drawing script file for the target scene display interface can be determined in response to the target scene service request, and the drawing script file is used for dynamically displaying the target scene display interface. Therefore, flexibility of obtaining the drawing script file is achieved, and storage space of the client is saved.
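For illustration only, a TypeScript sketch of the cloud-side handling of steps 710 and 720, assuming a Node.js runtime; the URL scheme, port, and stored script contents are assumptions of this sketch.

    import * as http from "http";

    // Drawing script files keyed by target scene.
    const scriptStore = new Map<string, string>([
      ["weather/rainy", "ui.drawLine(0, 0, 10, 40); /* ... */"],
    ]);

    http.createServer((req, res) => {
      // 710: determine the drawing script file for the target scene named in
      // the service request, e.g. GET /script?scene=weather/rainy
      const scene = new URL(req.url ?? "/", "http://localhost").searchParams.get("scene");
      const script = scene ? scriptStore.get(scene) : undefined;
      if (!script) {
        res.writeHead(404).end();
        return;
      }
      // 720: return the drawing script file; the client executes it to
      // dynamically display the target scene display interface.
      res.writeHead(200, { "Content-Type": "application/javascript" }).end(script);
    }).listen(8080);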
The embodiment of the invention can be suitable for time display scenes, weather display scenes, environment temperature display scenes, alarm clock display scenes and music rhythm display scenes.
In a music rhythm display scenario, the cloud server may store at least one rhythm drawing script file for a target track. The user can browse, at the cloud server, at least one drawing script file effect presentation for the target track, select the drawing script file effect of interest, and locally associate the corresponding drawing script file with the target track. Alternatively, the user selects the drawing script file effect so that it is downloaded from the server side when the corresponding rhythm scene is triggered.
In another music rhythm display scenario, the cloud server may store at least one rhythm drawing script file for a music type. The user can browse, at the cloud server, at least one drawing script file effect presentation for the music type, and select the drawing script file effect of interest as the drawing effect for that music type. Alternatively, the user selects the drawing script file effect so that it is downloaded from the server side when the corresponding rhythm scene is triggered.
In a weather display scenario, the cloud server may store at least one drawing script file for the display effect of a weather type. The user can browse, at the cloud server, at least one drawing script file effect presentation for the weather type, and select the weather type presentation of interest. Alternatively, the user selects the drawing script file effect so that it is downloaded from the server side when the corresponding weather scene is triggered.
It should also be understood that the descriptions herein of features and steps of other embodiments (e.g., embodiments corresponding to any of the other figures) apply to aspects of this embodiment.
Fig. 7B is a schematic flowchart of a cloud service method according to another embodiment of the present invention. The cloud service method of fig. 7B may be applied to a cloud server such as a public cloud, a private cloud, or a hybrid cloud, and may also be applied to other terminal devices, and the method includes:
730: and receiving target scene display effect data uploaded based on the scene display effect template.
740: and generating a drawing script file for dynamically displaying the target scene display interface based on the target scene display effect data.
750: and storing the drawing script file.
In the embodiment of the invention, the display effect template makes it convenient for the user to edit and create display effects, and storing the generated drawing script file at the server side (acting as the service-side device) allows the user to download it at the client side in advance, or lets the client download it when a scene is triggered.
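For illustration only, a TypeScript sketch of steps 730 to 750: effect data filled in from a scene display effect template is turned into a drawing script file and stored. The template fields and the emitted call commands (including ui.setColor) are assumptions of this sketch.

    interface EffectData {
      scene: string;                    // e.g. "music-rhythm"
      color: string;                    // e.g. "#00ff88"
      points: Array<[number, number]>;  // polyline chosen in the template editor
    }

    const scriptStore = new Map<string, string>();

    function generateAndStore(effect: EffectData): string {
      // 740: generate a drawing script file from the display effect data.
      const lines: string[] = [`ui.setColor("${effect.color}");`];
      for (let i = 1; i < effect.points.length; i++) {
        const [x1, y1] = effect.points[i - 1];
        const [x2, y2] = effect.points[i];
        lines.push(`ui.drawLine(${x1}, ${y1}, ${x2}, ${y2});`);
      }
      const script = lines.join("\n");
      // 750: store the drawing script file under its scene key.
      scriptStore.set(effect.scene, script);
      return script;
    }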
It is to be understood that in the first example, a scene display effect template may be configured at the first client, and the user may select target scene display effect data based on the scene display effect template. And when the trigger event of the target scene is triggered at the first client, sending a drawing script service request to the server. The server may return the drawing script file to the first client, so as to trigger the target scene display effect at the first client. Thus, a customized drawing script file is realized in a simple manner.
In another example, a scene display effect template may be configured at the first client, and a user may develop target scene display effect data based on the scene display effect template and upload it to the server. When the trigger event of the target scene is triggered at the second client, a drawing script service request is sent to the server, and the server may return the drawing script file to the second client so as to trigger the target scene display effect at the second client. Therefore, flexible editing of the drawing script file by a target user is achieved through editing at one client, while another user can download the stored drawing script file from the server, thereby realizing a production-and-consumption ecosystem for drawing script files.
The embodiment of the invention can be suitable for time display scenes, weather display scenes, environment temperature display scenes, alarm clock display scenes and music rhythm display scenes.
In a music rhythm display scene, a first user can edit rhythm effects of an interested target track and publish them to the cloud server for sharing. The cloud server can store at least one rhythm drawing script file for the target track. For example, a second user may browse at least one drawing script file effect presentation for the target track at the cloud server, select the drawing script file effect of interest, and locally associate the corresponding drawing script file with the target track. Alternatively, the second user selects the drawing script file effect so that it is downloaded from the server when the corresponding rhythm scene is triggered.
In another musical rhythm display scenario, the first user may edit rhythm effects of music types of interest and publish to the cloud server for sharing. The cloud server can store at least one rhythm drawing script file aiming at the music type. The second user may select the drawing script file effect of interest as the drawing effect for that music type. Or the second user selects the drawing script file effect so as to download from the server when the corresponding rhythm scene is triggered.
In a weather display scene, a first user can edit the display effect of the interested weather type and publish the weather type to a cloud server for sharing. The cloud server can store at least one drawing script file for the weather type display effect. The second user may select a weather-type presentation of interest. Or the second user selects the drawing script file effect so as to download from the server side when the corresponding weather scene is triggered.
It should also be understood that the descriptions herein of features and steps of other embodiments (e.g., embodiments corresponding to any of the other figures) apply to aspects of this embodiment.
Fig. 8A is a schematic block diagram of a cloud service apparatus according to another embodiment of the present invention. The cloud service apparatus of fig. 8A may be a cloud service terminal such as a public cloud, a private cloud, or a hybrid cloud, and may also be applicable to other terminal devices. The device includes:
the determining module 810, responding to the target scene service request, determining a drawing script file for the target scene display interface;
and returning to the module 820, returning the drawing script file so as to dynamically display the target scene display interface by executing the drawing script file.
In the scheme of the embodiment of the invention, the drawing script file for the target scene display interface can be determined in response to the target scene service request, and the drawing script file is used for dynamically displaying the target scene display interface. Therefore, flexibility of obtaining the drawing script file is achieved, and storage space of the client is saved.
The apparatus of this embodiment is used to implement the corresponding method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment can refer to the description of the corresponding part in the foregoing method embodiment, and is not described herein again.
It should also be understood that the descriptions herein of features and steps of other embodiments (e.g., embodiments corresponding to any of the other figures) apply to aspects of this embodiment.
Fig. 8B is a schematic block diagram of a cloud service device according to another embodiment of the present invention. The cloud service apparatus in fig. 8B may be a cloud service terminal such as a public cloud, a private cloud, or a hybrid cloud, and may also be applicable to other terminal devices. The device includes:
the receiving module 830 is configured to receive target scene display effect data uploaded based on the scene display effect template;
the generation module 840 generates a drawing script file for dynamically displaying the target scene display interface based on the target scene display effect data;
the storage module 850 stores the drawing script file.
In the embodiment of the invention, the display effect template makes it convenient for the user to edit and create display effects, and storing the generated drawing script file at the server side (acting as the service-side device) allows the user to download it at the client side in advance, or lets the client download it when a scene is triggered.
The apparatus of this embodiment is used to implement the corresponding method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment can refer to the description of the corresponding part in the foregoing method embodiment, and is not described herein again.
It should also be understood that the descriptions herein of features and steps of other embodiments (e.g., embodiments corresponding to any of the other figures) apply to aspects of this embodiment.
Fig. 9 is a schematic structural diagram of an electronic device according to another embodiment of the invention; the electronic device may include:
one or more processors 901;
a computer-readable medium 902, which may be configured to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the methods described in the embodiments above.
Fig. 10 is a hardware configuration of an electronic apparatus according to another embodiment of the present invention; as shown in fig. 10, the hardware structure of the electronic device may include: a processor 1001, a communication interface 1002, a computer-readable medium 1003, and a communication bus 1004;
wherein the processor 1001, the communication interface 1002 and the computer-readable medium 1003 communicate with each other through the communication bus 1004;
alternatively, the communication interface 1002 may be an interface of a communication module;
the processor 1001 may be specifically configured to: responding to a target scene trigger event, and acquiring a drawing script file for a target scene display interface; executing the drawing script file, and dynamically displaying the target scene display interface through the at least one interface element dynamic drawing component, or,
responding to a target scene trigger event, and acquiring a drawing script file for a target scene display interface; executing the drawing script file, and displaying the target scene display interface through a target interface rendering engine in the multiple interface rendering engines, or,
responding to a target scene trigger event, and determining an access path of drawing data of a target scene display interface; based on the access path, obtaining the drawing data to dynamically display the target scene display interface, or,
responding to the target scene service request, and determining a drawing script file for a target scene display interface; returning the drawing script file so as to dynamically display the target scene display interface by executing the drawing script file, or,
receiving target scene display effect data uploaded based on the scene display effect template; generating a drawing script file for dynamically displaying a target scene display interface based on the target scene display effect data; and storing the drawing script file.
The Processor 1001 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by such a processor. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The computer-readable medium 1003 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code configured to perform the method illustrated by the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The computer program, when executed by a Central Processing Unit (CPU), performs the above-described functions defined in the method of the present invention. It should be noted that the computer readable medium of the present invention may be a computer readable signal medium, a computer readable storage medium, or any combination of the two. The computer readable medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code configured to carry out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the internet using an internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions configured to implement the specified logical function(s). In the above embodiments, specific precedence relationships are provided, but these precedence relationships are only exemplary, and in particular implementations, the steps may be fewer, more, or the execution order may be modified. That is, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented by software or hardware. The names of these modules do not in some cases constitute a limitation of the module itself.
As another aspect, the present invention also provides a computer-readable medium on which a computer program is stored, which when executed by a processor implements the method as described in the above embodiments.
As another aspect, the present invention also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: responding to a target scene trigger event, and acquiring a drawing script file for a target scene display interface; executing the drawing script file, and dynamically displaying the target scene display interface through the at least one interface element dynamic drawing component, or,
responding to a target scene trigger event, and acquiring a drawing script file for a target scene display interface; executing the drawing script file, and displaying the target scene display interface through a target interface rendering engine in the multiple interface rendering engines, or,
responding to a target scene trigger event, and determining an access path of drawing data of a target scene display interface; based on the access path, obtaining the drawing data to dynamically display the target scene display interface, or,
responding to the target scene service request, and determining a drawing script file for a target scene display interface; returning the drawing script file so as to dynamically display the target scene display interface by executing the drawing script file, or,
receiving target scene display effect data uploaded based on the scene display effect template; generating a drawing script file for dynamically displaying a target scene display interface based on the target scene display effect data; and storing the drawing script file.
The expressions "first", "second", "said first" or "said second" used in various embodiments of the present disclosure may modify various components regardless of order and/or importance, but these expressions do not limit the respective components. The above description is only configured for the purpose of distinguishing elements from other elements. For example, the first user equipment and the second user equipment represent different user equipment, although both are user equipment. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure.
When an element (e.g., a first element) is referred to as being operatively or communicatively "coupled" or "connected" to another element (e.g., a second element), it is understood that the element may be directly connected to the other element, or indirectly connected to it via yet another element (e.g., a third element). In contrast, when an element (e.g., a first element) is referred to as being "directly connected" or "directly coupled" to another element (e.g., a second element), no element (e.g., a third element) is interposed therebetween.
The foregoing description presents only preferred embodiments of the invention and illustrates the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention is not limited to technical solutions formed by the specific combination of the above-mentioned features, but also encompasses other technical solutions formed by any combination of the above-mentioned features or their equivalents without departing from the scope defined by the appended claims, for example, technical solutions in which the above features are replaced by features with similar functions disclosed in the present invention.

Claims (28)

1. A display control method comprising:
responding to a target scene trigger event, and acquiring a drawing script file for a target scene display interface;
and executing the drawing script file, and dynamically displaying the target scene display interface through the at least one interface element dynamic drawing component.
2. The method of claim 1, wherein the acquiring a drawing script file for a target scene display interface in response to a target scene trigger event comprises:
responding to a target scene trigger event, and judging whether information indicating the resource position of the drawing script file at a resource server exists;
and if the information indicating the resource position of the drawing script file at the resource server exists, accessing the resource position of the drawing script file at the resource server to acquire the drawing script file from the resource server.
3. The method of claim 2, wherein the acquiring a drawing script file for a target scene display interface in response to a target scene trigger event further comprises:
and if the information indicating the resource position of the drawing script file at the resource server does not exist, sending a resource acquisition request to the resource server so as to acquire the information.
4. The method of claim 1, wherein the acquiring a drawing script file for a target scene display interface in response to a target scene trigger event comprises:
responding to a current scene trigger event, switching from a last scene display interface to a current scene display interface, and acquiring a drawing script file for the current scene display interface, wherein the dynamically displaying the target scene display interface comprises:
and dynamically displaying the current scene display interface.
5. The method of claim 4, wherein the switching from a last scene display interface to a current scene display interface and acquiring a drawing script file for the current scene display interface in response to a current scene trigger event comprises:
responding to a current scene trigger event, adding a current scene display interface into a scene display interface queue, and acquiring a drawing script file for the current scene display interface;
and when the drawing script file of the current scene display interface is acquired, updating the scene display interface queue so as to switch from the last scene display interface to the current scene display interface.
6. The method of claim 4, wherein the switching from a last scene display interface to a current scene display interface and retrieving a draw script file for the current scene display interface in response to a current scene trigger event comprises:
responding to a current scene trigger event, and adding a current scene display interface into a scene display interface queue;
switching from the last scene display interface to the current scene display interface by updating the scene display interface queue so as to trigger an acquisition request of a drawing script file of the current scene display interface;
and responding to the acquisition request to acquire the drawing script file.
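Claims 5 and 6 both rely on a scene display interface queue in which the visible interface switches only once the new scene's drawing script file is available. A sketch of the claim-5 ordering, with all names hypothetical and fetchDrawScript borrowed from the claim-1 sketch:

```typescript
// Illustrative scene display interface queue (claim 5): the new scene
// is enqueued at once, but the visible interface only switches after
// its drawing script file has been acquired.
declare function fetchDrawScript(sceneId: string): Promise<string>; // see claim-1 sketch

interface SceneEntry { sceneId: string; script?: string; }

const sceneQueue: SceneEntry[] = [];

export async function onCurrentSceneEvent(sceneId: string): Promise<void> {
  const entry: SceneEntry = { sceneId };
  sceneQueue.push(entry); // add the current scene to the queue
  entry.script = await fetchDrawScript(sceneId);
  // Update the queue so display switches from the last scene display
  // interface to the current one only once the script is available.
  while (sceneQueue.length > 0 && sceneQueue[0] !== entry) sceneQueue.shift();
  console.log(`switched to ${entry.sceneId}`);
}
```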
7. The method of claim 1, further comprising:
in response to a target voice instruction, sending a voice fragment in the target voice instruction to a voice recognition server, so that the voice recognition server returns the target scene trigger event, wherein the target scene trigger event is used for triggering a target scene indicated by a recognition result of the voice fragment.
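One way to realize claim 7 is to POST the raw voice fragment to the recognition server and treat its JSON reply as the trigger event; the endpoint and payload shape below are assumptions, not part of the disclosure:

```typescript
// Illustrative claim-7 flow: send the voice fragment to a voice
// recognition server, which returns the target scene trigger event.
interface TargetSceneTriggerEvent { sceneId: string; }

declare function onTargetSceneEvent(sceneId: string): Promise<void>; // claim-1 sketch

export async function onVoiceInstruction(fragment: ArrayBuffer): Promise<void> {
  const res = await fetch("https://asr.example.com/recognize", {
    method: "POST",
    headers: { "Content-Type": "application/octet-stream" },
    body: fragment,
  });
  // The recognition result indicates which target scene to trigger.
  const event = (await res.json()) as TargetSceneTriggerEvent;
  await onTargetSceneEvent(event.sceneId);
}
```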
8. The method of claim 1, wherein executing the drawing script file and dynamically displaying the target scene display interface through the at least one interface element dynamic drawing component comprises:
executing the drawing script file to call at least one interface display graphic component through the at least one interface element dynamic drawing component; and
executing the at least one interface display graphic component to dynamically display the target scene display interface.
9. The method of claim 1, further comprising:
deleting the drawing script file after the dynamic display of the target scene display interface is finished.
10. The method of claim 1, further comprising:
determining a mapping relation between ambient light brightness and display brightness of the target scene display interface; and
determining, based on the mapping relation, a current display brightness corresponding to a detected current ambient light brightness,
wherein dynamically displaying the target scene display interface comprises:
dynamically displaying the target scene display interface at the current display brightness.
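The mapping relation of claim 10 could, for example, be a clamped linear map from ambient lux to a normalized brightness; the calibration constants below are illustrative assumptions:

```typescript
// Illustrative mapping relation for claim 10: a clamped linear map
// from ambient light (lux) to display brightness (0..1). The
// calibration constants are assumptions, not values from the patent.
export function displayBrightness(ambientLux: number): number {
  const minLux = 0, maxLux = 500; // assumed sensor range
  const minB = 0.2, maxB = 1.0;   // assumed brightness range
  const t = Math.min(Math.max((ambientLux - minLux) / (maxLux - minLux), 0), 1);
  return minB + t * (maxB - minB);
}

// Example: a dim room at 50 lux maps to 0.2 + 0.1 * 0.8 = 0.28.
console.log(displayBrightness(50).toFixed(2)); // "0.28"
```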
11. The method of claim 1, wherein executing the drawing script file and dynamically displaying the target scene display interface through the at least one interface element dynamic drawing component comprises:
calling a target script engine among a plurality of script engines via an adaptation framework configured for the plurality of script engines; and
interpreting and executing the drawing script file through the target script engine, and calling an interface rendering engine to dynamically display the target scene display interface, wherein the interface rendering engine is used for executing the at least one interface element dynamic drawing component.
12. The method of claim 11, further comprising:
in response to a startup trigger event, executing a power-on self-test to start the target script engine and the interface rendering engine.
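A plausible shape for the adaptation framework of claim 11 is a thin adapter that selects the target script engine by name and hands it an API backed by the interface rendering engine. Every type and method below is a hypothetical sketch, not the patented implementation:

```typescript
// Illustrative adaptation framework over several script engines
// (claim 11): a uniform adapter selects the target engine, which
// interprets the drawing script and drives the rendering engine.
interface ScriptEngine {
  name: string;
  run(script: string, api: object): void; // interpret and execute
}

interface InterfaceRenderEngine {
  drawElement(kind: string, args: object): void;
}

export class ScriptEngineAdapter {
  constructor(
    private engines: ScriptEngine[],
    private renderer: InterfaceRenderEngine,
  ) {}

  execute(engineName: string, script: string): void {
    const engine = this.engines.find((e) => e.name === engineName);
    if (!engine) throw new Error(`no script engine named ${engineName}`);
    // Expose the interface rendering engine to the interpreted script.
    engine.run(script, {
      draw: (kind: string, args: object) => this.renderer.drawElement(kind, args),
    });
  }
}
```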
13. The method of claim 1, further comprising:
determining a startup display picture script file in response to a startup trigger event; and
executing drawing of the startup display picture script file through an interface rendering engine.
14. The method of claim 1, wherein the at least one interface element dynamic drawing component comprises at least one of a point dynamic drawing component, a line dynamic drawing component, and a polygon dynamic drawing component.
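For concreteness, the point, line, and polygon components of claim 14 might each expose a single draw entry point, as in this hypothetical sketch:

```typescript
// Illustrative concrete components for the element kinds of claim 14:
// point, line, and polygon dynamic drawing components.
type Pt = { x: number; y: number };

export const pointComponent = { draw: (p: Pt) => console.log("point", p) };
export const lineComponent = { draw: (a: Pt, b: Pt) => console.log("line", a, b) };
export const polygonComponent = { draw: (pts: Pt[]) => console.log("polygon", pts) };

// A drawing script could compose them, e.g. outlining a triangle:
polygonComponent.draw([{ x: 0, y: 0 }, { x: 10, y: 0 }, { x: 5, y: 8 }]);
```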
15. A display control method, comprising:
in response to a target scene trigger event, acquiring a drawing script file for a target scene display interface; and
executing the drawing script file, and displaying the target scene display interface through a target interface rendering engine among a plurality of interface rendering engines.
16. The method of claim 15, wherein executing the drawing script file and displaying the target scene display interface through the target interface rendering engine comprises:
executing the drawing script file and calling the target interface rendering engine among the plurality of interface rendering engines; and
executing at least one interface element dynamic drawing component through the target interface rendering engine so as to dynamically display the target scene display interface.
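Claim 16 can be read as a two-level dispatch: the executed script names a target rendering engine, and that engine runs the drawing components. A sketch, with the engine names and script shape assumed for illustration:

```typescript
// Illustrative sketch of claim 16: the drawing script names a target
// rendering engine among several; that engine then executes the
// interface element dynamic drawing components.
interface InterfaceRenderingEngine {
  runComponent(name: string, args: object): void;
}

const renderingEngines: Record<string, InterfaceRenderingEngine> = {
  "2d": { runComponent: (n, a) => console.log("2d", n, a) },
  "gl": { runComponent: (n, a) => console.log("gl", n, a) },
};

export function displayScene(script: { engine: string; calls: [string, object][] }): void {
  const target = renderingEngines[script.engine]; // select the target engine
  if (!target) throw new Error(`no rendering engine ${script.engine}`);
  for (const [name, args] of script.calls) target.runComponent(name, args);
}

displayScene({ engine: "2d", calls: [["line", { from: [0, 0], to: [8, 8] }]] });
```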
17. A display control method, comprising:
in response to a target scene trigger event, determining an access path of drawing data of a target scene display interface; and
acquiring the drawing data based on the access path so as to dynamically display the target scene display interface.
18. The method of claim 17, wherein acquiring the drawing data based on the access path so as to dynamically display the target scene display interface comprises:
acquiring, based on the access path, a drawing script file comprising the drawing data from a resource server; and
executing the drawing script file through a script engine and dynamically displaying the target scene display interface.
19. The method of claim 17, wherein acquiring the drawing data based on the access path so as to dynamically display the target scene display interface comprises:
acquiring the drawing data from a local storage space; and
dynamically displaying the target scene display interface based on the drawing data through a user interface engine.
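Claims 18 and 19 differ only in where the access path of claim 17 points; a sketch of that dispatch, assuming an http prefix marks a resource-server path and anything else a local file path:

```typescript
// Illustrative dispatch for claims 17-19: the access path determined
// for the target scene decides whether the drawing data is fetched
// from the resource server or read from local storage.
import { readFile } from "node:fs/promises";

export async function acquireDrawingData(accessPath: string): Promise<string> {
  if (accessPath.startsWith("http")) {
    const res = await fetch(accessPath); // claim 18: resource server
    return res.text();
  }
  return readFile(accessPath, "utf8");   // claim 19: local storage space
}
```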
20. A cloud service method, comprising:
determining a drawing script file for a target scene display interface in response to a target scene service request; and
returning the drawing script file, so that the target scene display interface is dynamically displayed by executing the drawing script file.
21. A cloud service method, comprising:
receiving target scene display effect data uploaded based on a scene display effect template;
generating, based on the target scene display effect data, a drawing script file for dynamically displaying a target scene display interface; and
storing the drawing script file.
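On the cloud side, claim 21 amounts to template-driven code generation: effect data comes in, a drawing script file goes into storage. A hedged sketch in which the EffectData fields and the emitted script body are invented:

```typescript
// Illustrative cloud-side sketch of claim 21: display effect data
// uploaded against a template is turned into a drawing script file
// and stored, ready to be returned to devices (claim 20).
import { writeFile } from "node:fs/promises";

interface EffectData { sceneId: string; color: string; durationMs: number; }

function generateDrawScript(d: EffectData): string {
  // Emit a script in whatever language the device's script engine
  // interprets; this body is purely illustrative.
  return `components.polygon.draw({ color: "${d.color}", durationMs: ${d.durationMs} });`;
}

export async function onEffectUpload(data: EffectData): Promise<void> {
  const script = generateDrawScript(data);
  await writeFile(`./scripts/${data.sceneId}.js`, script, "utf8"); // store it
}
```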
22. A display control apparatus, comprising:
a determining module configured to determine, in response to a target scene trigger event, an access path of drawing data of a target scene display interface; and
an acquisition module configured to acquire the drawing data based on the access path so as to dynamically display the target scene display interface.
23. A cloud service apparatus, comprising:
a determining module configured to determine a drawing script file for a target scene display interface in response to a target scene service request; and
a return module configured to return the drawing script file, so that the target scene display interface is dynamically displayed by executing the drawing script file.
24. A cloud service apparatus, comprising:
a receiving module configured to receive target scene display effect data uploaded based on a scene display effect template;
a generation module configured to generate, based on the target scene display effect data, a drawing script file for dynamically displaying a target scene display interface; and
a storage module configured to store the drawing script file.
25. A display control apparatus, comprising:
an acquisition module configured to acquire, in response to a target scene trigger event, a drawing script file for a target scene display interface; and
an execution module configured to execute the drawing script file and dynamically display the target scene display interface through at least one interface element dynamic drawing component.
26. A display control apparatus, comprising:
an acquisition module configured to acquire, in response to a target scene trigger event, a drawing script file for a target scene display interface; and
an execution module configured to execute the drawing script file and display the target scene display interface through a target interface rendering engine among a plurality of interface rendering engines.
27. An electronic device, comprising:
one or more processors; and
a computer-readable medium configured to store one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1 to 21.
28. A computer-readable medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 21.
CN202010368264.8A 2020-04-30 2020-04-30 Display control method, cloud service method, device, electronic equipment and storage medium Pending CN113590238A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010368264.8A CN113590238A (en) 2020-04-30 2020-04-30 Display control method, cloud service method, device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113590238A true CN113590238A (en) 2021-11-02

Family

ID=78237687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010368264.8A Pending CN113590238A (en) 2020-04-30 2020-04-30 Display control method, cloud service method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113590238A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108228052A (en) * 2017-12-29 2018-06-29 腾讯科技(深圳)有限公司 Trigger method, apparatus, storage medium and the terminal of interface assembly operation
CN110032361A (en) * 2018-01-11 2019-07-19 腾讯科技(深圳)有限公司 Test analogy method, device, electronic equipment and computer readable storage medium
CN109324850A (en) * 2018-07-16 2019-02-12 百度在线网络技术(北京)有限公司 Display processing method, terminal and the server of application program
CN109168026A (en) * 2018-10-25 2019-01-08 北京字节跳动网络技术有限公司 Instant video display methods, device, terminal device and storage medium
CN109765793A (en) * 2018-12-07 2019-05-17 深圳绿米联创科技有限公司 Equipment state display methods, device, terminal and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114356448A (en) * 2021-12-28 2022-04-15 北京光启元数字科技有限公司 Object control method, device, equipment and medium
CN117348876A (en) * 2023-12-04 2024-01-05 深圳市云希谷科技有限公司 Application development method, system and medium based on freeRTOS embedded system
CN117348876B (en) * 2023-12-04 2024-02-06 深圳市云希谷科技有限公司 Application development method, system and medium based on freeRTOS embedded system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination