CN111586481B - Terminal and application processing method

Terminal and application processing method

Info

Publication number
CN111586481B
CN111586481B
Authority
CN
China
Prior art keywords
data
application
interface
terminal
state
Prior art date
Legal status
Active
Application number
CN202010371207.5A
Other languages
Chinese (zh)
Other versions
CN111586481A (en)
Inventor
庄广海
苗璐
马福帅
李祥艳
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202010371207.5A
Publication of CN111586481A
Application granted
Publication of CN111586481B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443: OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/442: Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/4424: Monitoring of the internal components or processes of the client device, e.g. CPU or memory load, processing speed, timer, counter or percentage of the hard disk space used
    • H04N21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81: Monomedia components thereof
    • H04N21/8166: Monomedia components thereof involving executable data, e.g. software
    • H04N21/8173: End-user applications, e.g. Web browser, game
    • H04N21/85: Assembly of content; Generation of multimedia applications
    • H04N21/854: Content authoring
    • H04N21/8547: Content authoring involving timestamps for synchronizing content

Abstract

Embodiments of the present application disclose a terminal and an application processing method. The terminal comprises a display and a controller. In this application, the controller sets two processes for one monitoring application: a first process and a second process. The first process, which monitors state data, is always kept in a pulled-up state; the second process, which handles interface data, is pulled up only when the application interface needs to be displayed and is released when the application interface is closed. This reduces the memory occupied by the monitoring application while its application interface is not displayed, avoids unnecessary power consumption of the terminal, and reduces the probability of the terminal lagging.

Description

Terminal and application processing method
Technical Field
The present invention relates to social televisions, and in particular to a terminal and an application processing method.
Background
An intelligent terminal (which may be referred to simply as a terminal in the embodiments of the present application) is a type of embedded computer system device. In recent years, to make people's lives more convenient and richer, developers have built a series of applications for intelligent terminals, such as screenshot applications, monitoring applications, screensaver applications, and the like.
Some monitoring applications are always in a pulled-up state while the terminal is running. Such applications keep their interface data and logic data in memory, and render and display the interface through a GPU (Graphics Processing Unit) when the application interface needs to be shown.
For example, a recently-used-applications feature must survive in the background because it needs to monitor in real time which applications the user has used; when the user wants to see recent usage, the user triggers the corresponding control, and the terminal pulls up the application interface of the recently-used-applications feature in response. In prior-art terminals, once the application interface of a monitoring application has been pulled up, it occupies the terminal's memory permanently, causing unnecessary power consumption, lag, and other problems.
Disclosure of Invention
To solve these problems in the prior art, embodiments of the present application provide a terminal and an application processing method.
A first aspect of the embodiments of the present application provides a terminal, comprising:
a display, configured to display an application interface;
a controller, configured to set a first process and a second process for one monitoring application;
start, in response to the monitoring application being pulled up, the first process corresponding to the monitoring application, the first process being used to monitor state data, the state data comprising interface data;
start, in response to a trigger operation for displaying the application interface, the second process, the second process being used to send the received interface data to a graphics processor so that the graphics processor renders the application interface based on the interface data;
control the display to show the application interface;
and release the second process in response to a trigger operation for closing the application interface.
A second aspect of the embodiments of the present application provides an application processing method, comprising:
starting, in response to a monitoring application being pulled up, a first process corresponding to the monitoring application, the first process being used to monitor state data, the state data comprising interface data;
starting, in response to a trigger operation for displaying the application interface, a second process, the second process being used to send the received interface data to a graphics processor so that the graphics processor renders the application interface based on the interface data;
controlling the display to show the application interface;
and releasing the second process in response to a trigger operation for closing the application interface.
To sum up, the embodiments of the present application provide a terminal and an application processing method. The terminal comprises a display and a controller. The controller sets two processes, a first process and a second process, for one monitoring application. In response to the monitoring application being pulled up, the first process corresponding to the monitoring application is started, the first process being used to monitor state data, the state data comprising interface data. In response to a trigger operation for displaying the application interface, the second process is started, the second process being used to send the received interface data to the graphics processor so that the graphics processor renders the application interface based on the interface data. In response to a trigger operation for closing the application interface, the second process is released. In this way, the terminal sets two processes for the monitoring application: the first process, which monitors the state data, is always in a pulled-up state, while the second process, which processes the interface data, is pulled up only when the application interface needs to be displayed and is released when the application interface is closed. This reduces the memory occupied by the monitoring application while its application interface is not displayed, avoids unnecessary power consumption of the terminal, and reduces the probability of the terminal lagging.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that other drawings can be obtained by those of ordinary skill in the art from these drawings without creative effort.
Fig. 1 is a schematic view of an operation scenario between a display device and a control apparatus according to an embodiment of the present application;
fig. 2 is a block diagram illustrating a hardware configuration of the control apparatus 100 in fig. 1 according to an embodiment of the present disclosure;
fig. 3 is a block diagram illustrating a hardware configuration of the display device 200 in fig. 1 according to an embodiment of the present disclosure;
fig. 4 is a block diagram illustrating an architecture configuration of an operating system in a memory of the display device 200 according to an embodiment of the present application;
fig. 5 is a block diagram illustrating the structure of a terminal according to a possible embodiment;
FIG. 6 is a flowchart illustrating operation of a terminal according to one possible embodiment;
FIG. 7 is a schematic diagram illustrating an application interface in accordance with one possible embodiment;
FIG. 8 is a schematic diagram illustrating an application interface in accordance with one possible embodiment;
FIG. 9 is a schematic diagram illustrating an application interface in accordance with one possible embodiment;
FIG. 10 is a flowchart illustrating operation of a terminal according to one possible embodiment;
FIG. 11 is a flowchart illustrating operation of a terminal according to one possible embodiment;
FIG. 12 is a flow chart illustrating a method of processing an application according to one possible embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Evidently, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The terminal in the embodiments of the present application may be a mobile phone, a tablet, a display device, or another device. In this embodiment, a display device is taken as an example to explain the structure of the terminal.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus. As shown in fig. 1, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200. It may receive an operation instruction input by a user and convert it into an instruction that the display device 200 can recognize and respond to, serving as an intermediary between the user and the display device 200. For example, the user operates the channel up/down key on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
The control apparatus 100 may be a remote controller 100A, which uses infrared protocol communication, Bluetooth protocol communication, or other short-range communication methods to control the display apparatus 200 wirelessly or by other wired means. The user may input user instructions through keys on the remote controller, voice input, control panel input, and the like to control the display apparatus 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, and so on, to control the display device 200.
The control apparatus 100 may also be an intelligent device, such as a mobile display device 100B, a tablet computer, or a notebook computer. For example, the display device 200 may be controlled using an application program running on the smart device. Through configuration, the application program can provide the user with various controls on an intuitive user interface (UI) on a screen associated with the smart device.
For example, the mobile display device 100B and the display device 200 may each install a software application, so that connection and communication between them are realized through a network communication protocol, achieving one-to-one control operation and data communication. For instance, a control instruction protocol may be established between the mobile display device 100B and the display device 200, so that operating the various function keys or virtual buttons of the user interface provided on the mobile display device 100B achieves the functions of the physical keys arranged on the remote control 100A. The audio and video content displayed on the mobile display device 100B may also be transmitted to the display device 200 to achieve a synchronized display function.
The display apparatus 200 may provide a network television function on top of a broadcast receiving function and computer support functions. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection device. The specific display device type, size, resolution, etc. are not limited.
The display apparatus 200 also performs data communication with the server 300 through various communication means. The display apparatus 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 300 may provide various content and interactions to the display apparatus 200. For example, the display device 200 may send and receive information, such as receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The server 300 may be one group or multiple groups of servers, and may be one or more types of servers. Other web service content such as video on demand and advertisement services are provided through the server 300.
Fig. 2 is a block diagram illustrating the configuration of the control device 100. As shown in fig. 2, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a RAM (Random Access Memory) 111, a ROM (Read-Only Memory) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control device 100, the communication and cooperation among its internal components, and the processing of external and internal data.
For example, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is monitored, the controller 110 may control to generate a signal corresponding to the monitored interaction and transmit the signal to the display device 200.
And a memory 120 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared module 131 (infrared signal interface), a radio frequency signal interface 132, and a Bluetooth module 133. For example, when the infrared signal interface is used, a user input instruction needs to be converted into an infrared control signal according to the infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared sending module. For another example, when the radio frequency signal interface is used, a user input command needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received by the display apparatus 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the output interface 150, and display the output signal in the form of an image on the display 154, in the form of audio on the sound output interface 153, or in the form of vibration on the vibration interface 152.
And a power supply 160, for providing operating power support for each element of the control device 100 under the control of the controller 110, which may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily shown in fig. 3. As shown in fig. 3, the display apparatus 200 may include a tuner demodulator 210, a communicator 220, a monitor 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio output interface 285, and a power supply 290.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, an audio/video signal carried in a frequency of a television channel selected by a user, and additional information (e.g., EPG data).
The tuner demodulator 210 responds to the television channel frequency selected by the user and the television signal carried by that frequency, as selected by the user and under the control of the controller 250.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to the different kinds of the received television signals.
In other exemplary embodiments, the tuning demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near-field communication protocol module, such as a WIFI module 221, a Bluetooth module 222, and a wired Ethernet module 223, so that the communicator 220 can receive control signals of the control device 100, under the control of the controller 250, in the form of WIFI signals, Bluetooth signals, radio frequency signals, and the like.
The monitor 230 is a component of the display apparatus 200 for collecting signals of an external environment or interaction with the outside. Monitor 230 may include a sound collector 231, such as a microphone, which may be used to receive a user's sound, such as a voice signal of a control instruction of the user to control display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
In some other exemplary embodiments, the monitor 230 may further include an image collector 232, such as a camera, a video camera, etc., which may be used to collect external environment scenes to adaptively change the display parameters of the display apparatus 200; and the function of acquiring the attribute of the user or interacting gestures with the user so as to realize the interaction between the display equipment and the user.
In some other exemplary embodiments, the monitor 230 may further include a light receiver (not shown) for collecting the ambient light intensity, adapting to the display parameter variation of the display device 200, and the like.
In some other exemplary embodiments, the monitor 230 may further include a temperature sensor (not shown), such as by sensing the ambient temperature, and the display device 200 may adaptively adjust the display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a color temperature of an image that is cooler; when the temperature is lower, the display device 200 may be adjusted to display a warmer color temperature of the image.
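A minimal sketch of this kind of adaptation is given below. The temperature breakpoints and Kelvin values are illustrative assumptions rather than values taken from the patent, and the class name is hypothetical.

// Hypothetical sketch: map the sensed ambient temperature to a display colour temperature.
// The breakpoints (18 °C / 28 °C) and Kelvin values are illustrative only.
public class ColorTemperatureAdapter {
    public int colorTemperatureKelvin(double ambientCelsius) {
        if (ambientCelsius >= 28.0) {
            return 7500;   // warmer room: shift the image towards a cooler colour temperature
        } else if (ambientCelsius <= 18.0) {
            return 5500;   // cooler room: shift the image towards a warmer colour temperature
        }
        return 6500;       // neutral default
    }
}

In practice the mapping could be continuous rather than stepped; the sketch only illustrates the warmer-ambient, cooler-image relationship described above.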
The external device interface 240 is a component for providing the controller 250 to control data transmission between the display apparatus 200 and an external apparatus. The external device interface 240 may be connected to an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 240 may include: one or more of an HDMI (High Definition Multimedia Interface) terminal 241, a CVBS (Composite Video Blanking and Sync) terminal 242, a Component (analog or digital) terminal 243, a USB (Universal Serial Bus) terminal 244, a Component (Component) terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
As shown in fig. 3, the controller 250 includes a RAM (random access memory) 251, a ROM (read only memory) 252, an image processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. Among them, the RAM251, the ROM252, the image processor 253, the CPU processor 254, and the communication interface 255 are connected by a communication bus 256.
The ROM252 stores various system boot instructions. When the display apparatus 200 starts power-on upon receiving the power-on signal, the CPU processor 254 executes a system boot instruction in the ROM252, copies the operating system stored in the memory 260 to the RAM251, and starts running the boot operating system. After the start of the operating system is completed, the CPU processor 254 copies the various application programs in the memory 260 to the RAM251 and then starts running and starting the various application programs.
An image processor 253 for generating various graphic objects such as icons, operation menus, and user input instruction display graphics, etc. The image processor 253 may include an operator for performing an operation by receiving various interactive instructions input by a user, thereby displaying various objects according to display attributes; and a renderer for generating various objects based on the operator and displaying the rendered result on the display 275.
A CPU processor 254 for executing operating system and application program instructions stored in memory 260. And according to the received user input instruction, processing of various application programs, data and contents is executed so as to finally display and play various audio-video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and a plurality of or one sub-processor. A main processor for performing some initialization operations of the display apparatus 200 in the display apparatus preload mode and/or operations of displaying a screen in the normal mode. A plurality of or one sub-processor for performing an operation in a state of a standby mode or the like of the display apparatus.
The communication interface 255 may include a first interface, a second interface, and an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a User input command for selecting a GUI (Graphical User Interface) object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the User input command.
Where the object may be any one of the selectable objects, such as a hyperlink or an icon. The operation related to the selected object is, for example, an operation of displaying a link to a hyperlink page, document, image, or the like, or an operation of executing a program corresponding to the object. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
A memory 260 for storing various types of data, software programs, or applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes the memory 260, the RAM251 and the ROM252 of the controller 250, or a memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, memory 260 is specifically configured to store drivers for tuner demodulator 210, communicator 220, monitor 230, external device interface 240, video processor 270, display 275, audio processor 280, etc., and related data, such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received by the user interface.
In some embodiments, memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an Application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 4. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
The application layer: both the application programs built into the system and non-system-level application programs belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a setup application, a post application, a media center application, and the like. These applications may be implemented as Web applications executed on a WebKit engine, and in particular may be developed and executed based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
HTML, short for HyperText Markup Language, is a standard markup language for creating web pages. Web pages are described by markup tags, and HTML tags are used to describe text, graphics, animation, sound, tables, links, and so on; a browser reads an HTML document, interprets the content of the tags in the document, and displays it in the form of a web page.
CSS, short for Cascading Style Sheets, is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. A CSS style can be stored directly in an HTML web page or in a separate style file, allowing the styles in the web page to be controlled.
JavaScript, a language applied to Web page programming, can be inserted into an HTML page and interpreted and executed by a browser. The interaction logic of the Web application is realized by JavaScript. The JavaScript can package a JavaScript extension interface through the browser to realize communication with the kernel layer.
The middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as Multimedia and Hypermedia Experts Group (MHEG) middleware related to data broadcasting, DLNA (Digital Living Network Alliance) middleware related to communication with external devices, middleware providing the browser environment in which each application program in the display device runs, and so on.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, such as: the display driver is provided for the display, the camera driver is provided for the camera, the key driver is provided for the remote controller, the WIFI driver is provided for the WIFI module, the audio driver is provided for the audio output interface, the Power Management driver is provided for the Power Management (PM) module, and the like.
A user interface 265 receives various user interactions. Specifically, it is used to transmit an input signal of a user to the controller 250 or transmit an output signal from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user interface 265, and then the input signal is transferred to the controller 250 through the user interface 265; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data output from the user interface 265 via the controller 250, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is configured to demultiplex an input audio/video data stream. For example, for an input MPEG-2 stream (a compression standard for digital storage media moving images and audio), the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used to superimpose and mix the GUI signal, generated by a graphics generator based on user input, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of an input video, for example converting a 60 Hz input video to a frame rate of 120 Hz or 240 Hz, which is commonly achieved by, for example, frame interpolation.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output RGB data signals.
A display 275, for receiving the image signal from the video processor 270 and displaying video content, images, and the menu manipulation interface. The displayed video content may come from the broadcast signal received by the tuner demodulator 210, or from video content input via the communicator 220 or the external device interface 240. The display 275 also displays a user manipulation interface (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
And, the display 275 may include a display screen assembly for presenting a picture and a driving assembly for driving the display of an image. Alternatively, a projection device and projection screen may be included, provided display 275 is a projection display.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by the speaker 286.
Illustratively, audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
The audio output interface 285 is used for receiving the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286, or an external sound output terminal 287 such as an earphone output terminal for output to an external sound-producing device.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
And a power supply 290 for supplying power supply support to the display apparatus 200 from the power input from the external power source under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200 or may be a power supply installed outside the display apparatus 200.
To make people's lives more convenient and richer, developers have built a series of applications for intelligent terminals, such as screenshot applications, monitoring applications, screensaver applications, and the like.
Some monitoring applications are always in a pulled-up state while the terminal is running. Such applications keep their interface data and logic data in memory, and render and display the interface through a GPU (Graphics Processing Unit) when the application interface needs to be shown.
For example, a recently-used-applications feature must survive in the background because it needs to monitor in real time which applications the user has used; when the user wants to see recent usage, the user triggers the corresponding control, and the terminal pulls up the application interface of the recently-used-applications feature in response. In prior-art terminals, once the application interface of a monitoring application has been pulled up, it occupies the terminal's memory permanently, causing unnecessary power consumption, lag, and other problems.
In order to solve the above technical problem, the embodiment of the present application illustrates a terminal, and the structure of the terminal may refer to fig. 5, and the operation process of the terminal may refer to fig. 6. The terminal includes: a controller 1a and a display 2 a.
The display 2a is used for displaying an application interface;
the controller 1a sets two processes, a first process 11a and a second process 12a, for each listening application.
S101, responding to the pull-up of the monitoring application, starting a first process corresponding to the monitoring application, wherein the first process is used for monitoring state data, and the state data comprises interface data;
in the present application, the controller 1a is provided with two processes, namely a first process 11a and a second process 12a, for a monitoring application; the second process 12a is configured to process interface data associated with the displayed interface, the interface data being data that directs the graphics processor to render the interface.
In the embodiments of the present application, the monitoring application is always in a pulled-up state so that it can monitor state data in real time. The state data records the state of the terminal or the state of an application; the state may be a pulled-up or closed state, an attended or unattended state, or whether new data has been written. For example, the monitoring application may be a recently-used application that monitors which applications the terminal has pulled up, in which case the state data records which applications have been pulled up; the monitoring application may also be a screensaver application, in which case the state data records how long the device has been unattended; the monitoring application may also be an upgrade application, in which case the state data records whether upgrade data has been written into the upgrade database. This embodiment describes three types of state data by way of example; in practical applications the state data is not limited to these three types.
The state data in this application includes network data and local data. Network data usually needs to be retrieved from a remote server. The process of retrieving network data is affected by external factors such as the state of the network, so in some cases it takes a long time. To improve the response rate of the terminal, in the technical solution shown in the embodiments of the present application, the first process 11a is provided with a first thread and a second thread. The second thread is used to monitor network data: data that needs to be retrieved from a remote server is referred to as network data in the embodiments of the present application. The first thread is used to monitor local data: data that can be obtained without requesting a server is local data.
The data processing method disclosed in the embodiments of the present application uses two threads to monitor the local data and the network data separately. Because the data monitored by the first thread is always local data and the data monitored by the second thread is network data, even if the network is unstable and requesting the network data consumes a large amount of time, only the second thread is left waiting; the first thread can still monitor the local data in real time, which guarantees that monitoring of the local data always runs smoothly.
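A minimal sketch of this two-thread split is given below; the class and method names, the polling interval, and the simulated delays are illustrative assumptions rather than details taken from the patent.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical sketch of the first process's two monitoring threads:
// local monitoring is never blocked by slow network requests.
public class FirstProcessMonitor {
    private final ExecutorService localThread = Executors.newSingleThreadExecutor();
    private final ExecutorService networkThread = Executors.newSingleThreadExecutor();
    private volatile boolean running = true;

    public void start() {
        // First thread: poll local state data (e.g. which applications were pulled up).
        localThread.submit(() -> {
            while (running) {
                String localState = readLocalState();          // fast, no server round-trip
                onStateChanged("local", localState);
                Thread.sleep(1000);                            // poll once per second
            }
            return null;
        });
        // Second thread: fetch network state data; it may wait a long time on a bad
        // network, but only this thread is held up while it waits.
        networkThread.submit(() -> {
            while (running) {
                String networkState = fetchFromRemoteServer(); // may block for seconds
                onStateChanged("network", networkState);
            }
            return null;
        });
    }

    public void stop() {
        running = false;
        localThread.shutdownNow();
        networkThread.shutdownNow();
    }

    private String readLocalState() { return "recently pulled-up: browser"; }   // placeholder
    private String fetchFromRemoteServer() throws InterruptedException {
        Thread.sleep(3000);                                    // simulate a slow request
        return "upgrade package available";
    }
    private void onStateChanged(String source, String state) {
        System.out.println(source + " -> " + state);
    }
}

Because the network fetch blocks only its own thread, the local-data loop keeps running at its normal cadence even when the remote server is slow or unreachable.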
S102, in response to a trigger operation for displaying the application interface, pulling up a second process 12a, wherein the second process is used for sending the received interface data to the graphics processor, so that the graphics processor renders the application interface based on the interface data;
the pull manner of pulling the second process 12a may be based on a request of the user external device or the user, and the controller passively pulls the second process. The terminal may also actively pull up the corresponding second process 12a if certain conditions are met. The pulling-up mode of the second process is not limited to the above two modes, and the pulling-up mode of the second process may be configured according to requirements in the process of practical application, and the applicant does not make much limitation here.
The pull process of the second process corresponding to different listening applications is described in detail below with reference to specific examples.
For the recently-used application, the second process 12a may be pulled up passively by the controller based on a trigger operation sent by the user or an external device of the user. In some feasible embodiments, the trigger operation may be a voice instruction, and the user interacts with the terminal through voice. The specific interaction process may be: the user pulls up the sound pickup function of the terminal with a wake-up word and then interacts with the terminal. For example, the user may speak the wake-up word before interacting with the terminal to activate the sound pickup of the controller 1a, and accordingly the controller 1a receives the trigger operation input by the user through the user interface. In a feasible embodiment, the trigger operation may be the voice command "gather, view the applications pulled up by the display device", and the controller pulls up the second process corresponding to the recently-used application in response to this trigger operation. The trigger operation may also be the user pressing an application-interface trigger key on a remote controller, so that the user interacts with the terminal through the remote controller. In practical applications, the trigger operation is not limited to these two forms and may be configured according to requirements.
For the screensaver application, the trigger may be that the terminal actively pulls up the corresponding second process 12a when a certain condition is met. For example, for a display device whose screensaver time threshold is 10 min, the first process 11a monitors in real time how long the display device has been unattended; this unattended time is the state data. When the unattended time reaches 10 min, the second process 12a is pulled up.
For the upgrade application, the trigger may likewise be that the terminal actively pulls up the corresponding second process 12a when a certain condition is met. For example, the first process 11a monitors data changes in the upgrade database, and when the first process 11a detects that an upgrade data packet exists in the upgrade database, the second process 12a is pulled up.
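The two active pull-up conditions described above can be sketched as follows; the 10-minute threshold matches the screensaver example, while the callback interface and the class and method names are hypothetical.

// Hypothetical sketch: the first process evaluates its monitored state data and
// actively requests the second process when a configured condition is met.
public class ActiveTriggerChecker {
    interface SecondProcessLauncher {
        void pullUpSecondProcess(String reason);
    }

    private static final long SCREENSAVER_THRESHOLD_MS = 10 * 60 * 1000; // 10 min, as in the example

    private final SecondProcessLauncher launcher;
    private long unattendedSinceMs = System.currentTimeMillis();
    private boolean upgradePackagePresent = false;

    public ActiveTriggerChecker(SecondProcessLauncher launcher) {
        this.launcher = launcher;
    }

    // Called by the first process whenever the user interacts with the terminal.
    public void onUserActivity() {
        unattendedSinceMs = System.currentTimeMillis();
    }

    // Called by the first process when it sees a change in the upgrade database.
    public void onUpgradeDatabaseChanged(boolean packagePresent) {
        upgradePackagePresent = packagePresent;
    }

    // Evaluated periodically from the first process's monitoring loop.
    public void evaluate() {
        long idleMs = System.currentTimeMillis() - unattendedSinceMs;
        if (idleMs >= SCREENSAVER_THRESHOLD_MS) {
            launcher.pullUpSecondProcess("screensaver: unattended for 10 minutes");
        }
        if (upgradePackagePresent) {
            launcher.pullUpSecondProcess("upgrade: new package in the upgrade database");
        }
    }
}

The first process would call evaluate() from its monitoring loop, so the second process is requested only at the moment one of the conditions is actually met.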
S103, controlling the display to display the application interface; S104, in response to the trigger operation for closing the application interface, releasing the second process 12a.
In the technical solution shown in the embodiments of the present application, the second process 12a is released in response to a trigger operation for closing the application interface. In the present application, the first process 11a is always in a pulled-up state and can monitor the state data in real time.
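A sketch of the controller-side lifecycle for steps S102 to S104 is shown below. Modelling the second process as an in-process object is a deliberate simplification for illustration: on a real terminal it would be a separate operating-system process, and the rendering call merely stands in for handing the interface data to the graphics processor.

// Hypothetical controller sketch for steps S102-S104: the second process exists
// only while the application interface is shown on the display.
public class InterfaceController {
    // Stand-in for the real second process; on a real terminal this would be a
    // separate OS process rather than an in-process object.
    static class SecondProcess {
        void renderInterface(byte[] interfaceData) {
            // Forward the interface data so the graphics processor can render the interface.
            System.out.println("GPU renders " + interfaceData.length + " bytes of interface data");
        }
        void release() {
            // Free the memory held for interface rendering.
            System.out.println("second process released");
        }
    }

    private SecondProcess secondProcess;            // null while the interface is not shown

    // S102: trigger operation for displaying the application interface.
    public void onShowInterface(byte[] interfaceData) {
        if (secondProcess == null) {
            secondProcess = new SecondProcess();    // pull up the second process on demand
        }
        secondProcess.renderInterface(interfaceData);
        // S103: the display then shows the rendered application interface.
    }

    // S104: trigger operation for closing the application interface.
    public void onCloseInterface() {
        if (secondProcess != null) {
            secondProcess.release();                // give the occupied memory back to the terminal
            secondProcess = null;
        }
    }
}

The key point is that the object (and, on a real device, the process and the memory it holds) exists only between the show trigger and the close trigger, while the first process keeps monitoring throughout.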
The following describes in detail the operation process of the terminal according to the embodiment of the present application with reference to specific examples.
Example 1: for a computer, when the computer is powered on, the controller 1a of the computer pulls up the first process 11a of the recently-used application (simply referred to as the first process 11a in this example). The first process 11a monitors in real time the application data of the applications pulled up by the computer. In this example, the applications pulled up by the computer include a first application, a second application, a third application, a fourth application, a fifth application, a sixth application, and a seventh application. The first process 11a records the state data of these applications; the state data includes page data, and may also include the CPU, memory, and disk space occupied by each application, as well as data such as the application's capabilities, history, pull-up records, display information, and services. When the user wants to know which applications the computer has pulled up, the user can click the interface display control with the mouse, and accordingly the controller 1a pulls up the second process. The second process sends the received interface data to the graphics processor so that the graphics processor renders the application interface based on the interface data; the rendered interface may refer to fig. 7. The user may click a close control (not shown) on the interface with the mouse, causing the controller to release the second process 12a.
Example 2: for a display device, when the display device is powered on, the controller 1a of the display device pulls up the first process 11a of the upgrade application (simply referred to as the first process 11a in this example). The first process 11a monitors data changes in the upgrade database in real time and records the changed installation package data. The applications in the display device corresponding to the changed installation package data are: application A, application B, application C, application D, application E, application F, and application G. The first process 11a records the state data of these applications; the state data may include the installed version of each installation package, the updated version, the memory occupied by the installation package, and other information. When the user wants to know which applications in the display device need to be upgraded, the user may click the interface pull-up control through the remote controller, and the controller 1a pulls up the second process of the upgrade application (simply referred to as the second process 12a in this example). The second process sends the received interface data to the graphics processor 13a so that the graphics processor 13a renders the application interface based on the interface data; the rendered interface may refer to fig. 8. The user may click a close control (not shown) on the interface, causing the controller to release the second process 12a.
Example 3: for a display device, when the display device is powered on, the controller 1a of the display device pulls up the first process 11a of the screensaver application (simply referred to as the first process 11a in this example). When the display device is unattended, the first process 11a monitors in real time the time for which the display device has been unattended (simply referred to as the unattended time in this application). When the unattended time of the display device equals the preset screensaver time, the controller 1a pulls up the second process of the screensaver application (simply referred to as the second process 12a in this example). The second process sends the received interface data to the graphics processor 13a so that the graphics processor 13a renders the application interface based on the interface data; the rendered interface may refer to fig. 9. The user may click a close control (not shown) on the interface, causing the controller to release the second process 12a.
To sum up, the embodiments of the present application provide a terminal comprising a display and a controller. The controller sets two processes for one monitoring application: the first process, which monitors the state data, is always in a pulled-up state, while the second process, which processes the interface data, is pulled up only when the application interface needs to be displayed and is released when the application interface is closed. This reduces the memory occupied by the monitoring application while its application interface is not displayed, avoids unnecessary power consumption of the terminal, and reduces the probability of the terminal lagging.
The terminal configures two processes for one monitoring application, namely a first process and a second process. In this embodiment, the first process monitors the state data and stores the monitored state data in its own memory space, while the interface data is used by the second process 12a. The second process 12a cannot directly access the interface data stored in the first process 11a. To enable the second process 12a to successfully call the interface data stored by the first process 11a, in the technical solution shown in the embodiments of the present application, the first process 11a transmits the changed state data to the memory in response to detecting a change in the state data. A data calling device is configured to call the interface data stored in the memory and output it to the second process in response to the trigger operation for displaying the application interface.
In the technical solution shown in the embodiments of the present application, the controller 1a includes at least two memories: a first non-volatile memory, referred to in the embodiments of the present application as the FLASH memory 14a, and a random access memory (RAM), referred to as the RAM memory 15a. The FLASH memory 14a retains data after power is removed, but has a slower read/write speed and a limited write life. The RAM memory 15a is a volatile memory that loses data when power is off, but it has a high read/write speed and an almost unlimited read/write life; it may be implemented, for example, as a Synchronous Dynamic Random Access Memory (SDRAM), a Dynamic Random Access Memory (DRAM), or a Double Data Rate (DDR) memory.
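The division of labour between the two memories can be sketched as a simple two-tier store. The file path, map, and method names below are illustrative assumptions, and the exact write path between the FLASH memory 14a and the RAM memory 15a is left to the specific embodiment.

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical two-tier state store: a FLASH-like durable file survives power-off,
// while a RAM-like cache serves fast reads while the terminal is running.
public class StateDataStore {
    private final Path flashFile;                                            // stands in for FLASH memory 14a
    private final Map<String, String> ramCache = new ConcurrentHashMap<>();  // stands in for RAM memory 15a

    public StateDataStore(Path flashFile) {
        this.flashFile = flashFile;
    }

    // Called by the first process when it detects a state-data change.
    public void onStateChanged(String key, String value) throws IOException {
        ramCache.put(key, value);                                            // fast, effectively unlimited rewrites
        Files.writeString(flashFile, key + "=" + value + System.lineSeparator(),
                StandardCharsets.UTF_8, StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    // Called by the data calling device when the application interface must be shown:
    // reading from the RAM-like cache avoids the slower FLASH read path.
    public String readInterfaceData(String key) {
        return ramCache.get(key);
    }
}

Reads that serve the application interface hit only the RAM-like cache, while the FLASH-like file preserves the state data across a power cycle.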
In a possible embodiment, the specific operation process of the terminal can refer to fig. 10, and fig. 10 is a flowchart illustrating the operation of the terminal according to a possible embodiment;
the first process 11a is configured to perform step S201 of listening for status data in response to the pull of the listening application;
the monitoring process of the status data may refer to the above embodiments, and is not described herein again.
The first process 11a is further configured to execute step S202: in response to detecting a change in the state data, transmitting the changed state data to the RAM memory 15a;
the state data can be changed by adding the state data, and the state data can be changed by replacing the state data and deleting the state data.
For example, the monitoring application may be a recently-used-applications application, which monitors which applications the terminal has pulled up; the corresponding state data records the relevant data of the applications pulled up by the terminal. Each time the terminal pulls up a new application, the state data of that application is written into the first process 11a; the first process 11a detects the addition of state data and transmits the changed state data to the RAM memory 15a.
In a feasible embodiment, the terminal is powered on at 10:00 and the first process 11a of the recently-used-applications application (referred to as the first process 11a in this example) is in the pulled-up state. At 10:01 the terminal pulls up the Internet Explorer application; the first process 11a records the state data of the Internet Explorer application and writes it into the RAM memory 15a. At 10:05 the terminal pulls up the screenshot tool application; the first process 11a records the state data of the screenshot tool application and writes both the state data of the screenshot tool application and the previously recorded state data of the Internet Explorer application into the RAM memory 15a. The state data written into the RAM memory 15a at 10:05 overwrites the state data written into the RAM memory 15a at 10:01.
The data calling device is configured to execute step S203: in response to the trigger operation of displaying the application interface, calling the interface data stored in the RAM memory 15a;
in the terminal shown in this embodiment, when the first process 11a detects a change in the state data, it writes the changed data into the RAM memory 15a, and the data calling device, in response to the trigger operation of displaying the application interface, calls the interface data stored in the RAM memory 15a. The RAM memory 15a has practically unlimited write endurance, so erasing and rewriting it many times during terminal operation does not affect its lifetime. Furthermore, because the read-write rate of the RAM memory 15a is much higher than that of the FLASH memory 14a, the terminal can call the corresponding interface data directly from the RAM memory 15a during operation, which improves the data response efficiency of the terminal.
The data calling device is configured to execute step S204: outputting the interface data to the second process 12a;
the second process 12a is configured to perform step S205 of sending the received interface data to the graphics processor 13a, so that the control graphics processor 13a renders an application interface based on the interface data.
The rendering process of the application interface may adopt an interface rendering mode commonly used in the art, which is not described in detail by the applicant herein.
S206, in response to the trigger operation of closing the application interface, the second process 12a is released.
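A minimal sketch of steps S201 to S206 is given below, assuming the process boundaries are simulated with plain Java objects and a shared map standing in for the RAM memory 15a. The names FirstProcess, DataCaller and SecondProcess are hypothetical; a real terminal would use separate OS processes, inter-process communication and a graphics processor rather than a println.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ListeningFlowDemo {

    /** Stands in for the RAM memory 15a. */
    static final Map<String, String> RAM = new ConcurrentHashMap<>();

    /** S201/S202: listens for state changes and writes the changed data to memory. */
    static class FirstProcess {
        void onStateChanged(String key, String interfaceData) {
            RAM.put(key, interfaceData);                      // transfer changed state data
        }
    }

    /** S205: receives interface data and hands it to the graphics processor for rendering. */
    static class SecondProcess {
        void render(String interfaceData) {
            System.out.println("GPU renders interface from: " + interfaceData);
        }
    }

    /** S203/S204: fetches interface data from memory and outputs it to the second process. */
    static class DataCaller {
        void onShowInterface(String key, SecondProcess second) {
            String interfaceData = RAM.getOrDefault(key, "{}");
            second.render(interfaceData);
        }
    }

    public static void main(String[] args) {
        FirstProcess first = new FirstProcess();              // pulled up with the terminal
        first.onStateChanged("recent_apps", "{\"apps\":[\"browser\"]}");

        SecondProcess second = new SecondProcess();           // pulled up only on demand
        new DataCaller().onShowInterface("recent_apps", second);

        second = null;                                        // S206 (simulated): release the second process
    }
}
```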
For the recently-used-applications application, in order to reduce the data transmission amount of the first process 11a, in a feasible embodiment the first process 11a may, in response to detecting a change in the state data, send only target sub-data to the RAM memory 15a, where the target sub-data is the state data corresponding to the N applications most recently pulled up by the terminal.
Specifically, in practical use a user usually only needs the application interface of the recently-used-applications application to show the N applications most recently pulled up by the terminal. Based on this, each time the first process 11a detects a state data change, it only needs to write the state data corresponding to the N most recently pulled-up applications into the RAM memory 15a, where N can be set according to user requirements and is not further limited by the applicant here.
The target sub-data may be selected as follows: the controller 1a configures a timestamp for each piece of state data, the timestamp recording the pull-up time of the application; the first process is configured to sort the state data in order of timestamp from earliest to latest and, in response to detecting a change in the state data, to select the state data corresponding to the last N timestamps as the target sub-data.
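The selection of the target sub-data amounts to sorting the state records by their pull-up timestamps and keeping the last N of them. The sketch below (Java 16+ for the record type) shows one possible way to express this; the AppState record and the method name selectTargetSubData are illustrative assumptions, not the patent's code.

```java
import java.time.LocalTime;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class TargetSubDataDemo {

    record AppState(String appName, LocalTime pulledUpAt) {}

    /** Returns the state data of the N most recently pulled-up applications. */
    static List<AppState> selectTargetSubData(List<AppState> all, int n) {
        List<AppState> sorted = all.stream()
                .sorted(Comparator.comparing(AppState::pulledUpAt))   // earliest -> latest
                .collect(Collectors.toList());
        return sorted.subList(Math.max(0, sorted.size() - n), sorted.size());
    }

    public static void main(String[] args) {
        List<AppState> states = List.of(
                new AppState("Internet Explorer", LocalTime.of(10, 1)),
                new AppState("network disk",      LocalTime.of(10, 2)),
                new AppState("video",             LocalTime.of(10, 3)),
                new AppState("karaoke",           LocalTime.of(10, 4)),
                new AppState("qq",                LocalTime.of(10, 5)),
                new AppState("wechat",            LocalTime.of(10, 6)));
        // With N = 5 the Internet Explorer entry (the oldest timestamp) is dropped.
        selectTargetSubData(states, 5).forEach(s -> System.out.println(s.appName()));
    }
}
```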
In a feasible embodiment, the terminal is powered on at 10:00, the first process 11a of the recently-used-applications application (referred to as the first process 11a in this example) is in the pulled-up state, and the terminal pulls up a number of applications in the period from 10:00 to 10:10:
at 10:01 the terminal pulls up the Internet Explorer application; the first process 11a records the state data of the Internet Explorer application, and the timestamp corresponding to this state data is 10:01;
at 10:02 the terminal pulls up the Baidu network disk application; the first process 11a records the state data of the Baidu network disk application, and the timestamp corresponding to this state data is 10:02;
at 10:03 the terminal pulls up the Juhaokan application; the first process 11a records the state data of the Juhaokan application, and the timestamp corresponding to this state data is 10:03;
at 10:04 the terminal pulls up the karaoke application; the first process 11a records the state data of the karaoke application, and the timestamp corresponding to this state data is 10:04;
at 10:05 the terminal pulls up the QQ application; the first process 11a records the state data of the QQ application, and the timestamp corresponding to this state data is 10:05;
at 10:06 the terminal pulls up the WeChat application; the first process 11a records the state data of the WeChat application, and the timestamp corresponding to this state data is 10:06;
At 10:06, after the first process 11a records the state data of the WeChat application, the first process 11a holds 6 pieces of state data: the state data of the Internet Explorer application, the Baidu network disk application, the Juhaokan application, the karaoke application, the QQ application and the WeChat application. Sorted by timestamp from earliest to latest, the order is: 1. state data of the Internet Explorer application; 2. state data of the Baidu network disk application; 3. state data of the Juhaokan application; 4. state data of the karaoke application; 5. state data of the QQ application; 6. state data of the WeChat application. In this embodiment N is 5, so the first process 11a selects items 2 to 6, that is, the state data of the Baidu network disk, Juhaokan, karaoke, QQ and WeChat applications, as the target sub-data and writes them into the RAM memory 15a.
Although the RAM memory 15a offers better read-write efficiency, the data recorded in it is erased when power is removed. To ensure that, after being powered on again, the terminal can still obtain the applications that were pulled up before power-off, in the technical solution shown in the embodiment of the present application the first process 11a is configured to transmit the changed state data to both the RAM memory 15a and the FLASH memory 14a in response to detecting a change in the state data. During normal operation, the data calling device, in response to a trigger operation for displaying the application interface, calls the interface data stored in the RAM memory 15a and outputs it to the second process. After the display device is restarted, the data calling device, in response to the trigger operation for displaying the application interface, calls the interface data stored in the FLASH memory 14a and outputs it to the second process.
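The dual-write strategy just described can be summarized in a small, hypothetical sketch: every state change is mirrored into both a volatile store and a persistent store, reads prefer the fast volatile copy, and after a restart (when the volatile copy is gone) they fall back to the persistent copy. Two in-memory maps stand in for the RAM memory 15a and the FLASH memory 14a here; in a real terminal the second would be actual flash storage.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class DualWriteDemo {

    static final Map<String, String> RAM = new HashMap<>();    // lost on power-off
    static final Map<String, String> FLASH = new HashMap<>();  // survives power-off (flash/file in reality)

    /** First process: mirror every change into both memories. */
    static void onStateChanged(String key, String data) {
        RAM.put(key, data);
        FLASH.put(key, data);
    }

    /** Data caller: prefer the fast RAM copy, fall back to FLASH after a restart. */
    static Optional<String> fetchInterfaceData(String key) {
        String fromRam = RAM.get(key);
        return fromRam != null ? Optional.of(fromRam) : Optional.ofNullable(FLASH.get(key));
    }

    public static void main(String[] args) {
        onStateChanged("recent_apps", "{\"apps\":[\"browser\",\"screenshot\"]}");
        System.out.println("normal run : " + fetchInterfaceData("recent_apps").orElse("<none>"));

        RAM.clear();                                           // simulate power-off and restart
        System.out.println("after boot : " + fetchInterfaceData("recent_apps").orElse("<none>"));
    }
}
```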
The specific operation process of the terminal can be seen in fig. 11. In step S301, in response to the pull-up of the monitoring application, the first process 11a corresponding to the monitoring application is pulled up, the first process 11a being used for listening to the state data;
the monitoring process of the status data may refer to the above embodiments, and is not described herein again.
In step S302, in response to detecting a change in the state data, the first process 11a transmits the changed state data to the FLASH memory 14a and the RAM memory 15a. In step S3031, in response to the trigger operation of displaying the application interface, the data calling device calls the interface data stored in the RAM memory 15a;
In step S3032, when the display device is restarted, in response to the trigger operation of displaying the application interface, the data calling device calls the interface data stored in the FLASH memory 14a.
For a specific data retrieving process, reference may be made to the above embodiments, which are not described herein again.
The data calling device is configured to execute step S304: outputting the interface data to the second process 12a;
for a specific data transmission process, reference may be made to the above embodiments, which are not described herein again.
The second process 12a is configured to execute step S305: sending the received interface data to the graphics processor 13a, so as to control the graphics processor 13a to render the application interface based on the interface data.
S306, in response to the trigger operation of closing the application interface, the second process 12a is released.
It can be seen that, in the technical solution shown in the embodiment of the present application, when the first process 11a detects that the state data has changed, the changed state data is written into the RAM memory 15a and the FLASH memory 14a at the same time. When the terminal operates normally, the interface data stored in the RAM memory 15a is called in response to the trigger operation of displaying the application interface and output to the second process. The RAM memory 15a has practically unlimited write endurance, so erasing and rewriting it many times during terminal operation does not affect its lifetime; and because its read-write rate is much higher than that of the FLASH memory 14a, calling the interface data directly from the RAM memory 15a improves the data response efficiency of the terminal. When the display device is restarted after power-off, the data calling device, in response to the trigger operation of displaying the application interface, calls the corresponding interface data from the FLASH memory 14a and transmits it to the second process 12a, so that the application interface can still be rendered.
A second aspect of the embodiment of the present application provides an application processing method; referring specifically to fig. 12, the method includes the following steps:
S401: in response to the pull-up of the monitoring application, pulling up the first process 11a corresponding to the monitoring application, where the first process 11a is used for monitoring the state of an application program and recording state data, the state data including interface data;
S402: in response to a trigger operation for displaying the application interface, pulling up the second process 12a, where the second process 12a is used for sending the received interface data to the graphics processor 13a so as to control the graphics processor 13a to render the application interface based on the interface data;
S403: controlling the display 2a to display the application interface;
S404: in response to the trigger operation of closing the application interface, releasing the second process 12a.
The application processing method disclosed by the embodiment of the application is applied to a terminal. The terminal sets two processes for the monitoring application: the first process, which listens for state data, is kept in a pulled-up state at all times, while the second process, which handles interface data, is pulled up only when the application interface needs to be displayed and is released when the application interface is closed, thereby reducing the memory occupied by the application interface of the monitoring application in the non-display state, avoiding unnecessary power consumption of the terminal, and reducing the probability of terminal stutter.
Embodiments of the present application further provide a computer storage medium, where the computer storage medium includes computer instructions, and when the computer instructions are run on the electronic device, the electronic device is caused to perform each function or step performed by the electronic device in the foregoing method embodiments.
Embodiments of the present application further provide a computer program product, which, when running on a computer, causes the computer to perform each function or step performed by the electronic device in the above method embodiments.
Through the description of the above embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
It is understood that, on the basis of the several embodiments provided in the present application, a person skilled in the art may combine, split or recombine the embodiments to obtain other embodiments, none of which departs from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a division by logical function, and other division manners are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A terminal, comprising:
the display is used for displaying the application interface;
the controller is used for setting a first process and a second process aiming at one monitoring application;
responding to the pull-up of the monitoring application, starting the first process, wherein the first process is used for monitoring state data and transmitting the state data to a memory, the state data comprises interface data, and the state data is used for recording the state of a terminal or the state of an application;
responding to a trigger operation for displaying the application interface, and starting the second process, wherein the second process is used for sending the received interface data to the graphics processor so that the graphics processor renders the application interface based on the interface data;
the controller is also provided with a data calling device, and the data calling device is configured to call the interface data stored in the memory in response to the trigger operation of displaying the application interface and output the interface data to a second process;
controlling the display to show the application interface;
and releasing the second process in response to the trigger operation of closing the application interface.
2. The terminal of claim 1, wherein the status data comprises network data and local data; the first process comprises a first thread and a second thread, the first thread is used for monitoring local data, and the second thread is used for monitoring network data.
3. The terminal of claim 1, wherein the first process is configured to, in response to listening for a change in state data, transfer the changed state data to a RAM memory.
4. The terminal of claim 3, wherein the first process is configured to transmit the changed state data to a FLASH memory in response to listening for a change in the state data.
5. The terminal of claim 1, wherein in response to monitoring the change of the state data, the first process is configured to send target sub-data to a FLASH memory, where the target sub-data is state data corresponding to N applications recently pulled up by the terminal.
6. The terminal of claim 1, wherein in response to monitoring the change of the state data, the first process is configured to send target sub-data to a RAM memory, where the target sub-data is state data corresponding to N applications recently pulled up by the terminal.
7. The terminal according to claim 5 or 6, wherein each state data is configured with a time stamp, and the time stamp is used for recording the pull-up time of the application;
the first process is configured to: sorting the state data according to the sequence of the time stamps from first to last;
and responding to the monitored change of the state data, selecting the state data corresponding to the last N timestamps as the target sub-data.
8. A terminal according to claim 3 or 4, wherein the controller is further provided with a data calling device;
the data calling device is configured to call the interface data stored in the RAM memory in response to a trigger operation for displaying an application interface;
and outputting the interface data to a second process.
9. The terminal according to claim 4, wherein the controller is further provided with a data calling device;
the data calling device is configured to call the interface data stored in the FLASH memory in response to a trigger operation for displaying an application interface when the display is restarted;
and outputting the interface data to a second process.
10. A method for processing an application, comprising:
starting a first process corresponding to the monitoring application in response to the pull-up of the monitoring application, wherein the first process is used for monitoring state data and transmitting the state data to a memory; the status data comprises interface data; the state data is used for recording the state of the terminal or the state of the application;
responding to a trigger operation for displaying the application interface, and starting a second process, wherein the second process is used for sending the received interface data to the graphics processor so that the graphics processor renders the application interface based on the interface data;
the method further comprises the following steps: responding to the trigger operation of displaying the application interface, calling the interface data stored in the memory, and outputting the interface data to a second process;
controlling a display to show the application interface;
and releasing the second process in response to the trigger operation of closing the application interface.
CN202010371207.5A 2020-05-06 2020-05-06 Terminal and application processing method Active CN111586481B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010371207.5A CN111586481B (en) 2020-05-06 2020-05-06 Terminal and application processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010371207.5A CN111586481B (en) 2020-05-06 2020-05-06 Terminal and application processing method

Publications (2)

Publication Number Publication Date
CN111586481A CN111586481A (en) 2020-08-25
CN111586481B true CN111586481B (en) 2022-06-14

Family

ID=72118648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010371207.5A Active CN111586481B (en) 2020-05-06 2020-05-06 Terminal and application processing method

Country Status (1)

Country Link
CN (1) CN111586481B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1433540A (en) * 1999-12-09 2003-07-30 英特尔公司 Method and apparatus for entering and exiting multiple threads within multithreaded processor
CN103176854A (en) * 2011-12-26 2013-06-26 腾讯科技(深圳)有限公司 Process-to-process communication method, device and system
CN103645947A (en) * 2013-11-25 2014-03-19 北京航空航天大学 MIL-STD-1553B bus monitoring and data analysis system
CN104375880A (en) * 2014-09-18 2015-02-25 腾讯科技(深圳)有限公司 Memory freeing method and device
JP2017194835A (en) * 2016-04-20 2017-10-26 富士通株式会社 Authentication program, authentication method, and authentication apparatus
CN109995937A (en) * 2019-03-14 2019-07-09 努比亚技术有限公司 Application icon display methods, mobile terminal and computer readable storage medium
CN110275782A (en) * 2018-03-13 2019-09-24 阿里巴巴集团控股有限公司 Data processing method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1169160A (en) * 1980-09-16 1984-06-12 Daniel J. Galdun Video computer terminal with detachable intelligent keyboard module
CN104407965B (en) * 2014-12-17 2018-05-04 北京元心科技有限公司 The display methods and system of a kind of graphical interface window
CN108063980A (en) * 2017-12-20 2018-05-22 深圳市康冠技术有限公司 A kind of TV application software management method, system and device
CN110708581B (en) * 2019-08-27 2021-09-24 海信视像科技股份有限公司 Display device and method for presenting multimedia screen saver information

Also Published As

Publication number Publication date
CN111586481A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
CN111200746B (en) Method for awakening display equipment in standby state and display equipment
CN111654743B (en) Audio playing method and display device
CN111954059A (en) Screen saver display method and display device
CN114073098A (en) Streaming media synchronization method and display device
CN111601144A (en) Streaming media file playing method and display equipment
CN111417027A (en) Method for switching small window playing of full-screen playing of webpage video and display equipment
CN111277891A (en) Program recording prompting method and display equipment
CN111654729B (en) Account login state updating method and display device
CN111757181B (en) Method for reducing network media definition jitter and display device
CN113378092A (en) Video playing management method and display equipment
WO2021169168A1 (en) Video file preview method and display device
CN111324411B (en) User interface upgrading method in display equipment and display equipment
CN112203154A (en) Display device
CN112040308A (en) HDMI channel switching method and display device
CN111586481B (en) Terminal and application processing method
CN112506859B (en) Method for maintaining hard disk data and display device
CN111885415B (en) Audio data rapid output method and display device
CN111417022B (en) Conflict detection method and display device
CN111988648A (en) Time display method and display device
CN111679789A (en) Write-in control method and display device
CN113329246A (en) Display device and shutdown method
CN111562887B (en) Display device and partition capacity expansion method
CN112040317B (en) Event response method and display device
CN111586482B (en) Starting method and device
CN111901677B (en) Method for uniformly controlling process starting and display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant