CN116347143A - Display equipment and double-application same-screen display method - Google Patents

Display equipment and double-application same-screen display method

Info

Publication number
CN116347143A
Authority
CN
China
Prior art keywords
application
display
interface
height
width
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310133094.9A
Other languages
Chinese (zh)
Inventor
张敬坤
庞秀娟
李金昆
李乃金
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202310133094.9A
Publication of CN116347143A
Legal status: Pending


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a display device and a dual-application same-screen display method. In response to a control instruction for dual-application same-screen display, the start state of a first application and the start state of a second application are acquired, wherein the second application is an upper-layer application that establishes an interaction relationship with the first application. If the start states of both the first application and the second application are ready, the display is controlled to display a user interface of the first application, and a floating interface is displayed on an upper layer of that interface for displaying a user interface corresponding to the second application. If the start state of the first application is not ready and the start state of the second application is ready, the display is controlled to display the user interface corresponding to the second application. The method can simultaneously display two mutually interacting application interfaces on the same display device, realizing dual-application same-screen interactive display, improving the screen utilization of the display, and improving the user experience.

Description

Display equipment and double-application same-screen display method
Technical Field
The application belongs to the field of on-screen display, and particularly relates to a display device and a dual-application same-screen display method.
Background
A display device is a terminal device capable of outputting a specific display picture, such as a smart television, a communication terminal, a smart advertising screen, or a projector. Taking the smart television as an example: based on Internet application technology, it has an open operating system and chip and an open application platform, can realize a bidirectional human-computer interaction function, and is a television product integrating multiple functions such as video, entertainment, and data, thereby meeting the diversified and personalized needs of users.
The display device may also install various applications according to the user's needs, for example conventional video applications, social applications such as short video, and reading applications such as comics and e-books. Different types of applications may support different display modes; for example, some applications are displayed in landscape and others in portrait. However, the display device has no logic for displaying multiple applications playing video simultaneously, and the same display device typically cannot display the application interfaces of two or more applications at the same time. A floating application interface can be realized through the device's native interface, but data communication with the underlying application cannot be achieved; it depends on the system version and requires modification of underlying code, so it is strongly limited.
Disclosure of Invention
Some embodiments of the present application provide a dual-application on-screen display method, so as to solve the problem that the same display device cannot display two application interfaces at the same time, resulting in poor user experience.
In a first aspect, some embodiments of the present application provide a display device including a display and a controller. Wherein the display is configured to display a user interface; the controller is configured to perform the following program steps:
responding to a control instruction for dual-application same-screen display by acquiring the start state of a first application and the start state of a second application; the second application is an upper-layer application that establishes an interaction relationship with the first application; the start state is either a ready state or a not-ready state;
if the start state of the first application and the start state of the second application are both ready, controlling the display to display a user interface of the first application, and displaying a floating interface on an upper layer of the user interface of the first application, wherein the floating interface is used for displaying a user interface corresponding to the second application;
and if the start state of the first application is not ready and the start state of the second application is ready, controlling the display to display the user interface corresponding to the second application.
In a second aspect, some embodiments of the present application further provide a dual application on-screen display method, where the method includes:
responding to a control instruction for dual-application same-screen display by acquiring the start state of a first application and the start state of a second application; the second application is an upper-layer application that establishes an interaction relationship with the first application; the start state is either a ready state or a not-ready state;
if the start state of the first application and the start state of the second application are both ready, controlling the display to display a user interface of the first application, and displaying a floating interface on an upper layer of the user interface of the first application, wherein the floating interface is used for displaying a user interface corresponding to the second application;
and if the start state of the first application is not ready and the start state of the second application is ready, controlling the display to display the user interface corresponding to the second application.
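The two display branches above can be sketched as a small decision function. This is an illustrative sketch only; the `DisplayMode` enum and `decide` method are hypothetical names, not part of the disclosed implementation.

```java
// Illustrative sketch of the two claim branches; names are hypothetical.
enum DisplayMode { FIRST_WITH_FLOATING_SECOND, SECOND_ONLY, NONE }

class DualScreenDecision {
    // firstReady / secondReady: whether each application's start state is "ready",
    // as acquired in response to the dual-application control instruction.
    static DisplayMode decide(boolean firstReady, boolean secondReady) {
        if (firstReady && secondReady) {
            // First application full screen, second application's user interface
            // rendered in a floating interface layered above it.
            return DisplayMode.FIRST_WITH_FLOATING_SECOND;
        }
        if (!firstReady && secondReady) {
            // Only the second application's user interface is displayed.
            return DisplayMode.SECOND_ONLY;
        }
        return DisplayMode.NONE; // The claims above do not define this branch.
    }
}
```

Note that the claims leave the remaining combinations (second application not ready) unspecified, which the sketch represents with the `NONE` branch.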
According to the above scheme, the display device and the dual-application same-screen display method can, in response to a control instruction for dual-application same-screen display, acquire the start state of the first application and the start state of the second application, wherein the second application is an upper-layer application that establishes an interaction relationship with the first application. If the start states of both applications are ready, the display is controlled to display the user interface of the first application, with a floating interface on its upper layer for displaying the user interface corresponding to the second application. If the start state of the first application is not ready and that of the second application is ready, the display is controlled to display the user interface corresponding to the second application. The method can simultaneously display two mutually interacting application interfaces on the same display device, realizing dual-application same-screen interactive display, improving the screen utilization of the display, and improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a usage scenario of a display device according to an embodiment of the present application;
fig. 2 is a hardware configuration diagram of a display device in an embodiment of the present application;
FIG. 3 is a flowchart of one embodiment of a dual application on-screen display method provided in the present application;
FIG. 4 is a flowchart of a second embodiment of a dual-application on-screen display method provided in the present application;
fig. 5 is a schematic flow chart of a third application replacement target switching interface according to an embodiment of the present application;
FIG. 6 is a schematic flow chart of establishing an interaction relationship between a first application and a second application in the embodiment of the present application;
FIG. 7 is a schematic diagram of a prompt interface according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a cross-screen display hover interface according to an embodiment of the disclosure;
FIG. 9 is a schematic diagram of a vertical screen display hover interface according to an embodiment of the disclosure;
FIG. 10 is a schematic diagram of a hover interface responsive to a motion event according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a hover interface response magnification event according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a hover interface response zoom out event according to an embodiment of the present application.
Detailed Description
For purposes of clarity and implementation of the present application, the following makes a clear and complete description of exemplary implementations of the present application with reference to the accompanying drawings, in which exemplary implementations are illustrated. Obviously, the exemplary implementations described are only some, but not all, of the examples of the present application.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," "second," "third," and the like in the description, in the claims, and in the above-described figures are used for distinguishing between similar objects or entities and are not necessarily intended to limit a particular order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The display device provided in the embodiment of the application may have various implementation forms, and may be, for example, a television, a laser projection device, a display (monitor), an electronic whiteboard (electronic bulletin board), an electronic desktop (electronic table), and the like.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the control apparatus 100 or the control device 300.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device 200 includes infrared protocol communication, bluetooth protocol communication, and other short-range communication methods, and the display device 200 is controlled wirelessly or by wire. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
In some embodiments, the control device 300 (e.g., mobile phone, tablet, computer, notebook, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on the control device 300.
In some embodiments, the display apparatus 200 may receive the user's control not through the control apparatus 100 or the control device 300 described above, but through touch, gestures, or the like.
In some embodiments, the display device 200 may also be controlled in manners other than through the control apparatus 100 and the control device 300. For example, the user's voice instructions may be received directly through a module configured inside the display device 200, or through a voice control device configured outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be permitted to make communication connections via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be a cluster, or may be multiple clusters, and may include one or more types of servers.
As shown in fig. 2, the display apparatus 200 may include at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller 250 may include a processor, a video processor, an audio processor, a graphic processor, a RAM, a ROM, and first to nth interfaces for input/output.
Display 260 may include the following components: a display screen assembly for presenting a picture; a driving assembly that drives image display; a component for receiving an image signal output from the controller 250 and displaying video content, image content, and menu manipulation interfaces; a component through which the user manipulates the UI interface; and the like.
The display 260 may be a liquid crystal display, an OLED display, a projection device, or a projection screen.
The communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver. The display device 200 may establish transmission and reception of control signals and data signals with the external control device 100 or the server 400 through the communicator 220.
A user interface, which may be used to receive control signals from the control device 100 (e.g., an infrared remote control, etc.).
The detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for capturing the intensity of ambient light; alternatively, the detector 230 includes an image collector such as a camera, which may be used to collect external environmental scenes, user attributes, or user interaction gestures, or alternatively, the detector 230 includes a sound collector such as a microphone, or the like, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, etc. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
The modem 210 receives broadcast television signals through a wired or wireless reception manner, and demodulates audio and video signals, such as EPG data signals, from a plurality of wireless or wired broadcast television signals. In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like.
The controller 250 controls the operation of the display device and responds to the user's operations through various software control programs stored on the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command to select a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the controller 250 includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), random access memory (RAM), read-only memory (ROM), first to nth interfaces for input/output, a communication bus, and the like.
The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
The user interface is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form for user interfaces is the graphical user interface (GUI), which refers to a graphically displayed user interface associated with computer operations. It may be an interface element such as an icon, a window, or a control displayed in the display screen of the electronic device, where controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and the like.
In some embodiments, the system is divided into four layers, from top to bottom: an application layer (abbreviated as "application layer"), an application framework layer (abbreviated as "framework layer"), an Android runtime and system library layer (abbreviated as "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, or the like; or may be an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (API) and a programming framework for applications. The application framework layer includes a number of predefined functions and acts as a processing center that decides how the applications in the application layer act. Through the API, an application program can access system resources and obtain system services during execution.
In some embodiments, the display apparatus 200 may start a corresponding application in response to an application start instruction input by a user, and display the application's user interface or media asset picture through the display 260. However, the display device 200 has no logic for multiple applications to play video simultaneously; that is, the same display device 200 cannot display the application interfaces of two or more applications at the same time, so multiple applications cannot be displayed on the same screen.
In some embodiments, when a user wants to view two applications simultaneously, a floating application interface may be implemented through the device's native interface; that is, a movable floating interface is formed on top of the user interface of the application being displayed on display 260 to show a different application. For example, the lower-layer application may be played full screen while the upper-layer application floats in a window above it.
This implementation can display the user interfaces of two applications on the same screen, but the upper-layer application cannot communicate data with the lower-layer application, and depending on the system version, modification of underlying code is required, so it is strongly limited. Illustratively, take the scenario in which the display device 200 starts a fitness application. When using the fitness application, the user needs to watch its action teaching and observe whether his or her own actions are correct, so as to adjust them in time and thereby achieve the purpose of learning and training. This requires the display device 200 to synchronously display a preview of the user's own motion picture while displaying the action-teaching picture or video in the fitness application; that is, the image acquisition interface of the display device 200 captures the user's motion picture.
To support capturing the user's exercise video, the display device 200 may integrate a software development kit for screen casting into its fitness application, which may be combined with the image acquisition interface of the display device 200, provided the user authorizes the screen-casting permission when opening the fitness application. When the user moves into its acquisition range, the image acquisition interface captures the user's exercise actions, which are then displayed through the fitness application. After the display device 200 receives the screen-casting instruction, the lower-layer fitness application displays the fitness-teaching video picture, and the upper-layer application can float above it to display the user's fitness actions.
In the above embodiment, the native interface of the display device 200 can implement same-screen display of two applications in a floating manner, but it cannot implement data communication between the upper-layer and lower-layer applications displayed on the same screen. Interactive communication between applications relies on modification of the underlying code of the system version of the display device 200, such as system upgrades and version maintenance, which imposes significant limitations.
In order to solve the problem that the same display device 200 cannot display the user interfaces of two applications at the same time, some embodiments of the present application provide a display device 200 that includes at least a display 260 and a controller 250. The display 260 is configured to display the user interface corresponding to an application. The controller 250 is configured to perform a dual-application same-screen display method capable of displaying at least two interactable applications on the same screen and dynamically adjusting their display layouts. As shown in fig. 3, the method includes:
S100: in response to a control instruction for dual-application same-screen display, acquire the start state of the first application and the start state of the second application.
The display device 200 may present an application program interface that contains the applications installed in the display device 200. During operation of the display device 200, the user may select any application in this interface to start and run it by inputting a control instruction, so that the current display picture jumps to the started application's interface. Depending on the display device 200, the user may input the control instruction by means of the control apparatus 100 or by touching the screen.
When the controller 250 simultaneously starts two applications, it is necessary to start the first application and the second application in sequence, and for convenience of distinction, the application started first is referred to as a first application, and the application started later is referred to as a second application. In some embodiments, the second application may also be an upper-level application that establishes an interaction relationship with the first application.
In some embodiments, the number of applications displayed on the same screen may also be greater than two, and the maximum number of displayed applications may be determined based on the screen area of the display 260. Some embodiments of the present application take the display device 200 displaying the first application and the second application on the same screen only as an exemplary description. The control instruction may include two start instructions corresponding to the first application and the second application respectively: one start instruction starts the first application, and the other starts the second application. Taking a display 260 with a touch function as an example, the controller 250 may generate the corresponding start instructions by monitoring the user's click operations on the icons of the first and second applications on the screen of the display 260, and sequentially start the first application and the second application in response to those start instructions.
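The click-to-launch sequencing above can be sketched as follows. This is a minimal in-process sketch; the class, method, and application names are hypothetical and stand in for the controller's actual instruction handling.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: the controller collects start instructions generated by icon
// clicks and launches applications in click order. Names are hypothetical.
class LaunchSequencer {
    private final List<String> launchOrder = new ArrayList<>();

    // One click on an application icon yields one start instruction.
    void onIconClicked(String appName) {
        launchOrder.add(appName);
    }

    // By convention, the application started first is the "first application".
    String firstApplication() {
        return launchOrder.isEmpty() ? null : launchOrder.get(0);
    }

    // The application started afterwards is the "second application".
    String secondApplication() {
        return launchOrder.size() < 2 ? null : launchOrder.get(1);
    }
}
```

The click order alone determines which application plays the "first" (lower-layer) role and which plays the "second" (upper-layer, floating) role.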
Before starting the first application and the second application, the controller 250 also needs to acquire their start states, which are either ready or not ready. Different status identifiers may be used to identify the different states of applications in the display device 200. For example, the status identifiers used and their meanings may include: "PREPARED" means that preparation is complete and the current application can be started; "FOCUS" means that preparation is complete and the current application can be started; "IDLE" means that preparation is in progress and the current application cannot be started; "FINISH" means that the interface has been exited and the current application cannot be started; "ERROR" means that the application is abnormal and the current application cannot be started.
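The status identifiers listed above can be modeled as a small enum. This is a hypothetical model for illustration; the enum and method names are not taken from the patent.

```java
// Hypothetical model of the start-state identifiers listed above;
// the enum name and isBootable() method are illustrative.
enum StartState {
    PREPARED(true),  // preparation complete: the application can be started
    FOCUS(true),     // preparation complete: startable under a specific condition
    IDLE(false),     // still preparing: not bootable
    FINISH(false),   // interface exited: not bootable
    ERROR(false);    // application abnormal: not bootable

    private final boolean bootable;

    StartState(boolean bootable) { this.bootable = bootable; }

    /** Whether this state allows the application to be started at all. */
    boolean isBootable() { return bootable; }
}
```

Only `PREPARED` and `FOCUS` map to the "ready" start state used in the claims; the remaining three all map to "not ready".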
Among these status identifiers, both "PREPARED" and "FOCUS" indicate that the application is ready to start. However, the start conditions of the two states differ. Background-service-class applications, for example some screen-casting applications that rely on a normally operating Wi-Fi network, can be started normally only in the "PREPARED" state. Non-background-service-class applications need to run under specific conditions, for example on a specific page; for these applications, which require special conditions to start, the "FOCUS" state controls starting under those specific conditions. After responding to the control instruction, the controller 250 needs to determine, according to the start states of the first application and the second application, whether they can be started normally.
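One plausible reading of the distinction above can be expressed as a check; the class, method, and parameter names here are illustrative, and the exact gating of "FOCUS" versus "PREPARED" is an assumption drawn from the paragraph, not a disclosed rule.

```java
// Sketch: whether an application may start, given its start-state
// identifier and whether it is a background-service-class application.
// This encodes one plausible reading of the description; names are hypothetical.
class LaunchCheck {
    // state: one of "PREPARED", "FOCUS", "IDLE", "FINISH", "ERROR".
    static boolean canStart(String state, boolean backgroundService) {
        if (backgroundService) {
            // Background-service apps (e.g. Wi-Fi-dependent screen casting)
            // start normally only from the "PREPARED" state.
            return "PREPARED".equals(state);
        }
        // Non-background apps start under a specific condition, which the
        // "FOCUS" state gates; "PREPARED" also indicates readiness.
        return "FOCUS".equals(state) || "PREPARED".equals(state);
    }
}
```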
In some embodiments, before the controller 250 starts the first application and the second application, it may also be necessary to determine whether the applications in the display device 200 support hover display. As shown in fig. 6, the controller 250 may traverse the application list of the display device and screen out, from the application list, the applications that support the hover function as the second application and the third application. In some embodiments, the applications supporting the hover function may be screened out by retrieving metadata information from the database of the display device 200 and traversing all applications of the complete machine through that metadata information.
It should be noted that, to distinguish them from the first application and the second application, all other applications supporting hover display may be regarded as third applications.
After the applications supporting the hover function are screened out, the client service of the first application is bound to that of the third application, and the client service of the second application is bound to that of the third application, so as to establish an interaction relationship between the first application and the third application and an interaction relationship between the second application and the third application. After the interaction relationships are established, the first application and the second application can each carry out interaction processes such as information exchange, data transmission, and resource sharing with the third application.
In some embodiments, taking the establishment of an interaction relationship between the first application and the third application as an example, when that relationship is established, interaction relationships between the first application and other applications may also be established by binding the client services of the other applications after the first application is started. After an interaction relationship is established successfully, the first application can perform cross-process message transfer with the other applications.
For example, the current start state of the first application may be notified to the third application or other applications; after receiving the notification message sent by the first application, the third application or other applications may execute subsequent procedures according to the notification message. For example, the connection state of the other applications, including successful connection or disconnection, may be monitored, or the interaction information of the current third application may be recorded and notified to the other applications.
And S200, if the starting state of the first application and the starting state of the second application are ready, controlling the display to display the user interface of the first application, and displaying a suspension interface on the upper layer of the user interface of the first application.
The controller 250 may determine the start states in turn according to the start sequence of the first application and the second application. If the start state of the first application is the ready state, the controller 250 controls the display 260 to display the user interface of the first application in full screen, and then determines the start state of the second application; if the start state of the second application is also the ready state, the user interface of the second application needs to be displayed in a floating manner. At this time, the controller 250 divides out a display area for the hover interface at a default position of the display 260, where the hover interface is located at an upper layer of the user interface of the first application and is used for displaying the user interface corresponding to the second application. The controller 250 then controls the display 260 to display the user interface corresponding to the second application, thereby realizing the same-screen display of the user interfaces of the two applications.
If the first application and the second application have an interactive relationship, when the controller 250 determines that the second application is in a ready state, an interactive connection between the first application and the second application is established, and after the establishment is successful, prompt information for notifying that the interactive relationship is established successfully is respectively sent to the user interface of the first application and the user interface of the second application.
In some embodiments, if the start state of the first application is the ready state, the controller 250 displays the user interface of the first application in full screen. If the start state of the second application is the not-ready state, the controller 250 sends a prompt message of the start failure of the second application to the user interface of the first application and re-detects the start state of the second application after a preset time interval. If the start state of the second application switches to the ready state within a preset number of detections, the hover interface is displayed on the user interface of the first application; if the start state of the second application is still not ready within the preset number of detections, a prompt message that the second application failed to start is sent to the user interface of the first application.
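The re-detection flow above can be sketched as a bounded polling loop: the state is re-checked at a fixed interval up to a preset number of detections. The method name, the parameterization, and the use of a `Supplier` to stand in for the actual state query are illustrative assumptions.

```java
import java.util.function.Supplier;

// Sketch of the bounded re-detection loop: after a failed start, the
// controller re-checks the application's start state at a fixed
// interval, up to a preset number of detections.
public class StartRetry {
    /**
     * Polls stateSupplier up to maxAttempts times, sleeping intervalMs
     * between attempts. Returns true as soon as the state reads ready.
     */
    public static boolean waitUntilReady(Supplier<Boolean> stateSupplier,
                                         int maxAttempts, long intervalMs) {
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            if (stateSupplier.get()) {
                return true; // ready: the hover interface can now be shown
            }
            try {
                Thread.sleep(intervalMs);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return false; // still not ready: report a start-failure prompt
    }
}
```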
During the running of an application, if a program error occurs or the user actively ends the first application, the full-screen picture is in a non-display state. To increase the screen utilization of the display 260, in some embodiments, if the start state of the first application switches from the ready state to the not-ready state, the controller 250 dismisses the hover interface and displays the user interface of the second application in full screen.
In some embodiments, the display device 200 may set the default hover position according to the display direction of the hover interface. For example, if the hover interface is displayed in landscape, it may hover at the upper left or upper right corner of the upper layer of the first application's user interface. If the hover interface is displayed in portrait, it may hover at the left or right side of the upper layer of the first application, so as to prevent the hover interface from blocking a large area of the first application's user interface and interfering with the user's viewing of it.
And S300, if the starting state of the first application is not ready, and the starting state of the second application is ready, controlling the display to display a user interface corresponding to the second application.
In this embodiment, as shown in fig. 4, if the start state of the first application is not ready, the controller 250 controls the display 260 to display the user interface corresponding to the second application, sends a prompt message of the start failure of the first application to the user interface of the second application, and re-detects the start state of the first application at preset time intervals. If the start state of the first application becomes ready, the controller 250 switches the full screen from the user interface of the second application to the user interface of the first application, and displays the hover interface on the upper layer of the user interface of the first application. If the start state of the first application is still not ready within the preset number of detections, the controller 250 sends a prompt message to the user interface of the second application indicating that the first application cannot be started.
The user may also choose to switch the applications displayed on the same screen according to his own needs; for example, the user may start other applications through the control device 100 or the touch function to replace the first application or the second application. In some embodiments, as shown in fig. 5, the controller 250 may acquire the start state of the third application in response to a switching instruction input by the user. The switching instruction includes a target switching interface designated by the user, where the target switching interface is the user interface of the first application or the hover interface.
If the start-up state of the third application is the ready state, the controller 250 will switch the target switching interface to the user interface of the third application. For example, when the target switching interface is the user interface of the first application, after switching, the controller 250 may display the user interface of the third application in full screen, and display a floating interface on the upper layer of the user interface of the third application, where the user interface of the second application is still displayed. If the target switching interface is a floating interface, after switching, the controller 250 still displays the user interface of the first application in full screen, but the user interface of the second application in the floating interface is switched to the application interface of the third application.
In some embodiments, before the hover interface is displayed on the upper layer of the first application, the display device 200 may display an operation prompt interface on top of the first application, for prompting and guiding the user in performing the dual-application same-screen display operation. Accordingly, after the hover interface is displayed on the upper layer of the first application, a hover receipt may be sent to inform the first application that the hover operation has started; the hover receipt can be understood as receipt information indicating that the second application has been successfully displayed in the form of the hover interface. When the first application receives the hover receipt, it can exit the operation prompt interface according to the receipt.
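The receipt handshake above can be sketched as a simple listener callback: the second application posts a receipt once its hover display succeeds, and the first application dismisses its operation-prompt interface on delivery. The class, interface, and method names below are illustrative assumptions, not the patent's actual API.

```java
// Minimal sketch of the "hover receipt" handshake: the second
// application calls sendReceipt() after it is shown in the hover
// interface, and the registered listener (the first application)
// reacts by dismissing its operation prompt.
public class HoverReceipt {
    public interface ReceiptListener { void onHoverStarted(); }

    private ReceiptListener listener;
    private boolean promptVisible = true; // operation prompt shown first

    public void setListener(ReceiptListener l) { listener = l; }

    /** Called by the second application after its hover display succeeds. */
    public void sendReceipt() {
        if (listener != null) listener.onHoverStarted();
    }

    public boolean isPromptVisible() { return promptVisible; }

    /** The first application exits the operation prompt interface. */
    public void dismissPrompt() { promptVisible = false; }
}
```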
As shown in fig. 7, the first application is a fitness application. To achieve the purpose of learning while training, the user needs to cast, through the second application, the video or the image captured by the image acquisition interface in real time into the hover interface. In fig. 7, a prompt interface is displayed on the upper layer of the fitness application's interface to prompt the user to perform the dual-application same-screen display operation.
In some embodiments, controller 250 may also determine a display position of the hover interface based on a display direction of the hover interface. The controller 250 needs to read the display direction of the hover interface before displaying the hover interface. The display directions comprise a horizontal screen display and a vertical screen display, and the layout sizes of the suspension interfaces are different corresponding to the two different display directions.
As shown in fig. 8, if the display direction is a landscape display, a first layout size is acquired, wherein the first layout size includes a first layout coordinate and a first window size. The first layout coordinates are the display coordinates of the hover interface (landscape) in the screen of the display 260, representing the center point coordinates of the hover interface. The first window size is the boundary size forming the hover interface; it includes a first window height and a first window width, and when the display direction of the hover interface is landscape display, the first window width is greater than the first window height.
After the first layout size is obtained, the controller 250 sets a display position of the floating interface according to the first layout coordinates, and adjusts a display area of the floating interface according to the first window size, so that the height of the floating interface is the first window height, and the width of the floating interface is the first window width.
As shown in fig. 9, if the display direction is vertical screen display, a second layout size is acquired, wherein the second layout size includes a second layout coordinate and a second window size. The second layout coordinates are display coordinates of a hover interface (portrait) located in the screen of the display 260, representing the center point coordinates of the hover interface. The second window size is a boundary size for forming the floating interface, the second window size comprises a second window height and a second window width, and when the display direction of the floating interface is vertical screen display, the second window height is larger than the second window width.
After the second layout size is obtained, the controller 250 sets a display position of the floating interface according to the second layout coordinates, and adjusts a display area of the floating interface according to the second window size, so that the height of the floating interface is the second window height, and the width of the floating interface is the second window width.
In some embodiments, controller 250 may also calculate the window size of the hover interface based on the resolution size of the video stream for the corresponding application.
If the display direction is landscape display, the controller 250 needs to acquire the video stream to be played by the second application and parse it to obtain the video stream width and the video stream height. If the video stream width is greater than or equal to the video stream height, the video stream is played in landscape; the first window height is determined to be a default height, the height-width ratio of the hover interface in this display direction is obtained, and the first window width is calculated from the default height and the height-width ratio.
In some embodiments, if the video stream width is smaller than the video stream height, the video stream is played in portrait; the first window width is determined to be a default width, the height-width ratio of the display direction is obtained, and the first window height is calculated from the default width and the height-width ratio.
If the display direction is vertical screen display and the video stream width is greater than or equal to the video stream height, the second window width is determined to be a default width, and the second window height is calculated from the default width and the height-width ratio.
If the video stream width is smaller than the video stream height, the second window height is determined to be a default height, and the second window width is calculated from the default height and the height-width ratio.
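The four cases above can be sketched in one routine: given the hover interface's display direction and the video stream's dimensions, one dimension is pinned to a default and the other is derived from the stream's aspect ratio. The default values and the use of the stream's own aspect ratio as the "height-width ratio" are assumptions for illustration.

```java
// Sketch of the window-size rules: landscape display with a landscape
// stream pins the height; a portrait stream pins the width; the rules
// invert for a portrait display. Defaults are illustrative.
public class HoverWindowSize {
    public static final int DEFAULT_HEIGHT = 360; // assumed default height (px)
    public static final int DEFAULT_WIDTH = 360;  // assumed default width (px)

    /** Returns {width, height} for the hover interface. */
    public static int[] compute(boolean landscapeDisplay, int videoW, int videoH) {
        if (landscapeDisplay) {
            if (videoW >= videoH) { // landscape stream: pin the height
                return new int[]{DEFAULT_HEIGHT * videoW / videoH, DEFAULT_HEIGHT};
            }
            // portrait stream: pin the width
            return new int[]{DEFAULT_WIDTH, DEFAULT_WIDTH * videoH / videoW};
        }
        if (videoW >= videoH) { // portrait display, landscape stream: pin the width
            return new int[]{DEFAULT_WIDTH, DEFAULT_WIDTH * videoH / videoW};
        }
        // portrait display, portrait stream: pin the height
        return new int[]{DEFAULT_HEIGHT * videoW / videoH, DEFAULT_HEIGHT};
    }
}
```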
If the display screen is a horizontal screen, that is, the current video stream is in landscape, the display area for the horizontal screen is planned, the default layout size corresponding to that display area is acquired, the default width is taken as a fixed width, the video stream is scaled according to the fixed width, and the size of the floating window is set dynamically through setLayoutParams. If the display screen is a vertical screen, that is, the current video stream is in portrait, the display area for the vertical screen is planned, the default layout size corresponding to that display area is acquired, the default height is taken as a fixed height, the video stream is scaled according to the fixed height, and the size of the floating window is set dynamically through setLayoutParams.
The window size of the floating window is calculated as follows:
If VideoWidth ≥ VideoHeight, the video stream is played in landscape; the width of the parent layout is a fixed width, denoted as X, and the height of the parent layout is X × VideoHeight / VideoWidth.
If VideoWidth < VideoHeight, the height of the parent layout is a fixed height, denoted as Y, and the width of the parent layout is Y × VideoWidth / VideoHeight. The window width and window height of the floating window are the width and height of the parent layout.
In some embodiments, if the second application is a screen-casting application, the hover interface displays the cast picture sent by the casting terminal, and the window size of the floating window needs to be adjusted when the casting terminal switches between horizontal and vertical screens or switches via the remote-control menu key.
The window size of the floating window may also be calculated as follows:
the width and height of the parent layout, i.e., the width and height of the current floating window, are obtained and respectively noted as ViewWidth, viewHeight.
The width and height of the current video stream are acquired and denoted as VideoWidth and VideoHeight, respectively.
If ViewWidth ≥ ViewHeight, the width and height of the video stream are judged. Whether VideoWidth ≥ VideoHeight or VideoWidth < VideoHeight, the height of the parent layout is a fixed height, denoted as Y, and the width of the parent layout is calculated as Y × VideoWidth / VideoHeight. The window width and window height of the floating window are the width and height of the parent layout.
If ViewWidth < ViewHeight, the width and height of the video stream are likewise judged. Whether VideoWidth ≥ VideoHeight or VideoWidth < VideoHeight, the width of the parent layout is a fixed width, denoted as X, and the height of the parent layout is calculated as X × VideoHeight / VideoWidth. The window width and window height of the floating window are the width and height of the parent layout.
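The second sizing rule above, used when the cast source switches orientation, can be sketched as follows: the floating window's current orientation decides which dimension stays fixed, and the other is rescaled from the video stream's aspect ratio. The fixed values passed in for X and Y are illustrative assumptions.

```java
// Sketch of the orientation-aware resize: a landscape floating window
// pins its height to Y and derives the width from the stream's aspect
// ratio; a portrait floating window pins its width to X and derives
// the height.
public class CastWindowResize {
    /** Returns {width, height}; fixedW corresponds to X, fixedH to Y. */
    public static int[] resize(int viewWidth, int viewHeight,
                               int videoWidth, int videoHeight,
                               int fixedW, int fixedH) {
        if (viewWidth >= viewHeight) {
            // Landscape floating window: pin the height, scale the width.
            return new int[]{fixedH * videoWidth / videoHeight, fixedH};
        }
        // Portrait floating window: pin the width, scale the height.
        return new int[]{fixedW, fixedW * videoHeight / videoWidth};
    }
}
```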
If the user wants to actively adjust the position of the hover interface, the hover interface may be moved through the direction keys of the control device 100 or through the touch function, for example with a finger or a stylus. In some embodiments, when the user intends to move the hover interface, as shown in fig. 10, taking a manual touch by the user as an example, the controller 250 may monitor a movement event on the screen of the display 260 and, in response to the movement event, acquire a movement initial point and a movement end point of the movement event. The movement initial point is the center point of the hover interface or any point within the display area of the hover interface, and the movement end point is the point at which the touch pressure disappears after the hover interface moves. In fig. 10, the dotted-line portion is the hover interface before movement, and the solid-line portion is the hover interface after movement.
After acquiring the movement initial point and the movement end point, the controller 250 may generate movement parameters from them, where the movement parameters include the movement direction of the hover interface and the displacement value of the hover interface in that direction. The movement direction is the vector direction from the movement initial point to the movement end point, and the displacement value is the straight-line distance between the two points. The controller 250 may then control the display 260 to display the moved hover interface according to the movement direction and the displacement value.
The controller 250 also needs to determine whether the moved hover interface would extend past the display edge of the display 260, which would affect the integrity of the hover interface display. In some embodiments, the controller 250 parses the hover interface to obtain its boundary points, and then calculates the distance between a boundary point of the hover interface and the edge of the display area in the movement direction. If the distance is greater than or equal to the displacement value, the hover interface will not touch the edge of the display area after moving and will remain complete, so the movement can be performed directly: the controller 250 controls the display 260 to display the moved hover interface according to the movement direction and the displacement value.
If the distance is smaller than the displacement value, the hover interface would cross the boundary of the display area after moving, making its frame incomplete. In this case, the controller 250 takes the edge of the display area as the boundary, reduces the movement distance of the hover interface, and controls the display 260 to display the moved hover interface at the edge of the display area, so that the moved hover interface is displayed completely.
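The edge check above can be sketched in one step: the requested displacement is applied and the result is clamped so that no edge of the hover interface leaves the display area. Clamping per axis on the window's top-left corner is a simplification of the boundary-point-distance test described in the text; the coordinate convention is an assumption.

```java
// Sketch of the clamped move: apply (dx, dy) to the window's top-left
// corner, then clamp each axis so the whole window stays inside the
// screen; a too-large displacement lands the window on the edge.
public class HoverMove {
    /** Returns the clamped {x, y} of the window's top-left corner. */
    public static int[] move(int x, int y, int dx, int dy,
                             int winW, int winH, int screenW, int screenH) {
        int nx = Math.max(0, Math.min(x + dx, screenW - winW));
        int ny = Math.max(0, Math.min(y + dy, screenH - winH));
        return new int[]{nx, ny};
    }
}
```

The same clamp also covers the direction-key case below, where the product of clicks and per-click distance may overshoot the edge.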
In some embodiments, the user may also move the hover interface through the keys of the control device 100: move the focus onto the hover interface through the direction keys of the control device 100, then click a direction key to control the direction and distance of the movement. The controller 250 may preset a fixed movement distance for the direction keys; for example, clicking the "right" direction key once moves the hover interface X px or dp to the right, where px (pixel) is the basic unit for displaying an image and dp is the device-independent pixel unit of length. Alternatively, long-pressing the "right" direction key moves the hover interface continuously to the right, and so on. The controller 250 may acquire the number of times the user clicks a direction key to calculate the movement distance of the hover interface. If the product of the number of clicks and the per-click movement distance would cause a boundary point of the hover interface to exceed the edge of the display area, the controller 250 displays the hover interface at the edge of the display area.
In some embodiments, if the user wants to actively adjust the size of the hover interface, the control device 100 or the touch function may be used, for example two-finger, three-finger, or multi-finger zooming. The controller 250 may monitor a zoom event sent by the user to the hover interface and, in response, acquire the movement distance of at least one zoom point on the hover interface, where the number of zoom points may be determined by the number of simultaneous touch points on the screen of the display 260. For example, if the user sends the zoom event with two fingers, there are two zoom points. The controller 250 may determine whether the event is a zoom-in event or a zoom-out event according to the movement direction of the zoom points.
As shown in fig. 11, when the user drags the hover interface with two fingers and the zoom points move toward the edge of the display area, the controller 250 determines that the event is a zoom-in event and displays the enlarged hover interface. As shown in fig. 12, when the user drags the hover interface with two fingers and the zoom points move toward the center of the hover interface, the controller 250 determines that the event is a zoom-out event and displays the reduced hover interface. Meanwhile, the controller 250 acquires the movement distance of the zoom points and the screen size of the display 260, calculates the scaling ratio of the hover interface from the movement distance and the screen size, and controls the display 260 to display the scaled hover interface according to the scaling ratio. It should be noted that the zoom event and the touch effect act only within the hover interface.
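One way to turn the zoom point's travel and the screen size into a scaling ratio, as described above, is a linear mapping. The text does not give the formula, so the mapping below, the sign convention, and the clamp range are all assumptions for illustration.

```java
// Sketch of a scaling-ratio computation: the zoom point's travel,
// relative to the screen width, gives a scale factor; travel toward
// the display edge (positive) enlarges, travel toward the window
// center (negative) shrinks. Linear mapping and clamp are assumed.
public class HoverZoom {
    /** moveDistance > 0 means the zoom points moved apart (zoom in). */
    public static double scale(double moveDistance, double screenWidth) {
        double s = 1.0 + moveDistance / screenWidth;
        // Clamp to a sensible range so the window neither vanishes
        // nor covers the whole screen.
        return Math.max(0.5, Math.min(s, 2.0));
    }
}
```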
In some embodiments, the user may also zoom the hover interface through the control device 100: first move the focus onto the hover interface through the direction keys of the control device 100, then directly click the "zoom in" or "zoom out" key to zoom the hover interface by a preset scaling ratio. The user may also manually input a scaling ratio through the control device to better achieve the desired zoom effect.
Some embodiments of the present application further provide a dual-application on-screen display method, where the method includes:
s100, responding to a control instruction displayed by the double applications on the same screen, and acquiring the starting state of the first application and the starting state of the second application.
The second application is an upper-layer application establishing interaction relation with the first application; the activated state is either a ready state or a not ready state.
And S200, if the starting state of the first application and the starting state of the second application are ready, controlling the display to display the user interface of the first application, and displaying a suspension interface on the upper layer of the user interface of the first application.
The suspension interface is used for displaying a user interface corresponding to the second application.
And S300, if the starting state of the first application is not ready, and the starting state of the second application is ready, controlling the display to display a user interface corresponding to the second application.
According to the above scheme, the present application provides a display device and a dual-application same-screen display method. The method can acquire the start state of the first application and the start state of the second application in response to a control instruction for dual-application same-screen display, where the second application is an upper-layer application that establishes an interaction relationship with the first application. If the start state of the first application and the start state of the second application are both ready, the display is controlled to display the user interface of the first application, and a hover interface for displaying the user interface corresponding to the second application is displayed on the upper layer of the first application's interface. If the start state of the first application is not ready and the start state of the second application is ready, the display is controlled to display the user interface corresponding to the second application. Through the method provided by the present application, two application interfaces that can interact with each other can be displayed on the same display device at the same time, realizing dual-application same-screen interactive display, improving the screen utilization of the display, and improving the user experience.
It will be apparent to those skilled in the art that the techniques of the embodiments of the present invention may be implemented by software plus a necessary general-purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product, which may be stored in a computer-readable storage medium.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents, and such modifications and substitutions do not depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the present disclosure and to enable others skilled in the art to best utilize the embodiments.

Claims (10)

1. A display device, characterized by comprising:
a display configured to display a user interface;
a controller configured to:
responding to a control instruction displayed by the double applications on the same screen, and acquiring the starting state of the first application and the starting state of the second application; the second application is an upper-layer application establishing interaction relation with the first application; the starting state is a ready state or a non-ready state;
if the starting state of the first application and the starting state of the second application are ready, controlling the display to display a user interface of the first application, and displaying a suspension interface on the upper layer of the user interface of the first application, wherein the suspension interface is used for displaying a user interface corresponding to the second application;
and if the starting state of the first application is not ready, and the starting state of the second application is ready, controlling the display to display a user interface corresponding to the second application.
2. The display device of claim 1, wherein the controller is further configured to:
and if the starting state of the first application is switched from the ready state to the non-ready state, the suspension interface is withdrawn, and a display is controlled to display the user interface of the second application.
3. The display device of claim 1, wherein the controller is further configured to:
responding to a switching instruction input by a user, and acquiring a starting state of a third application; the switching instruction comprises a target switching interface designated by a user; the target switching interface is a user interface of a first application or the suspension interface;
and if the starting state of the third application is the ready state, replacing the user interface displayed by the target switching interface with the user interface of the third application.
4. The display device of claim 3, wherein the controller is further configured to:
traversing an application list of the display device;
screening applications supporting a suspension function from the application list to serve as the second application and the third application;
binding the client service in the first application and the second application, and binding the client service in the first application and the third application to establish an interaction relationship between the first application and the third application and an interaction relationship between the second application and the third application.
5. The display device of claim 1, wherein, before displaying the suspension interface at an upper layer of the user interface of the first application, the controller is further configured to:
reading a display direction of the suspension interface; the display direction is landscape display or portrait display;
if the display direction is landscape display, acquiring a first layout size; the first layout size comprises first layout coordinates and a first window size; the first window size comprises a first window width and a first window height;
setting a display position of the suspension interface according to the first layout coordinates, and adjusting a display area of the suspension interface according to the first window size;
if the display direction is portrait display, acquiring a second layout size; the second layout size comprises second layout coordinates and a second window size; the second window size comprises a second window width and a second window height;
and setting the display position of the suspension interface according to the second layout coordinates, and adjusting the display area of the suspension interface according to the second window size.
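Claim 5 selects one of two stored layouts by orientation. The sketch below assumes each layout is a `(x, y, width, height)` tuple; that packing, and the function name, are illustrative choices rather than the patent's data format.

```python
def layout_floating_window(orientation, landscape, portrait):
    """Pick the floating window's layout coordinates and window size.

    `landscape` and `portrait` are (x, y, width, height) tuples read from
    stored configuration, corresponding to claim 5's first and second
    layout sizes."""
    x, y, width, height = landscape if orientation == "landscape" else portrait
    return {"position": (x, y), "size": (width, height)}
```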
6. The display device of claim 5, wherein if the display orientation is a landscape display, the controller is further configured to:
acquiring a video stream to be played by the second application;
analyzing the video stream to obtain a video stream width and a video stream height;
if the video stream width is greater than or equal to the video stream height, determining the first window height as a default height, acquiring an aspect ratio of the display direction, and calculating the first window width according to the default height and the aspect ratio;
and if the video stream width is smaller than the video stream height, determining the first window width as a default width, acquiring the aspect ratio of the display direction, and calculating the first window height according to the default width and the aspect ratio.
7. The display device of claim 6, wherein if the display orientation is a portrait display, the controller is further configured to:
if the video stream width is greater than or equal to the video stream height, determining the second window width as a default width, and calculating the second window height according to the default width and the aspect ratio;
and if the video stream width is smaller than the video stream height, determining the second window height as a default height, and calculating the second window width according to the default height and the aspect ratio.
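Claims 6 and 7 pin one window dimension to a default and derive the other from the aspect ratio, so the video is never distorted. The sketch below assumes the ratio used is the video stream's own width-to-height ratio and picks arbitrary default sizes; the claims only say the ratio is "acquired", so both are labeled assumptions.

```python
def size_floating_window(mode, video_w, video_h,
                         default_w=480, default_h=270):
    """Size the floating window for a video while preserving its aspect
    ratio: pin one dimension to a default and compute the other.

    mode is the display direction ('landscape' or 'portrait');
    default_w/default_h are illustrative defaults. Returns (width, height).
    """
    ratio = video_w / video_h  # width-to-height ratio of the stream (assumed)
    if mode == "landscape":
        if video_w >= video_h:                       # landscape video: pin the height
            return round(default_h * ratio), default_h
        return default_w, round(default_w / ratio)   # portrait video: pin the width
    # portrait display mode (claim 7), with the dimensions pinned the other way
    if video_w >= video_h:                           # landscape video: pin the width
        return default_w, round(default_w / ratio)
    return round(default_h * ratio), default_h       # portrait video: pin the height
```

For a 1920x1080 stream in landscape mode this yields a 480x270 window; a 1080x1920 stream instead pins the width and grows the height.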
8. The display device of claim 1, wherein the controller is further configured to:
responding to a movement event of the suspension interface, and acquiring a movement start point and a movement end point in the movement event;
generating a movement parameter according to the movement start point and the movement end point, wherein the movement parameter comprises a movement direction of the suspension interface and a displacement value of the suspension interface in the movement direction;
calculating a distance between a boundary point of the suspension interface and an edge of the display area in the movement direction;
if the distance is greater than or equal to the displacement value, controlling the display to display the suspension interface after it moves according to the movement direction and the displacement value;
and if the distance is smaller than the displacement value, controlling the display to display the suspension interface at the edge of the display area after it moves.
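Claim 8's edge check is a clamp: the window moves by the drag vector unless that would carry it past the display boundary, in which case it stops at the edge. A sketch that handles the movement per axis rather than as a single direction-plus-displacement pair; coordinates are the window's top-left corner, and all names are illustrative.

```python
def move_floating_window(x, y, start, end, area_w, area_h, win_w, win_h):
    """Move the floating window by the drag vector (end - start), clamping
    each coordinate so the window never leaves the display area."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    new_x = min(max(x + dx, 0), area_w - win_w)
    new_y = min(max(y + dy, 0), area_h - win_h)
    return new_x, new_y
```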
9. The display device of claim 1, wherein the controller is further configured to:
responding to a scaling event of the suspension interface, and acquiring a movement distance of at least one scaling point on the suspension interface;
acquiring the screen size of the display;
calculating a scaling ratio of the suspension interface according to the movement distance and the screen size;
and controlling the display to display the scaled suspension interface according to the scaling ratio.
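Claim 9 derives a scaling ratio from how far a scaling point was dragged, normalised by the screen size. The linear mapping and the clamping bounds below are illustrative assumptions; the claim does not specify the formula.

```python
def scale_floating_window(win_w, win_h, drag_dist, screen_w,
                          min_scale=0.5, max_scale=2.0):
    """Scale the floating window from a drag distance: the ratio grows
    linearly with the drag relative to the screen width, clamped to
    sensible bounds so the window stays usable. Returns (width, height)."""
    scale = 1.0 + drag_dist / screen_w
    scale = min(max(scale, min_scale), max_scale)
    return round(win_w * scale), round(win_h * scale)
```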
10. A dual-application same-screen display method, applied to a display device comprising a display and a controller, the method comprising:
responding to a control instruction for dual-application same-screen display, and acquiring a starting state of a first application and a starting state of a second application; the second application is an upper-layer application that establishes an interaction relationship with the first application; the starting state is a ready state or a non-ready state;
if the starting state of the first application and the starting state of the second application are both the ready state, controlling the display to display a user interface of the first application, and displaying a suspension interface at an upper layer of the user interface of the first application, wherein the suspension interface is used for displaying a user interface corresponding to the second application;
and if the starting state of the first application is the non-ready state and the starting state of the second application is the ready state, controlling the display to display the user interface corresponding to the second application.
CN202310133094.9A 2023-02-17 2023-02-17 Display equipment and double-application same-screen display method Pending CN116347143A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310133094.9A CN116347143A (en) 2023-02-17 2023-02-17 Display equipment and double-application same-screen display method

Publications (1)

Publication Number Publication Date
CN116347143A true CN116347143A (en) 2023-06-27

Family ID=86888514

Country Status (1)

Country Link
CN (1) CN116347143A (en)
Similar Documents

Publication Publication Date Title
WO2020248640A1 (en) Display device
CN112073798B (en) Data transmission method and equipment
CN113645494B (en) Screen fusion method, display device, terminal device and server
CN112399233A (en) Display device and position self-adaptive adjusting method of video chat window
CN114430492B (en) Display device, mobile terminal and picture synchronous scaling method
CN113593488A (en) Backlight adjusting method and display device
CN115243082A (en) Display device and terminal control method
CN112947783B (en) Display device
WO2022193475A1 (en) Display device, method for receiving screen projection content, and screen projection method
CN111259639B (en) Self-adaptive adjustment method of table and display equipment
CN112650418B (en) Display device
CN116801027A (en) Display device and screen projection method
CN112235621B (en) Display method and display equipment for visual area
CN115550717A (en) Display device and multi-finger touch display method
CN116347143A (en) Display equipment and double-application same-screen display method
CN112732120A (en) Display device
CN114417035A (en) Picture browsing method and display device
CN111949179A (en) Control amplifying method and display device
CN113825001B (en) Panoramic picture browsing method and display device
CN117608426A (en) Display equipment and multi-application same-screen display method
WO2024139950A1 (en) Display device and processing method for display device
WO2024139245A1 (en) Display device and split-screen display method
CN111310424B (en) Form generation method and display device
WO2021248671A1 (en) Display device
CN117093293A (en) Display equipment and multi-window adjusting method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination