CN112783398A - Display control and interaction control method, device, system and storage medium - Google Patents

Display control and interaction control method, device, system and storage medium

Info

Publication number
CN112783398A
CN112783398A (application CN201911090098.3A)
Authority
CN
China
Prior art keywords: screen, interactive, data, control, interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911090098.3A
Other languages
Chinese (zh)
Inventor
罗锦妮
石莹倩
杨璨榕
吴慧贞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201911090098.3A
Publication of CN112783398A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Abstract

The embodiments of the present application provide a display control and interaction control method, device, system, and storage medium. In the embodiments, an interactive interface can be generated for a first screen and displayed on a second screen. The interactive interface includes an interactive control, so the user can interactively control the first screen from the second screen through that control, which makes interaction with the first screen convenient and helps improve interaction efficiency. Furthermore, the interactive interface may also include summary information generated from the dynamic data to be displayed on the first screen. Based on the summary information, the user can quickly read the dynamic information on the first screen without tracking the changes of the dynamic data there, which facilitates interactive control of the first screen and further improves interaction efficiency.

Description

Display control and interaction control method, device, system and storage medium
Technical Field
The present application relates to the field of visualization technologies, and in particular, to a method, an apparatus, a system, and a storage medium for display control and interaction control.
Background
Data visualization is a way to convey and communicate information clearly and efficiently by graphical means. With the advent of large-screen display technology, the display effect of data visualization keeps improving: pictures are larger, colors are richer, resolutions are higher, and so on.
While providing a high-quality display effect for data visualization, large-screen display technology also creates difficulties for users. In some application scenarios, a user needs to interact with and explain data on a touch-enabled large screen; because the screen is so large, the user has to move around frequently in front of it, which makes interaction inconvenient and inefficient.
Disclosure of Invention
Aspects of the present application provide a display control and interaction control method, device, system, and storage medium, so that a user can interactively control one screen through another screen, improving the convenience and efficiency of interaction between the user and the controlled screen.
An embodiment of the present application provides a display control method, including: acquiring data to be displayed on a first screen, where the data includes dynamic data; generating summary information in an interactive interface according to the dynamic data, where the interactive interface further includes an interactive control; and displaying the interactive interface on a second screen, so that the summary information guides the user to interactively control the first screen through the interactive control.
An embodiment of the present application further provides an interaction control method, including: displaying an interactive interface on a second screen, where the interactive interface includes summary information and an interactive control; and, in response to an interactive operation issued through the interactive control, performing a corresponding operation on the data interface on the first screen. The summary information is generated according to the dynamic data displayed in the data interface and guides the user in issuing interactive operations on the interactive control.
An embodiment of the present application further provides a display control method, including: acquiring data displayed by a first screen; generating an interactive control in an interactive interface according to the data; and displaying the interactive interface on a second screen so that a user can interactively control the first screen through the interactive control.
An embodiment of the present application further provides a display control system, including: a display control device, a first screen, and a second screen. The display control device is configured to acquire data to be displayed on the first screen and visually display the data on the first screen, where the data includes dynamic data; generate summary information in an interactive interface according to the dynamic data, where the interactive interface further includes an interactive control; and display the interactive interface on the second screen, so that the summary information guides the user to interactively control the first screen through the interactive control.
An embodiment of the present application further provides a display control system, including: a display control device, a first screen, and a second screen; the display control device is used for acquiring data displayed by the first screen; generating an interactive control in an interactive interface according to the data; and displaying the interactive interface on the second screen so that a user can interactively control the first screen through the interactive control.
An embodiment of the present application further provides a display control apparatus, including: a memory and a processor. The memory is used to store a computer program; the processor, coupled with the memory, executes the computer program to: acquire data to be displayed on a first screen, where the data includes dynamic data; generate summary information in an interactive interface according to the dynamic data, where the interactive interface further includes an interactive control; and display the interactive interface on a second screen, so that the summary information guides the user to interactively control the first screen through the interactive control.
An embodiment of the present application further provides an interactive control device, including: a memory and a processor. The memory is used to store a computer program; the processor, coupled with the memory, executes the computer program to: display an interactive interface on a second screen, where the interactive interface includes summary information and an interactive control; and, in response to an interactive operation issued through the interactive control, perform a corresponding operation on the data interface on the first screen. The summary information is generated according to the dynamic data displayed in the data interface and guides the user in issuing interactive operations on the interactive control.
An embodiment of the present application further provides a display control apparatus, including: a memory and a processor. The memory is used to store a computer program; the processor, coupled with the memory, executes the computer program to: acquire data displayed by a first screen; generate an interactive control in an interactive interface according to the data; and display the interactive interface on a second screen so that a user can interactively control the first screen through the interactive control.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program, which, when executed by a processor, causes the processor to implement the steps in the display control method or the interaction control method provided in the embodiments of the present application.
In the embodiments of the present application, an interactive interface can be generated for the first screen and displayed on the second screen. Since the interactive interface includes an interactive control, the user can interactively control the first screen from the second screen through that control, which makes interaction with the first screen convenient and helps improve interaction efficiency.
Furthermore, in the embodiments of the present application, the interactive interface may further include summary information generated according to the dynamic data to be displayed on the first screen. Based on this summary information, the user can quickly read the dynamic information on the first screen without tracking the changes of the dynamic data there, which facilitates fast interactive control of the first screen and further improves interaction efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic diagram illustrating the structure and operation principle of a display control system according to an exemplary embodiment of the present application;
FIG. 2a is a schematic diagram illustrating another operational principle of a display control system according to an exemplary embodiment of the present application;
fig. 2b is a schematic view of an application scenario of controlling a large screen by a small screen according to an exemplary embodiment of the present application;
FIG. 2c is a schematic diagram of an interface relationship between a large screen and a small screen for displaying relational data according to an exemplary embodiment of the present application;
FIG. 2d is a schematic diagram of an interface relationship between a large screen and a small screen for displaying spatial data according to an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of another display control system provided in an exemplary embodiment of the present application;
fig. 4 is a flowchart illustrating a display control method according to an exemplary embodiment of the present disclosure;
fig. 5 is a flowchart illustrating an interaction control method according to an exemplary embodiment of the present application;
FIG. 6 is a schematic flow chart diagram illustrating another display control method according to an exemplary embodiment of the present disclosure;
fig. 7a is a schematic structural diagram of a display control apparatus according to an exemplary embodiment of the present application;
FIG. 7b is a schematic structural diagram of another display control device provided in an exemplary embodiment of the present application;
fig. 8 is a schematic structural diagram of an interactive control device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Current large-screen display scenarios suffer from low interaction efficiency between the user and the large screen. In the embodiments of the present application, an interactive interface is generated for the first screen and displayed on a second screen. The interactive interface includes an interactive control, so the user can interactively control the first screen from the second screen through that control, which makes interaction convenient and helps improve interaction efficiency.
Furthermore, in the embodiments of the present application, the interactive interface may further include summary information generated according to the dynamic data to be displayed on the first screen. Based on this summary information, the user can quickly read the dynamic information on the first screen without tracking the changes of the dynamic data there, which facilitates fast interactive control of the first screen and further improves interaction efficiency.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of a display control system according to an exemplary embodiment of the present application. As shown in fig. 1, the display control system includes: a display control device 11, a first screen 12, and a second screen 13. The display control device 11 communicates with the first screen 12 and the second screen 13 and can control both to display data. Further, the display control device 11 may control the second screen 13 to display an interactive interface through which the first screen 12 can be interactively controlled.
The connection between the display control device 11 and the first screen 12 or the second screen 13 may be wireless or wired. Optionally, the display control device 11 may communicate with the first screen 12 or the second screen 13 through a mobile network; accordingly, the network format of the mobile network may be any of 2G (GSM), 2.5G (GPRS), 3G (WCDMA, TD-SCDMA, CDMA2000, UMTS), 4G (LTE), 4G+ (LTE+), 5G, WiMax, or a new network format that appears in the future. Alternatively, the display control device 11 may communicate with the first screen 12 or the second screen 13 through Bluetooth, WiFi, infrared, ZigBee, NFC, or the like.
In the present embodiment, the product form of the display control device 11 is not limited; any computer device with sufficient computing, storage, and communication capability can serve as the display control device 11. For example, the display control device 11 may be a terminal device such as a notebook computer, desktop computer, smart phone, or IoT device; an edge computing device such as a smart street lamp, camera, or traffic monitoring device; a server device such as a conventional server, cloud server, server array, or data center; an ARM chip; or a processing chip or module implemented on an FPGA or CPLD. In fig. 1, the display control device 11 is illustrated as a server, but is not limited thereto.
In the present embodiment, the product form of the first screen 12 and the second screen 13 is not limited; any device with a display function may serve as the first screen 12 or the second screen 13. For example, the first screen 12 or the second screen 13 may be a CRT display screen or an LCD display screen. Further, if the first screen 12 or the second screen 13 is an LCD display screen that also includes a Touch Panel (TP), it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the panel. A touch sensor may not only sense the boundary of a touch or slide action but also detect signals such as the duration and pressure of the operation.
In the present embodiment, the sizes of the first screen 12 and the second screen 13 are not limited; each may be, for example, a larger screen such as 49, 55, 60, 72, 100, or 120 inches, or a smaller screen such as 3.5, 4.7, 5.5, 7.9, 9.7, or 12.9 inches. The first screen 12 and the second screen 13 may have the same or different specifications and sizes.
In an alternative embodiment, the first screen 12 is larger than the second screen 13, as shown in fig. 1. For example, the first screen 12 may be 49, 55, 60, 72, 100, or 120 inches, and the second screen 13 may be 3.5, 4.7, 5.5, 7.9, 9.7, or 12.9 inches. When the first screen 12 is larger than the second screen 13, the user can interactively control the large screen through the small screen, which alleviates, to a certain extent, the inconvenience caused by the large size of the first screen 12.
In another alternative embodiment, the first screen 12 is smaller than the second screen 13. For example, the first screen 12 may be 3.5, 4.7, 5.5, 7.9, 9.7, or 12.9 inches, and the second screen 13 may be 49, 55, 60, 72, 100, or 120 inches. When the first screen 12 is smaller than the second screen 13, the user can control the small screen through the larger one, which avoids, to a certain extent, the inconvenient operation and low resolution caused by the small size of the first screen 12. This is suitable for dim environments or for users with limited eyesight, such as the elderly or children.
In this embodiment, the first screen 12 needs to display data. This embodiment does not limit the way the first screen 12 displays data; data may be displayed in the form of graphics, diagrams, and the like. The display effect may be static or dynamic, and in either case the user can interact with it. In some application scenarios, the data that the first screen 12 needs to display includes dynamic data. Dynamic data refers to data whose change frequency is greater than a set frequency threshold, for example, real-time data generated while some online applications, cloud applications, physical devices, virtual machines, or instances are running.
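The definition above (dynamic data is data whose change frequency exceeds a set threshold) can be sketched as a simple classifier. The field names and the one-update-per-minute threshold below are illustrative assumptions, not values given in the patent:

```python
from dataclasses import dataclass

@dataclass
class DataSeries:
    name: str
    updates_per_minute: float  # observed change frequency of the series

# Hypothetical threshold: series updating more often than this count as "dynamic".
DYNAMIC_THRESHOLD = 1.0

def is_dynamic(series: DataSeries, threshold: float = DYNAMIC_THRESHOLD) -> bool:
    """A series is dynamic when its change frequency exceeds the threshold."""
    return series.updates_per_minute > threshold

def split_display_data(series_list):
    """Partition the data a screen must display into dynamic data and other data."""
    dynamic = [s for s in series_list if is_dynamic(s)]
    other = [s for s in series_list if not is_dynamic(s)]
    return dynamic, other
```

Under this sketch, a real-time sales feed would land in the dynamic partition while static historical figures would not.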
In the present embodiment, the user is allowed to interactively control the first screen 12. The interactive control here mainly refers to an interactive control initiated with respect to the content displayed on the first screen 12. For example, the user can control the first screen 12 to switch the displayed data content, for example, to switch the displayed sales data for each commodity in 2010 to the sales data for each commodity in 2011. As another example, the user may also control the first screen 12 to alter the display effect, such as highlighting a portion of the data or a portion of the interface element. Of course, the user may also initiate various interactive controls with respect to the first screen 12 itself, such as controlling the first screen 12 to be turned off, turned on, and adjusting the brightness, resolution, etc. of the first screen 12. For a description of the user initiating the interactive control with respect to the first screen 12 itself, reference may be made to the following embodiments, which are not repeated herein.
In practical applications, the user may not be able to directly perform interactive control on the first screen 12, or it may be inconvenient to perform interactive control on the first screen 12. For example, the first screen 12 does not support touch operation or voice control, and the user cannot perform interactive control on the first screen 12. For another example, the first screen 12 is installed at a high position, and it is inconvenient for the user to interactively control the first screen 12. For another example, the size of the first screen 12 is large, so that the user needs to frequently walk in front of the first screen 12 to perform interactive control with the first screen 12, and the operation is inconvenient.
In order to facilitate interactive control of the first screen 12, in the present embodiment the display control device 11 may acquire the data that the first screen 12 needs to display. As shown in fig. 1, the display control device 11 may obtain this data from a data source, which may include at least one of a database, a cloud application, an online application, a local application, and the like; the specific data source depends on the application scenario. Alternatively, the display control device 11 may acquire the data directly from the first screen 12. The data to be displayed on the first screen 12 includes dynamic data and other data besides the dynamic data; the summary information in the interactive interface is generated according to the dynamic data, and in addition to the summary information, the interactive interface further includes an interactive control, as shown in fig. 1. Optionally, the interactive interface may include an information area and an interaction control area, with the summary information in the former and the interactive control in the latter, but it is not limited to this layout; as shown in fig. 1, the interactive interface may also be non-partitioned.
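As a rough sketch of how such an interface might be assembled, the following builds the optional two-area layout described above. The function name, the dict representation, and the summary format are all illustrative assumptions:

```python
def build_interactive_interface(dynamic_data: dict, controls: list) -> dict:
    """Assemble an interactive interface: summary info derived from the
    dynamic data goes into an information area, and interactive controls
    go into an interaction control area. The dict layout is illustrative."""
    summary = [f"{name}: {value}" for name, value in dynamic_data.items()]
    return {
        "information_area": summary,           # summary info from dynamic data
        "interaction_control_area": controls,  # e.g. buttons, menus, scroll bars
    }
```

A non-partitioned interface, also allowed by the embodiment, would simply merge the two areas into one.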
The interactive control is used for a user to issue an interactive instruction to the first screen, and may be in any control form, such as a button, an icon list, a text label, an input box, a menu, a picture box, a scroll bar, and the like. In addition, the number of the interactive controls can be one or more.
The summary information may reflect the change of the dynamic data, and may guide the user to quickly interpret the dynamic data displayed on the first screen 12. The number of the summary information may be one or more. The type of summary information may be text type, chart type, animation type or code type.
In this embodiment, the style and layout of the interactive interface are not limited; any interface style and layout that includes the summary information and the interactive controls is applicable. It should be noted that, in addition to the summary information and the interactive control, the interactive interface may include other content, for example, a visualization of the data displayed on the first screen 12.
After obtaining the interactive interface, the display control device 11 may display it on the second screen 13. The user may then interact with the second screen 13, that is, operate on the interactive interface it displays, for example by initiating an operation on an interactive control. The operation may be a click, double click, touch, long press, or mouse hover, depending on the human-computer interaction modes supported by the second screen 13. Through the interactive control, the user can send an interactive instruction to the first screen 12 and thereby interact with it. This solves the problem that the user cannot, or finds it inconvenient to, directly control the first screen 12, facilitates interactive control of the first screen 12, and improves interaction efficiency. Furthermore, by looking at the interactive interface on the second screen 13, the user can quickly learn from the summary information how the dynamic data on the first screen 12 is changing, quickly decide what kind of interactive control is needed, and then issue the corresponding instruction through the interactive control, further improving the efficiency of interaction with the first screen 12.
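The flow just described (an operation on a second-screen control is translated into an instruction delivered to the first screen) might be sketched as a minimal dispatcher. The class and method names here are assumptions for illustration, not part of the patent:

```python
class FirstScreen:
    """Stand-in for the controlled screen; records the instructions it receives."""
    def __init__(self):
        self.received = []

    def apply(self, instruction: dict):
        self.received.append(instruction)

class InteractiveControl:
    """A control on the second screen's interactive interface. An operation on
    it (click, touch, long press, ...) is translated into an instruction that
    the display control device forwards to the first screen."""
    def __init__(self, name: str, instruction: str, target: FirstScreen):
        self.name = name
        self.instruction = instruction
        self.target = target

    def on_operation(self, operation: str):
        # operation could be "click", "double_click", "touch", "long_press", ...
        self.target.apply({"control": self.name,
                           "operation": operation,
                           "instruction": self.instruction})
```

For example, a "switch year" button on the second screen could carry the instruction to show a different year's sales data on the first screen.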
In the embodiments of the present application, the information form and content of the summary information are not limited, and all information that can reflect the change condition of the dynamic data presented on the first screen 12 can be used as the summary information in the embodiments of the present application.
In an alternative embodiment, the dynamic data that needs to be displayed on the first screen 12 can be directly used as summary information, which means that the dynamic data presented on the first screen 12 will be synchronously displayed on the second screen 13. It is possible for the user to directly view the dynamic data on the second screen 13 without tracking the dynamic data on the first screen 12. For example, in the case that the first screen 12 is a huge screen or a larger screen, the second screen 13 may be a screen on a terminal device used by the user, such as a screen of a tablet computer, a screen of a smart phone, or a screen of an office computer, and compared with tracking dynamic data on a huge screen or a larger screen, it is more convenient and more efficient for the user to view the dynamic data on the screen of the terminal device, and thus the interaction efficiency with the huge screen or the larger screen can be improved.
In another alternative embodiment, the change trend of the dynamic data may be generated as the summary information, meaning that the dynamic data itself is displayed on the first screen 12 while its trend is displayed on the second screen 13. The trend may be represented by a graph such as a trend line or histogram, by a numerical difference reflecting the trend, or by text describing it. For example, suppose the dynamic data is the sales volume of women's clothing, which changes in real time during a promotion and is increasing; the text-type summary may read "women's clothing sales are increasing, up 30% in 1 hour", but is not limited thereto. In fig. 1, the trend of the dynamic data is represented by a trend line, which is only an example.
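A text-type trend summary like the women's-clothing example might be produced as follows. The wording template and the percentage formula are illustrative assumptions:

```python
def trend_summary(label: str, old_value: float, new_value: float,
                  window: str = "1 hour") -> str:
    """Summarize the change trend of a dynamic value as text,
    e.g. "women's clothing sales increased 30% in 1 hour"."""
    if old_value == 0:
        return f"{label}: no baseline to compare against"
    change = (new_value - old_value) / old_value * 100
    direction = "increased" if change >= 0 else "decreased"
    return f"{label} {direction} {abs(change):.0f}% in {window}"
```

A chart-type summary would feed the same old/new values into a trend line instead of a sentence.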
In the embodiment of the present application, the way the interactive control is generated is not limited. For example, interactive controls adapted to the interaction functions supported by the first screen 12 may be preset, and the display control device 11 may simply add these preset controls to the interactive interface. Alternatively, the interactive controls in the interactive interface may be generated dynamically and in real time according to the data to be displayed on the first screen 12. The dynamic, real-time generation works as follows:
For the first screen 12, the data to be displayed includes dynamic data and other data associated with it. For example, the dynamic data may be the real-time sales volume or sales amount of each commodity in the current period, while the other data may be the sales volume or sales amount over several historical periods, or dynamic graphics reflecting the changes in real-time sales. Because the dynamic data changes frequently, the first screen 12 may display it directly without special visualization processing. For the other data, the first screen 12 may display corresponding visualizations. To support better interaction with these visualizations, the display control device 11 may analyze the visualization attributes of the other data and generate, according to those attributes, an interactive control in the interactive interface, mainly used for interacting with the visualizations of the other data on the first screen 12.
The visualization attribute of data indicates the graphical form in which the data is suited to be displayed. For a given piece of data, the visualization attribute may be a time attribute, indicating that the data changes over time and is suited to a graphical form that can reflect change over time. The visualization attribute may be a position attribute, indicating that the data varies with geographical position and is suited to a graphical form that can reflect that variation. The visualization attribute may be a category attribute, indicating that the data belongs to one or more categories and may be displayed in a graphical form that distinguishes the different categories. In short, the visualization attribute of data reflects the graphical form suitable for visualizing the data, and in turn the control type suitable for interacting with that graphical form.
Based on the above, when the display control device 11 generates the interactive control in the interactive interface according to the visual attribute of other data, the corresponding target control type may be determined according to the visual attribute of other data; and generating an interactive control in the interactive interface according to the type of the target control. The target control type is a control type which is determined according to the visualization attributes of other data and is suitable for interacting with the graphic form adopted by the other data during visualization.
For example, if the visualization properties of the other data include a time property, then a control type that supports interaction by time may be determined as the target control type; if the visualization attributes of other data comprise position attributes, determining a control type supporting interaction according to the position as a target control type; and if the visualization attributes of other data comprise category attributes, determining the control type supporting interaction according to the category as the target control type.
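The attribute-to-control-type rules just listed amount to a lookup table. The sketch below is illustrative only; the control-type names are assumptions chosen to match the controls described in the following paragraphs.

```python
# Hypothetical mapping from visualization attributes to control types;
# the keys and names mirror the rules in the text but are not an API.
ATTRIBUTE_TO_CONTROL_TYPES = {
    "time":     ["timeline", "time_input_box"],  # interaction by time
    "position": ["map"],                         # interaction by position
    "category": ["selection"],                   # interaction by category
}

def target_control_types(visualization_attributes):
    """Return the target control types for data carrying the given
    visualization attributes, preserving attribute order."""
    types = []
    for attr in visualization_attributes:
        types.extend(ATTRIBUTE_TO_CONTROL_TYPES.get(attr, []))
    return types
```

For data with both a time attribute and a category attribute, this yields timeline, input-box, and selection control types, from which one control is then generated in the interactive interface.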
The control types supporting interaction by time may include a timeline control, which can be slid or clicked. By sliding or clicking the timeline control, the user can issue an interactive instruction to view the data or visualization effect corresponding to a certain time point, and the display control device 11 can display or highlight that data or visualization effect on the first screen 12. Besides the timeline control, the control types supporting interaction by time may include an input box control into which time information can be entered. By entering time point information in the input box control, the user can likewise issue an interactive instruction to view the data or visualization effect corresponding to that time point.
The control types supporting interaction by position may include a map control, which can be zoomed, rotated, or dragged. By zooming, rotating or dragging the map control, the user can issue an interactive instruction to view the data or visualization effect corresponding to a certain geographic area or location, and the display control device 11 can display or highlight that data or visualization effect on the first screen 12. The map control is a complete or partial electronic map on which the geographic areas or locations are displayed; the electronic map may be two-dimensional or three-dimensional.
The control types supporting interaction by category may include a selection control, which can be clicked, touched or long-pressed. Optionally, one selection control may correspond to one category; if there are multiple categories, multiple selection controls may be set on the interactive interface, corresponding to the different categories respectively. By clicking, touching or long-pressing a selection control, the user can issue an interactive instruction to view the data or visualization result under the category corresponding to that control. A selection control may also correspond to multiple categories, in which case the different categories can be distinguished by the different operation types the control supports; for example, a single click corresponds to category 1, a double click to category 2, a long press to category 3, and so on.
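The multi-category case — one selection control distinguishing categories by operation type — can be sketched as follows. This is a minimal model under assumed names; the class and its returned instruction format are illustrative, not the patent's implementation.

```python
class SelectionControl:
    """A selection control whose different operation types (single
    click, double click, long press) map to different categories."""

    def __init__(self, categories_by_operation):
        # e.g. {"click": "category 1", "double_click": "category 2", ...}
        self._categories = categories_by_operation

    def handle(self, operation):
        """Translate an operation into the interactive instruction to
        view data under the corresponding category."""
        category = self._categories.get(operation)
        if category is None:
            raise ValueError(f"unsupported operation: {operation}")
        # A real display control device would forward this instruction
        # to update the first screen's visualization result.
        return {"action": "view_category", "category": category}

ctrl = SelectionControl({"click": "category 1",
                         "double_click": "category 2",
                         "long_press": "category 3"})
```

A double click on this control would thus request the view for category 2, while operations the control does not support are rejected.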
In the above embodiments of the present application, how the first screen 12 displays the data that needs to be displayed is not limited. For example, the first screen 12 may perform a data display operation by itself, for example, it may obtain data to be displayed from a data source, and display the data to be displayed according to a data display model preset by a user. In addition to this, the first screen 12 may be controlled by the display control device 11 to perform data display. The process of the display control device 11 controlling the first screen 12 to display data is explained below with the embodiment shown in fig. 2 a.
As shown in fig. 2a, the display control device 11 may obtain, from a data source, the data to be displayed on the first screen 12, which includes dynamic data and other data. After obtaining this data, the display control device 11 may, on one hand, generate the summary information and the interactive control in the interactive interface according to the dynamic data and display the interactive interface on the second screen 13; on the other hand, it may visualize the other data to obtain an overall visualization result to be displayed on the first screen 12, where the overall visualization result includes the visualization result of the other data together with the dynamic data, and display that overall result on the first screen 12. As shown in fig. 2a, the first screen 12 includes at least two display areas: a dynamic data display area and a visualization result display area.
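The two-branch flow of fig. 2a — summary and controls to the second screen, overall visualization to the first screen — can be sketched as a single function. The function name, the dictionary layout, and the injected helpers are assumptions for illustration only.

```python
def prepare_screens(dynamic_data, other_data, summarize, make_controls, visualize):
    """Sketch of the fig. 2a flow: from data fetched at the data source,
    build the interactive interface for the second screen and the
    overall visualization result for the first screen."""
    interactive_interface = {              # shown on the second screen
        "summary": summarize(dynamic_data),
        "controls": make_controls(other_data),
    }
    first_screen_content = {               # shown on the first screen
        "dynamic_area": dynamic_data,                 # displayed as-is
        "visualization_area": visualize(other_data),  # processed result
    }
    return interactive_interface, first_screen_content

# Tiny worked example with stand-in helpers:
ui, big = prepare_screens(
    {"sales": 130},
    {"history": [100, 115, 130]},
    summarize=lambda d: f"sales now {d['sales']}",
    make_controls=lambda d: ["timeline"],
    visualize=lambda d: {"bar_chart": d["history"]},
)
```

The dynamic data lands unprocessed in the dynamic data display area, while only the associated other data passes through visualization, matching the division of the two display areas described above.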
In an alternative embodiment, the display control device 11 may analyze the visualization attributes of the other data; then, on one hand, generate an interactive control in the interactive interface according to those visualization attributes; on the other hand, determine, according to the visualization attributes, the visualization graphic form required for displaying the other data. According to the visualization graphic form, the data to be displayed on the first screen 12 is visualized to obtain a data interface, which is equivalent to the overall visualization result mentioned above. The data interface may then be displayed on the first screen 12; it includes the visualization result of the other data in the visualization graphic form, such as the histogram shown in fig. 2a. The histogram shown in fig. 2a is only one example of a visualization graphic form, and is not limited thereto.
Optionally, if the visualization attributes of the other data include a time attribute, a graphic form that can be presented over time is determined; for example, a two-dimensional or three-dimensional coordinate graph with time as one coordinate axis, which may be a scatter chart, a histogram, a line graph, or the like. If the visualization attributes include a position attribute, a graphic form that can be presented by location is determined, for example a data map. If the visualization attributes include a category attribute, a graphic form that can be displayed by category is determined, for example a scatter chart, histogram, line graph, or pie chart that can embody the categories.
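The attribute-to-graphic-form rules above are again a lookup; note that, unlike the control-type mapping, several attributes can share candidate chart forms. The names below are illustrative assumptions.

```python
# Hypothetical mapping from visualization attributes to candidate
# graphic forms, per the rules in the text.
ATTRIBUTE_TO_CHART_FORMS = {
    "time":     ["scatter", "histogram", "line"],  # time on one axis
    "position": ["data_map"],
    "category": ["scatter", "histogram", "line", "pie"],
}

def chart_forms_for(visualization_attributes):
    """Collect candidate graphic forms for data with the given
    attributes, preserving order and dropping duplicates (a form
    suitable for both time and category data is listed once)."""
    seen, forms = set(), []
    for attr in visualization_attributes:
        for form in ATTRIBUTE_TO_CHART_FORMS.get(attr, []):
            if form not in seen:
                seen.add(form)
                forms.append(form)
    return forms
```

Data with both time and category attributes, for example, yields scatter, histogram, and line charts from the time attribute plus the pie chart contributed by the category attribute.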
In combination with the aforementioned implementation of "generating an interactive control in the interactive interface according to a target control type", the display control device 11 may, in the process of generating an interactive control according to the target control type, also take into account the visualization graphic form that the first screen 12 uses to display the other data; that is, from the interactive controls under the target control type, an interactive control matching that visualization graphic form may be selected as the interactive control in the interactive interface. There may be multiple interactive controls under the target control type; selecting the one matching the visualization graphic form reduces the technical difficulty of interactively controlling that graphic form through the control, and thereby reduces development cost.
Further, during control development, multiple versions of the same interactive control can be developed for different screen types and/or specification sizes. Based on this, when the interactive control in the interactive interface is selected from the target control type, the type and/or specification size of the second screen 13 can also be taken into account, and the interactive control matching them selected; this helps ensure a reasonable style and layout for the interactive interface and improves the user experience.
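Selecting among several developed versions of a control by screen type and size can be sketched as a simple filter. The variant record fields (`screen_type`, `min_width`, `max_width`) and the variant names are assumptions, not part of the patent.

```python
def pick_control_variant(variants, screen_type, screen_width_px):
    """From several versions of the same interactive control, pick the
    one matching the second screen's type and specification size.
    Returns None if no variant matches (a caller would fall back to a
    default variant)."""
    for v in variants:
        if (v["screen_type"] == screen_type
                and v["min_width"] <= screen_width_px <= v["max_width"]):
            return v
    return None

# Two illustrative versions of a timeline control:
variants = [
    {"name": "timeline-phone",  "screen_type": "touch",
     "min_width": 0,   "max_width": 800},
    {"name": "timeline-tablet", "screen_type": "touch",
     "min_width": 801, "max_width": 2000},
]
```

On a 1200-pixel-wide touch screen this selects the tablet version; on a screen type for which no version was developed, it selects nothing and a default would be used instead.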
Further, besides the interaction control and the summary information, a visualization result corresponding to other data that needs to be displayed by the first screen 12 can be displayed on the second screen 13. Optionally, the display control device 11 may also perform visualization processing on other data to be displayed on the first screen 12 according to the specification size and the type of the second screen 13 to obtain a visualization result corresponding to the other data, and display the visualization result of the other data on the second screen 13. Alternatively, the display control device may adjust the above-obtained visualization result for the first screen 12 according to the specification size and the type of the second screen 13 to obtain a visualization result suitable for the second screen 13, and then display the visualization result on the second screen 13.
In the present embodiment, the implementation manner of displaying the visualization results of the interaction control, the summary information, and other data on the second screen 13 is not limited. For example, the visualization result of other data may be displayed as a first layer, and then the interaction control and the summary information may be displayed on the first layer as a second layer. Optionally, the interaction control and the summary information may be displayed in an edge region of the first layer, so that occlusion of a core region of the first layer may be reduced.
Further, on the basis that the interactive interface is displayed on the second screen 13 and the visualization result of the dynamic data and the other data is displayed on the first screen 12, the user can interactively control the first screen 12 through the interactive interface on the second screen 13. For example, as shown in fig. 2a, the user may issue an interactive operation to the interactive control on the second screen 13, optionally under the guidance of the summary information. The display control device 11 may, in response to the interactive operation issued to the interactive control on the second screen 13, perform the corresponding operation on the data interface on the first screen 12. In this embodiment, the interactive operation issued by the user is not limited, and may be any interactive operation supported by the second screen 13.
Here, the display control device 11 in the present embodiment may be implemented independently of the first screen 12 and the second screen 13, or may be integrated into one device with the first screen 12 or the second screen 13. In an alternative embodiment, the display control device 11 is implemented independently, the first screen 12 is an independent screen, and the second screen 13 is a screen on a terminal device used for remotely controlling the first screen 12.
The technical scheme provided by the embodiment of the application can be applied to a real-time data large-screen display scene with an interpreter or a host. As shown in fig. 2b, in this application scenario, the real-time data and the visualization results of other data associated with it need to be displayed on a large screen (corresponding to the first screen 12 described above). The interpreter or host needs to explain in combination with the real-time data on the large screen, and different visualization results need to be displayed to the audience through the large screen during the explanation. As shown in fig. 2b, to facilitate interaction with the large screen, the interpreter or host interactively controls the large screen through a handheld terminal. The handheld terminal may be a mobile phone, a tablet computer, a mini-notebook, or the like, whose screen (corresponding to the second screen 13 described above) is relatively small.
As shown in fig. 2b, the server device (equivalent to the display control device 11) may generate summary information of the interactive interface according to the real-time data that needs to be displayed on the large screen, and may generate a corresponding interactive control according to the visualization attribute of other data; and displaying the interactive interface with the summary information and the interactive control on a screen of a handheld terminal used by an interpreter or a host.
As shown in fig. 2b, the interpreter or host can view the interactive interface displayed by the handheld terminal, which includes summary information reflecting the change condition of the real-time data. Based on the summary information, the interpreter or host can quickly explain the real-time information displayed on the large screen, quickly determine what kind of interactive control the large screen requires, and quickly carry out that control. In this embodiment, the interpreter or host can accurately grasp the change of the real-time data without frequently walking in front of the large screen, can logically guide the audience through the real-time data displayed on the large screen by means of the summary information on the handheld terminal, and can timely present a better data visualization effect, which is favorable to improving the viewing experience of the audience.
In the present embodiment, the visualization result on the large screen is not limited, and may be any visualization result that supports interactive operation. Fig. 2c shows a visualized data interface of relational data on the large screen and the corresponding interactive interface on the handheld terminal. The large screen focuses on the visualization effect; the interactive interface on the handheld-terminal side includes a filter button through which the graphic form on the large screen can be controlled and adjusted, and also presents details and changes of the real-time data on the large screen. Fig. 2d shows a visualized data interface of spatial data on the large screen and the corresponding interactive interface on the handheld terminal. Again the large screen focuses on the visualization effect, and the interactive interface on the handheld-terminal side includes a filter button through which the graphic form on the large screen can be controlled and adjusted and through which different geographical areas, for example different provinces, can be selected. Besides the filter button, the interactive interface presents details and changes of the real-time data on the large screen, as well as a carrier of the spatial data corresponding to the different geographic areas, such as a map, which the user can zoom and rotate. Note that figs. 2c and 2d are intended to highlight, through the screen shapes in a specific scene, the correspondence between the data interface displayed on the large screen and the interactive interface on the handheld-terminal side, and the difference in interface effect; the specific content displayed on the two screens is not the focus, so the related content is blurred in figs. 2c and 2d.
In this embodiment, on the basis of multi-terminal linkage between the large data screen and the handheld small screen, the real-time data displayed on the large screen is adaptively displayed on the small screen, and the host or presenter of the large data screen plays a role of logical guidance. This reduces the difficulty of observing changes in the real-time data, realizes a closed loop of human-computer interaction between the user and the large data screen, and makes the understanding and reading of a large, nested data screen more efficient and accurate.
Fig. 3 is a schematic structural diagram of another display control system according to an exemplary embodiment of the present application. As shown in fig. 3, the system includes: a display control device 31, a first screen 32 and a second screen 33. The display control device 31 is in communication connection with the first screen 32 and the second screen 33, and can control the first screen 32 and the second screen 33 to display data. Further, the display control device 31 may control the second screen 33 to display an interactive interface through which the first screen 32 can be interactively controlled.
Here, the display control device 31 and the first screen 32 or the second screen 33 may be connected wirelessly or by wire.
In the present embodiment, the product form of the display control device 31 is not limited, and any computer device with certain computing, storing and communicating capabilities can be used as the display control device 31 in the embodiment of the present application. For example, the display control device 31 may be a terminal device such as a notebook computer, a desktop computer, a smart phone, or an IOT device, an edge computing device such as an intelligent street lamp, a camera, or a traffic monitoring device, a server device such as a conventional server, a cloud server, a server array, or a data center, an ARM chip, and some processing chips or modules implemented based on an FPGA or a CPLD. In fig. 3, the display control device 31 is illustrated as a server, but is not limited thereto.
In the present embodiment, the product forms of the first screen 32 and the second screen 33 are not limited; each may be, for example, a CRT display screen or an LCD display screen. Similarly, the specification sizes of the first screen 32 and the second screen 33 are not limited: they may be the same, or the specification size of the first screen 32 may be larger or smaller than that of the second screen 33.
In the present embodiment, the first screen 32 displays data, but the manner in which it does so is not limited; the data may be displayed in the form of graphics, charts, and the like. The data display effect may be static or dynamic; either way, the user can interact with it.
In the present embodiment, the display control apparatus 31 may acquire data displayed by the first screen 32. For example, the display control device 31 may acquire data displayed by the first screen 32 from a data source; alternatively, the data displayed by the first screen 32 may be directly acquired from the first screen 32. In this embodiment, the data displayed on the first screen 32 is not limited, and may or may not include dynamic data.
After acquiring the data displayed on the first screen 32, the display control device 31 may generate an interaction control in the interaction interface according to the acquired data. The interactive control is used for the user to perform interactive control on the first screen 32, where the interactive control is mainly interactive control initiated on the content displayed on the first screen 32, but is not limited to this. The interactive controls may be in any control form, and may be, for example, buttons, icon lists, text labels, input boxes, menus, picture boxes, scroll bars, and the like. In addition, the number of the interactive controls can be one or more.
In this embodiment, the style and layout of the interactive interface are not limited, and all interface styles and layouts that include the above interactive controls are applicable to the embodiments of the present application. It should be noted that, besides the interactive controls, the interactive interface may also include other content, such as a visualization result of the data displayed on the first screen 32.
After obtaining the interactive interface, the display control device 31 may display it on the second screen 33. The user may interact with the second screen 33, that is, operate on the interactive interface it displays, for example by initiating an operation on an interactive control; the operation may be a click, a double click, a touch, a long press, or a mouse hover, determined by the human-computer interaction manner supported by the second screen 33. By means of the interactive control, the user can issue interactive instructions to the first screen 32, thereby realizing interaction with it. This solves the problem that the user cannot, or cannot conveniently, interactively control the first screen 32 directly, facilitates such control, and improves the efficiency of interaction with the first screen 32.
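The routing just described — an operation on a second-screen control becoming an interactive instruction for the first screen — can be sketched as a small dispatcher. The class, its method names, and the instruction format are illustrative assumptions.

```python
class InteractionDispatcher:
    """Sketch of the display control device's routing role: operations
    on second-screen controls are translated into interactive
    instructions applied to the first screen."""

    def __init__(self):
        self._handlers = {}        # control id -> instruction builder
        self.first_screen_log = [] # instructions applied to screen 32

    def register(self, control_id, handler):
        """Associate an interactive control with its instruction builder."""
        self._handlers[control_id] = handler

    def on_operation(self, control_id, operation):
        """Handle a click/touch/long-press/etc. on a second-screen
        control; a real device would now update the first screen."""
        instruction = self._handlers[control_id](operation)
        self.first_screen_log.append(instruction)
        return instruction

dispatcher = InteractionDispatcher()
dispatcher.register("filter_button",
                    lambda op: {"target": "first_screen", "op": op})
```

A click on the registered filter button thus produces one instruction recorded against the first screen, independent of how the second screen rendered the control.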
In the embodiment of the present application, the generation manner of the interactive control is not limited. For example, interactive controls adapted to the interactive functions supported by the first screen 32 may be preset; based on this, the display control device 31 may directly add the preset interactive controls to the interactive interface. Alternatively, the interactive control in the interactive interface can be dynamically generated in real time according to the data to be displayed on the first screen 32. The following describes the manner of generating interactive controls dynamically and in real time:
the display control device 31 may analyze the visualization property of the data displayed on the first screen 32; generating an interactive control in an interactive interface according to the visual attribute of the data; wherein, the interactive control is used for carrying out interactive operation on the visualization result of the data on the first screen 32. For a description of the visualization property of the data, reference may be made to the foregoing embodiments, which are not described in detail herein.
Further optionally, an embodiment of the display control device 31 generating an interactive control in the interactive interface according to the visualization attribute of the data includes: determining a corresponding target control type according to the visualization attribute of the data; and generating the interactive control in the interactive interface according to the target control type. The target control type is a control type determined according to the visualization attribute of the data displayed on the first screen 32 and suited to interacting with the graphic form adopted by the data during visualization.
Optionally, an implementation manner of the generating the interactive control in the interactive interface according to the target control type includes: randomly selecting one or more interactive controls from the interactive controls under the target control type as interactive controls in an interactive interface; or, one or more interactive controls can be selected from the interactive controls under the target control type according to attribute information such as the size or style of the interactive controls, and the selected interactive controls can be used as the interactive controls in the interactive interface.
Optionally, another embodiment of generating the interactive control in the interactive interface according to the target control type includes: selecting the interactive control in the interactive interface from the interactive controls under the target control type according to at least one of the following: the visualization graphic form used by the first screen 32 to display the data, the type of the second screen 33, and the specification size of the second screen 33. In this optional implementation, referring to the visualization graphic form used by the first screen 32 and the type and/or specification size of the second screen 33 when selecting the interactive control from the target control type helps select a more appropriate interactive control, makes the style and layout of the interactive interface more reasonable and attractive, and improves the user experience.
In the embodiment of the present application, how the first screen 32 displays data is not limited. The first screen 32 may perform a data display operation by itself, for example, it may obtain data to be displayed from a data source, and display the data to be displayed according to a data display model preset by a user. Alternatively, the first screen 32 may perform data display tasks under the control of other devices. In addition to this, the display control device 31 can also control the first screen 32 to perform data display. For example, the display control device 31 may also acquire data to be displayed on the first screen 32, and perform visualization processing on the acquired data to obtain a data interface; the data interface is then displayed on the first screen 32.
In some application scenarios, the data displayed by the first screen 32 includes dynamic data. Based on this, the display control device 31 may also acquire the dynamic data displayed by the first screen 32 before displaying the interactive interface on the second screen 33, and generate summary information in the interactive interface according to the dynamic data; the summary information enables the user to learn the change condition of the dynamic data presented on the first screen 32. In other words, in this embodiment the interactive interface includes not only the interactive control but also summary information. On seeing the interactive interface displayed on the second screen 33, the user can quickly learn, from the summary information, how the dynamic data displayed on the first screen 32 is changing, quickly determine what kind of interactive control the first screen 32 requires, and issue the corresponding interactive instruction through the interactive control, thereby quickly controlling the first screen 32 and further improving the interaction efficiency. For the generation process of the summary information, reference may be made to the foregoing embodiments, which are not repeated here.
In the embodiments, the display control device generates an interaction control in the interaction interface according to the data or dynamic data displayed on the first screen, and may perform interaction control on the visualized content displayed on the first screen based on the interaction control. In addition, in some embodiments, the display control device may further generate an interactive control in the interactive interface according to an interactive function supported by the first screen itself, where the interactive control is mainly used for a user to initiate various controls on the first screen itself, for example, to control the first screen to be turned off and on, and to adjust the brightness and the resolution of the first screen.
Optionally, the interactive control supported by the first screen itself may be directly added to the interactive interface, and the interactive interface is displayed on the second screen, so that the user can perform various interactive controls on the first screen itself through the interactive controls on the second screen.
In an application scenario, the first screen may be the screen of a video playing device, for example a television, and the second screen may be the screen of a mobile terminal, for example a smartphone. An interactive interface required for playback control of the television may be generated, controls corresponding to the television added to it, and the interactive interface sent to the smartphone. The user can then open the interactive interface on the smartphone, that is, display it on the smartphone screen, and use the interactive controls in it to perform various playback controls on the television, such as adjusting the playing progress, changing the channel, pausing, turning the television off or on, adjusting the playing volume, or adjusting the brightness and resolution of the television screen, which greatly facilitates playback control of the television.
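Building the television's interactive interface from the controls it actually supports can be sketched as a filter over a candidate control list. The control names and the function are illustrative assumptions, not a real remote-control API.

```python
# Hypothetical candidate controls covering the playback functions
# enumerated in the text.
PLAYBACK_CONTROLS = [
    "progress_slider", "channel_selector", "pause", "power",
    "volume_slider", "brightness_slider", "resolution_selector",
]

def build_tv_interface(supported_functions):
    """Add to the interactive interface only those controls matching
    the interactive functions the television supports, preserving the
    candidate order for a stable layout."""
    return [c for c in PLAYBACK_CONTROLS if c in supported_functions]

# A television supporting only pause, power, and volume control:
interface = build_tv_interface({"pause", "power", "volume_slider"})
```

The resulting interface would then be sent to the smartphone and rendered on its screen, each control issuing the corresponding playback instruction to the television.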
In the above embodiment, by generating an interactive interface with an interactive control and displaying the interactive interface on the second screen, a user can interactively control, through the second screen, the first screen on which streaming media content is played. Moreover, depending on the physical meanings of the interactive controls, the user can interactively control the content displayed by the first screen, the first screen itself, or both. In addition, the technical scheme provided by the embodiment of the application can also be applied to non-streaming-media playing scenarios. For example, any hardware device or module capable of performing a task may be controlled by generating an interactive interface with interactive controls: a control interface with different control types can be generated based on the current task type of the hardware device or module to be controlled, and displayed on a screen, so that the user can control the controlled hardware device or module through the different control types in the control interface, for example adjusting the task execution speed or the task execution time, stopping, pausing, or resuming task execution, viewing the task execution result, and the like.
Fig. 4 is a flowchart illustrating a display control method according to an exemplary embodiment of the present application. As shown in fig. 4, the method includes:
40. acquiring data to be displayed on a first screen, wherein the data comprises dynamic data;
41. generating summary information in an interactive interface according to the dynamic data, wherein the interactive interface further comprises an interactive control;
42. displaying the interactive interface on a second screen, so that the summary information guides a user to interactively control the first screen through the interactive control.
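Steps 40 to 42 can be sketched as follows. This is a minimal illustrative sketch, not the application's implementation; the names `InteractiveInterface`, `build_interface`, and the `"dynamic"` key are assumptions made for the example.

```python
# Hypothetical sketch of steps 40-42; all names are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class InteractiveInterface:
    summary: str                                   # summary info from dynamic data
    controls: list = field(default_factory=list)   # interactive controls


def build_interface(data: dict) -> InteractiveInterface:
    """Steps 40-41: take the data bound for the first screen and build the
    interactive interface to be shown on the second screen."""
    dynamic = data.get("dynamic", {})              # step 40: extract dynamic data
    # step 41: derive summary information from the dynamic data
    summary = ", ".join(f"{k}={v}" for k, v in dynamic.items())
    return InteractiveInterface(summary=summary, controls=["timeline"])


# Step 42 would then render this interface on the second screen.
iface = build_interface({"dynamic": {"sales": 1024}})
print(iface.summary)  # sales=1024
```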
In some optional embodiments, step 41 of generating summary information in the interactive interface according to the dynamic data includes: using the dynamic data directly as the summary information in the interactive interface; or generating a change trend of the dynamic data as the summary information in the interactive interface.
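The two summary strategies in this embodiment — using the dynamic data directly, or reporting its change trend — might look like the following sketch. The function name, the `mode` parameter, and the output strings are illustrative assumptions.

```python
def summarize(values, mode="direct"):
    """Generate summary information from a series of dynamic-data samples:
    either the latest value as-is, or the trend between the last two samples.
    Names and string formats are illustrative, not from the application."""
    if mode == "direct":
        return str(values[-1])          # dynamic data used directly as the summary
    # trend mode: compare the last two samples
    delta = values[-1] - values[-2]
    if delta > 0:
        return f"rising (+{delta})"
    if delta < 0:
        return f"falling ({delta})"
    return "flat"


print(summarize([90, 120]))             # 120
print(summarize([90, 120], "trend"))    # rising (+30)
```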
In some optional embodiments, the method of this embodiment further includes: analyzing visual attributes of other data associated with the dynamic data; generating an interactive control in the interactive interface according to the visual attributes of other data; the interactive control is used for carrying out interactive operation on the visualization result of other data.
Further optionally, the method of this embodiment further includes: determining the visualization graph form required for displaying the other data according to the visualization attributes of the other data; performing visualization processing on the other data according to the visualization graph form to obtain a data interface; and displaying the data interface on the first screen, wherein the data interface comprises the visualization result of the other data in the visualization graph form.
Optionally, generating an interactive control in the interactive interface according to the visualization attributes of the other data includes: determining a corresponding target control type according to the visualization attributes of the other data; and generating an interactive control in the interactive interface according to the target control type.
In an optional embodiment, determining the corresponding target control type according to the visualization attributes of the other data includes: if the visualization attributes of the other data comprise a time attribute, determining a control type supporting interaction by time as the target control type; if the visualization attributes of the other data comprise a position attribute, determining a control type supporting interaction by position as the target control type; and if the visualization attributes of the other data comprise a category attribute, determining a control type supporting interaction by category as the target control type.
Optionally, the control types supporting interaction by time may include a timeline control that can be slid or clicked; the control types supporting interaction by position may include a map control that can be zoomed, rotated, or dragged; and the control types supporting interaction by category may include a selection control that can be clicked, touched, or long-pressed.
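The attribute-to-control-type rules above amount to a simple mapping, which can be sketched as follows. The dictionary keys and control-type names are illustrative assumptions, not identifiers from the application.

```python
# Illustrative mapping from visualization attributes to target control types,
# mirroring the time/position/category rules above; names are assumptions.
ATTRIBUTE_TO_CONTROL = {
    "time": "timeline",       # slidable/clickable timeline control
    "position": "map",        # zoomable/rotatable/draggable map control
    "category": "selection",  # clickable/touchable/long-pressable selection control
}


def target_control_types(visual_attributes):
    """Return the target control types for whichever of the time, position,
    and category attributes the data carries."""
    return [ATTRIBUTE_TO_CONTROL[a] for a in visual_attributes
            if a in ATTRIBUTE_TO_CONTROL]


print(target_control_types(["time", "category"]))  # ['timeline', 'selection']
```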
In an optional embodiment, generating an interactive control in the interactive interface according to the target control type includes: selecting, from the interactive controls under the target control type, an interactive control matching the visualization graph form as the interactive control in the interactive interface.
In an optional embodiment, determining the visualization graph form required for displaying the other data according to the visualization attributes of the other data includes: if the visualization attributes of the other data comprise a time attribute, determining a graph form that can be displayed by time; if the visualization attributes of the other data comprise a position attribute, determining a graph form that can be displayed by position; and if the visualization attributes of the other data comprise a category attribute, determining a graph form that can be displayed by category.
In an optional embodiment, the method of this embodiment further includes: in response to an interactive operation issued by the second screen for the interactive control, performing a corresponding operation on the data interface on the first screen.
In an optional embodiment, the interactive interface comprises an information area and an interactive control area, the summary information is located in the information area, and the interactive control is located in the interactive control area.
Optionally, the first screen is larger in specification size than the second screen. Further, the second screen is the screen of a terminal device used to remotely control the first screen, and the first screen is an independent screen.
For detailed description of each step in the embodiment of the method, reference may be made to the foregoing embodiment of the system, which is not described herein again.
In this embodiment, an interactive interface is generated for the first screen and displayed on the second screen, and the interactive interface includes an interactive control, so that a user can interactively control the first screen from the second screen through the interactive control, which facilitates interactive control of the first screen and helps improve interaction efficiency. Furthermore, the interactive interface may also include summary information generated from the dynamic data to be displayed on the first screen; based on the summary information, a user can quickly read the dynamic information on the first screen without tracking the changes of the dynamic data there, which further facilitates interactive control of the first screen and improves the efficiency of interacting with it.
Fig. 5 is a flowchart illustrating an interaction control method according to an exemplary embodiment of the present application. As shown in fig. 5, the method includes:
50. displaying an interactive interface on a second screen, wherein the interactive interface comprises summary information and an interactive control;
51. in response to an interactive operation issued through the interactive control, performing a corresponding operation on the data interface on the first screen; wherein the summary information is generated according to dynamic data displayed in the data interface and is used to guide the user to issue interactive operations through the interactive control.
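Steps 50 and 51 — showing the interface on the second screen and relaying control operations to the data interface on the first screen — can be sketched as below. The class and method names are illustrative assumptions; a real implementation would send the operation over a network link between the two devices.

```python
# Minimal sketch of the interaction control method; all names are assumptions.
class FirstScreen:
    """Holds the data interface shown on the first screen."""
    def __init__(self):
        self.state = {"time_range": None}

    def apply(self, operation, value):
        # step 51: perform the corresponding operation on the data interface
        self.state[operation] = value


class SecondScreen:
    """Displays the interactive interface and forwards control events."""
    def __init__(self, first_screen):
        self.first = first_screen

    def on_control_event(self, operation, value):
        # relay the interactive operation issued through an interactive control
        self.first.apply(operation, value)


tv = FirstScreen()
phone = SecondScreen(tv)
phone.on_control_event("time_range", ("2019-01", "2019-06"))
print(tv.state["time_range"])  # ('2019-01', '2019-06')
```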
Optionally, the interactive controls are generated from visualization properties of other data displayed in the data interface. For the generation manner of the interaction control and the summary information, reference may be made to the description in the foregoing system embodiment, and details are not described here.
In this embodiment, the interactive interface corresponding to the first screen is displayed on the second screen, so that a user can interactively control the first screen from the second screen through the interactive control, which facilitates interactive control of the first screen and helps improve interaction efficiency. Furthermore, the interactive interface may also include summary information generated from the dynamic data to be displayed on the first screen; based on the summary information, a user can quickly read the dynamic information on the first screen without tracking the changes of the dynamic data there, which further facilitates interactive control of the first screen and improves the efficiency of interacting with it.
Fig. 6 is a flowchart illustrating another display control method according to an exemplary embodiment of the present application. As shown in fig. 6, the method includes:
60. acquiring data displayed by a first screen;
61. generating an interactive control in an interactive interface according to the data;
62. and displaying the interactive interface on a second screen so that a user can interactively control the first screen through the interactive control.
In an optional embodiment, generating an interaction control in the interaction interface according to the data includes: analyzing the visual attribute of the data; generating an interactive control in an interactive interface according to the visual attribute of the data; the interactive control is used for carrying out interactive operation on the visualization result of the data.
In an optional embodiment, generating an interactive control in the interactive interface according to the visualization attributes of the data includes: determining a corresponding target control type according to the visualization attributes of the data; and generating an interactive control in the interactive interface according to the target control type.
Optionally, generating an interactive control in the interactive interface according to the target control type includes: selecting the interactive control in the interactive interface from the interactive controls under the target control type according to at least one of: the visualization graph form used to display the data on the first screen, the type of the second screen, and the specification size of the second screen.
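Selecting a concrete control under the target control type from the graph form and the second screen's type and size might look like the following. The candidate control names, the `"touch"` screen type, and the 7-inch threshold are all illustrative assumptions, not values from the application.

```python
def pick_control(control_type, graph_form=None, screen_type=None,
                 screen_inches=None):
    """Choose a concrete interactive control under the target control type,
    using the visualization graph form on the first screen and the second
    screen's type and specification size. All names/thresholds are assumed."""
    if control_type == "timeline":
        # small touch screens favor a slider; larger screens a clickable axis
        if screen_type == "touch" and screen_inches and screen_inches < 7:
            return "slider-timeline"
        return "clickable-timeline"
    if control_type == "map" and graph_form == "map":
        return "draggable-map"
    return "default-" + control_type


print(pick_control("timeline", screen_type="touch", screen_inches=6.1))
# slider-timeline
```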
Optionally, before displaying the interactive interface on the second screen, the method of this embodiment may further include: acquiring dynamic data in the data; and generating summary information in the interactive interface according to the dynamic data, wherein the summary information enables a user to learn about the changes of the dynamic data on the first screen.
For detailed description of each step in the embodiment of the method, reference may be made to the foregoing embodiment of the system, which is not described herein again.
In this embodiment, for a first screen, an interactive interface is generated for the first screen, the interactive interface is displayed on a second screen, and the interactive interface includes an interactive control, so that a user can interactively control the first screen on the second screen through the interactive control, the user can interactively control the first screen conveniently, and the improvement of the interactive efficiency is facilitated.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 40 to 42 may be device a; for another example, the execution subject of steps 40 and 41 may be device a, and the execution subject of step 42 may be device B; and so on.
In addition, some of the flows described in the above embodiments and drawings include multiple operations in a specific order, but it should be clearly understood that these operations may be executed out of the order presented herein or in parallel. Sequence numbers such as 40 and 41 are merely used to distinguish different operations and do not by themselves represent any execution order. Moreover, the flows may include more or fewer operations, which may be executed sequentially or in parallel. It should also be noted that descriptions such as "first" and "second" herein are used to distinguish different messages, devices, modules, and the like; they do not represent a sequence, nor do they require that the "first" and "second" items be of different types.
Fig. 7a is a schematic structural diagram of a display control device according to an exemplary embodiment of the present application. As shown in fig. 7a, the apparatus comprises: a memory 74a, a processor 75a, and a communication component 76 a.
The memory 74a is used to store computer programs and may be configured to store other various data to support operations on the display control apparatus. Examples of such data include instructions, messages, pictures, videos, etc. for any application or method operating on the display control device.
The memory 74a may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 75a, coupled to the memory 74a, is configured to execute the computer program in the memory 74a to: acquire data to be displayed on a first screen, wherein the data comprises dynamic data; generate summary information in an interactive interface according to the dynamic data, wherein the interactive interface further comprises an interactive control; and display the interactive interface on a second screen, so that the summary information guides a user to interactively control the first screen through the interactive control.
In some optional embodiments, when generating the summary information in the interactive interface according to the dynamic data, the processor 75a is specifically configured to: use the dynamic data directly as the summary information in the interactive interface; or generate a change trend of the dynamic data as the summary information in the interactive interface.
In some optional embodiments, the processor 75a is further configured to: analyzing visual attributes of other data associated with the dynamic data; generating an interactive control in the interactive interface according to the visual attributes of other data; the interactive control is used for carrying out interactive operation on the visualization result of other data.
Further optionally, the processor 75a is further configured to: determining a visual graph form required for displaying other data according to the visual attributes of the other data; carrying out visualization processing on the data according to the visualization graph form to obtain a data interface; displaying a data interface on a first screen; wherein, the data interface comprises the visualization result of other data under the visualization graph form.
Optionally, when the processor 75a generates an interaction control in the interaction interface, it is specifically configured to: determining a corresponding target control type according to the visual attributes of other data; and generating an interactive control in the interactive interface according to the type of the target control.
In an alternative embodiment, when determining the corresponding target control type, the processor 75a is specifically configured to: if the visualization attributes of other data comprise time attributes, determining a control type supporting interaction according to time as a target control type; if the visualization attributes of other data comprise position attributes, determining a control type supporting interaction according to the position as a target control type; and if the visualization attributes of other data comprise category attributes, determining the control type supporting interaction according to the category as the target control type.
Optionally, the control types supporting interaction by time may include a timeline control that can be slid or clicked; the control types supporting interaction by position may include a map control that can be zoomed, rotated, or dragged; and the control types supporting interaction by category may include a selection control that can be clicked, touched, or long-pressed.
In an optional embodiment, when the processor 75a generates an interactive control in the interactive interface, it is specifically configured to: and selecting an interactive control matched with the visual graph form from the interactive controls in the target control type as the interactive control in the interactive interface.
In an alternative embodiment, the processor 75a, when determining the visualization graphical modality required for displaying the other data, is specifically configured to: if the visual attributes of other data comprise time attributes, determining a graphic form which can be displayed according to time; if the visualization attributes of other data comprise position attributes, determining a graphic form which can be displayed according to the position; if the visualization attributes of other data comprise category attributes, determining the graphic forms capable of being displayed in a classified mode.
In an alternative embodiment, the processor 75a is further configured to: and responding to the interactive operation sent by the second screen aiming at the interactive control, and carrying out corresponding operation on the data interface on the first screen.
In an optional embodiment, the interactive interface comprises an information area and an interactive control area, the summary information is located in the information area, and the interactive control is located in the interactive control area.
Optionally, the first screen is larger in specification size than the second screen. Further, the second screen is the screen of a terminal device used to remotely control the first screen, and the first screen is an independent screen.
Further, as shown in fig. 7a, the display control device further includes: a display 77a, a power supply component 78a, an audio component 79a, and the like. Only some components are schematically shown in fig. 7a; this does not mean that the display control device includes only the components shown in fig. 7 a. In addition, the components within the dashed box in fig. 7a are optional rather than mandatory, and may be determined according to the product form of the display control device. The display control device of this embodiment may be implemented as a terminal device such as a desktop computer, a notebook computer, a smartphone, or an IoT device, or as a server-side device such as a conventional server, a cloud server, or a server array. If implemented as a terminal device such as a desktop computer, notebook computer, or smartphone, the display control device may include the components within the dashed box in fig. 7a; if implemented as a server-side device such as a conventional server, cloud server, or server array, it may omit the components within the dashed box in fig. 7a.
Specifically, the display 77a may be a display independent of the first screen and the second screen, may be implemented as the first screen, or may be implemented as the second screen.
In addition to the display control device described above, embodiments of the present application also provide a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform acts comprising: acquiring data to be displayed on a first screen, wherein the data comprises dynamic data; generating summary information in an interactive interface according to the dynamic data, wherein the interactive interface further comprises an interactive control; and displaying the interactive interface on a second screen, so that the summary information guides a user to interactively control the first screen through the interactive control.
In addition to the above operations, the processor executing the computer program may also implement other operations, which may be specifically referred to the description in the foregoing embodiments and will not be described herein again.
Fig. 7b is a schematic structural diagram of a display control device according to an exemplary embodiment of the present application. As shown in fig. 7b, the apparatus comprises: a memory 74b, a processor 75b, and a communication component 76 b.
A memory 74b for storing a computer program and may be configured to store other various data to support operations on the display control apparatus. Examples of such data include instructions, messages, pictures, videos, etc. for any application or method operating on the display control device.
The memory 74b may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 75b, coupled to the memory 74b, for executing computer programs in the memory 74b for: acquiring data displayed by a first screen; generating an interactive control in an interactive interface according to the data; and displaying an interactive interface on the second screen so that a user can interactively control the first screen through the interactive control.
In an optional embodiment, when the processor 75b generates the interactive control in the interactive interface according to the data, it is specifically configured to: analyzing the visual attribute of the data; generating an interactive control in an interactive interface according to the visual attribute of the data; the interactive control is used for carrying out interactive operation on the visualization result of the data.
In an optional embodiment, when generating an interactive control in the interactive interface, the processor 75b is specifically configured to: determine a corresponding target control type according to the visualization attributes of the data; and generate an interactive control in the interactive interface according to the target control type.
Optionally, when the processor 75b generates an interactive control in the interactive interface according to the target control type, the processor is specifically configured to: and selecting an interactive control in the interactive interface from the interactive controls under the target control type according to at least one information of the visual graphic form used by the first screen display data, the type of the second screen and the specification and size of the second screen.
Optionally, the processor 75b is further configured to: acquiring dynamic data in the data before the interactive interface is displayed on the second screen; and generating summary information in the interactive interface according to the dynamic data, wherein the summary information can be used for a user to know the change condition of the dynamic data on the first screen.
Further, as shown in fig. 7b, the display control device further includes: a display 77b, a power supply component 78b, an audio component 79b, and the like. Only some components are schematically shown in fig. 7b; this does not mean that the display control device includes only the components shown in fig. 7 b. In addition, the components within the dashed box in fig. 7b are optional rather than mandatory, and may be determined according to the product form of the display control device. The display control device of this embodiment may be implemented as a terminal device such as a desktop computer, a notebook computer, a smartphone, or an IoT device, or as a server-side device such as a conventional server, a cloud server, or a server array. If implemented as a terminal device such as a desktop computer, notebook computer, or smartphone, the display control device may include the components within the dashed box in fig. 7b; if implemented as a server-side device such as a conventional server, cloud server, or server array, it may omit the components within the dashed box in fig. 7b.
Specifically, the display 77b may be a display independent of the first screen and the second screen, may be implemented as the first screen, or may be implemented as the second screen.
In addition to the display control device described above, embodiments of the present application also provide a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform acts comprising: acquiring data displayed by a first screen; generating an interactive control in an interactive interface according to the data; and displaying an interactive interface on the second screen so that a user can interactively control the first screen through the interactive control.
In addition to the above operations, the processor executing the computer program may also implement other operations, which may be specifically referred to the description in the foregoing embodiments and will not be described herein again.
Fig. 8 is a schematic structural diagram of an interactive control device according to an exemplary embodiment of the present application. As shown in fig. 8, the apparatus includes: a memory 84 and a processor 85.
The memory 84 is used to store computer programs and may be configured to store other various data to support operations on the interactive control device. Examples of such data include instructions, messages, pictures, videos, etc. for any application or method operating on the interactive control device.
The memory 84 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 85, coupled to the memory 84, is configured to execute the computer program in the memory 84 to: display an interactive interface on a second screen, wherein the interactive interface comprises summary information and an interactive control; and in response to an interactive operation issued through the interactive control, perform a corresponding operation on the data interface on the first screen; wherein the summary information is generated according to dynamic data displayed in the data interface and is used to guide a user to issue interactive operations through the interactive control.
Further, as shown in fig. 8, the interactive control apparatus further includes: a display 87, a communications component 86, a power component 88, an audio component 89, and the like. Only some of the components are schematically shown in fig. 8, and it is not meant that the interactive control device comprises only the components shown in fig. 8. In addition, the components within the dashed box in fig. 8 are optional components, not necessary components, and may be determined according to the product form of the interactive control device. The interaction control device of this embodiment may be implemented as a terminal device such as a desktop computer, a notebook computer, a smart phone, or an IOT device, or may be a display control device such as a conventional server, a cloud server, or a server array. If the interactive control device of this embodiment is implemented as a terminal device such as a desktop computer, a notebook computer, a smart phone, etc., the interactive control device may include components within a dashed line frame in fig. 8; if the interactive control device of this embodiment is implemented as a display control device such as a conventional server, a cloud server, or a server array, the components in the dashed box in fig. 8 may not be included.
In addition to the interactive control device described above, embodiments of the present application also provide a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform acts comprising: displaying an interactive interface on a second screen, wherein the interactive interface comprises summary information and an interactive control; and in response to an interactive operation issued through the interactive control, performing a corresponding operation on the data interface on the first screen; wherein the summary information is generated according to dynamic data displayed in the data interface and is used to guide a user to issue interactive operations through the interactive control.
In addition to the above operations, the processor executing the computer program may also implement other operations, which may be specifically referred to the description in the foregoing embodiments and will not be described herein again.
The communication components of figs. 7a to 8 described above are configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, a 2G, 3G, 4G/LTE, or 5G mobile communication network, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may further be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and the like.
The displays in figs. 7a to 8 described above include screens, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply components of fig. 7 a-8 described above provide power to the various components of the device in which the power supply components are located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio components of fig. 7 a-8 described above may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer-readable medium does not include a transitory computer-readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
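Purely for illustration (the following sketch is not part of the disclosure or the claims), the display control flow described in this application — generating summary information from dynamic data and choosing an interactive control type from the visualization attributes of the data — could be sketched as follows; all function names, dictionary keys, and return values are hypothetical:

```python
# Illustrative sketch of the described display-control flow; every name here
# is hypothetical and chosen for readability, not taken from the disclosure.

def select_control_type(visual_attributes):
    """Map visualization attributes to an interactive control type:
    time -> timeline, position -> map, category -> selector."""
    if "time" in visual_attributes:
        return "timeline"   # a timeline control that can be slid or clicked
    if "position" in visual_attributes:
        return "map"        # a map control that can be zoomed, rotated, or dragged
    if "category" in visual_attributes:
        return "selector"   # a selection control that can be clicked or long-pressed
    return "generic"

def summarize(dynamic_data):
    """Generate summary information from dynamic data: either the latest
    value directly, or a change trend derived from the series."""
    if len(dynamic_data) < 2:
        return {"latest": dynamic_data[-1] if dynamic_data else None, "trend": "flat"}
    delta = dynamic_data[-1] - dynamic_data[0]
    trend = "rising" if delta > 0 else "falling" if delta < 0 else "flat"
    return {"latest": dynamic_data[-1], "trend": trend}

def build_interactive_interface(dynamic_data, other_data_attributes):
    """Assemble the second-screen interface: an information area holding the
    summary and an interactive control area holding the control."""
    return {
        "information_area": summarize(dynamic_data),
        "interaction_area": {"control": select_control_type(other_data_attributes)},
    }
```

Under these assumptions, `build_interactive_interface([3, 5, 9], {"time"})` would pair a "rising" trend summary with a timeline control on the second screen.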

Claims (27)

1. A display control method, comprising:
acquiring data to be displayed on a first screen, the data comprising dynamic data;
generating summary information in an interactive interface according to the dynamic data, the interactive interface further comprising an interactive control; and
displaying the interactive interface on a second screen, wherein the summary information guides a user to perform interactive control over the first screen through the interactive control.
2. The method of claim 1, wherein generating summary information in an interactive interface according to the dynamic data comprises:
directly using the dynamic data as the summary information in the interactive interface; or
generating a change trend of the dynamic data as the summary information in the interactive interface.
3. The method of claim 1, further comprising:
analyzing visualization attributes of other data associated with the dynamic data; and
generating an interactive control in the interactive interface according to the visualization attributes of the other data,
wherein the interactive control is used for performing an interactive operation on a visualization result of the other data.
4. The method of claim 3, further comprising:
determining, according to the visualization attributes of the other data, a visualization graph form required for displaying the other data;
performing visualization processing on the other data according to the visualization graph form to obtain a data interface; and
displaying the data interface on the first screen, wherein the data interface comprises a visualization result of the other data in the visualization graph form.
5. The method of claim 4, wherein generating an interactive control in the interactive interface according to the visualization attributes of the other data comprises:
determining a corresponding target control type according to the visualization attributes of the other data; and
generating the interactive control in the interactive interface according to the target control type.
6. The method of claim 5, wherein determining the corresponding target control type according to the visualization attributes of the other data comprises:
if the visualization attributes of the other data comprise a time attribute, determining a control type supporting interaction by time as the target control type;
if the visualization attributes of the other data comprise a position attribute, determining a control type supporting interaction by position as the target control type; and
if the visualization attributes of the other data comprise a category attribute, determining a control type supporting interaction by category as the target control type.
7. The method of claim 6, wherein:
the control type supporting interaction by time comprises a timeline control that can be slid or clicked;
the control type supporting interaction by position comprises a map control that can be zoomed, rotated, or dragged; and
the control type supporting interaction by category comprises a selection control that can be clicked, touched, or long-pressed.
8. The method of claim 5, wherein generating the interactive control in the interactive interface according to the target control type comprises:
selecting, from interactive controls of the target control type, an interactive control matching the visualization graph form as the interactive control in the interactive interface.
9. The method of claim 4, wherein determining a visualization graph form required for displaying the other data according to the visualization attributes of the other data comprises:
if the visualization attributes of the other data comprise a time attribute, determining a graph form that can be displayed by time;
if the visualization attributes of the other data comprise a position attribute, determining a graph form that can be displayed by position; and
if the visualization attributes of the other data comprise a category attribute, determining a graph form that can be displayed by category.
10. The method of claim 4, further comprising:
in response to an interactive operation issued by the second screen for the interactive control, performing a corresponding operation on the data interface on the first screen.
11. The method according to any one of claims 1-10, wherein the interactive interface comprises an information area and an interactive control area, the summary information is located in the information area, and the interactive control is located in the interactive control area.
12. The method of any of claims 1-10, wherein a specification size of the first screen is larger than a specification size of the second screen.
13. The method according to claim 12, wherein the second screen is a screen of a terminal device that remotely controls the first screen; the first screen is an independent screen.
14. An interaction control method, comprising:
displaying an interactive interface on a second screen, the interactive interface comprising summary information and an interactive control; and
in response to an interactive operation issued through the interactive control, performing a corresponding operation on a data interface on a first screen,
wherein the summary information is generated according to dynamic data displayed in the data interface and is used for guiding a user to issue the interactive operation through the interactive control.
15. The method of claim 14, wherein the interactive control is generated based on visualization attributes of other data displayed in the data interface.
16. A display control method, comprising:
acquiring data displayed on a first screen;
generating an interactive control in an interactive interface according to the data; and
displaying the interactive interface on a second screen, so that a user can interactively control the first screen through the interactive control.
17. The method of claim 16, wherein generating an interactive control in an interactive interface according to the data comprises:
analyzing visualization attributes of the data; and
generating the interactive control in the interactive interface according to the visualization attributes of the data,
wherein the interactive control is used for performing an interactive operation on a visualization result of the data.
18. The method of claim 17, wherein generating the interactive control in the interactive interface according to the visualization attributes of the data comprises:
determining a corresponding target control type according to the visualization attributes of the data; and
generating the interactive control in the interactive interface according to the target control type.
19. The method of claim 18, wherein generating the interactive control in the interactive interface according to the target control type comprises:
selecting the interactive control in the interactive interface from interactive controls of the target control type according to at least one of: a visualization graph form used by the first screen for displaying the data, a type of the second screen, and a specification size of the second screen.
20. The method of claim 16, further comprising, before displaying the interactive interface on the second screen:
acquiring dynamic data in the data; and
generating summary information in the interactive interface according to the dynamic data, wherein the summary information enables the user to learn a change in the dynamic data on the first screen.
21. A display control system, comprising: a display control device, a first screen, and a second screen;
wherein the display control device is configured to: acquire data to be displayed on the first screen and visually display the data on the first screen, the data comprising dynamic data;
generate summary information in an interactive interface according to the dynamic data, the interactive interface further comprising an interactive control; and display the interactive interface on the second screen, wherein the summary information guides a user to perform interactive control over the first screen through the interactive control.
22. The system of claim 21, further comprising a terminal device for remotely controlling the first screen, wherein the second screen is a screen of the terminal device, and the first screen is an independent screen.
23. A display control system, comprising: a display control device, a first screen, and a second screen;
wherein the display control device is configured to: acquire data displayed on the first screen; generate an interactive control in an interactive interface according to the data; and display the interactive interface on the second screen, so that a user can interactively control the first screen through the interactive control.
24. A display control apparatus, comprising: a memory and a processor;
the memory being configured to store a computer program; and
the processor, coupled with the memory, being configured to execute the computer program to:
acquire data to be displayed on a first screen, the data comprising dynamic data;
generate summary information in an interactive interface according to the dynamic data, the interactive interface further comprising an interactive control; and
display the interactive interface on a second screen, wherein the summary information guides a user to perform interactive control over the first screen through the interactive control.
25. An interaction control apparatus, comprising: a memory and a processor;
the memory being configured to store a computer program; and
the processor, coupled with the memory, being configured to execute the computer program to:
display an interactive interface on a second screen, the interactive interface comprising summary information and an interactive control; and
in response to an interactive operation issued through the interactive control, perform a corresponding operation on a data interface on a first screen, wherein the summary information is generated according to dynamic data displayed in the data interface and is used for guiding a user to issue the interactive operation through the interactive control.
26. A display control apparatus, comprising: a memory and a processor;
the memory being configured to store a computer program; and
the processor, coupled with the memory, being configured to execute the computer program to:
acquire data displayed on a first screen;
generate an interactive control in an interactive interface according to the data; and
display the interactive interface on a second screen, so that a user can interactively control the first screen through the interactive control.
27. A computer-readable storage medium having a computer program stored thereon, which, when executed by a processor, causes the processor to carry out the steps of the method according to any one of claims 1-20.
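As a hedged illustration only (not part of the claims), the two-screen interaction loop of claims 10 and 14 above — an operation issued through the second screen's control triggering a corresponding operation on the first screen's data interface — might be sketched as a minimal dispatcher; all class names, operation names, and state keys are hypothetical:

```python
# Hypothetical sketch of the interaction loop in claims 10 and 14: the second
# screen issues an operation via its interactive control, and the first
# screen applies the corresponding operation to its data interface.

class FirstScreenDataInterface:
    """Stand-in for the first screen's visualized data interface."""
    def __init__(self):
        self.state = {"time_window": (0, 100), "zoom": 1.0, "category": "all"}

    def apply(self, operation, value):
        # Map control operations from the second screen to data-interface updates.
        if operation == "slide_timeline":
            self.state["time_window"] = value
        elif operation == "zoom_map":
            self.state["zoom"] = value
        elif operation == "select_category":
            self.state["category"] = value
        else:
            raise ValueError(f"unsupported operation: {operation}")
        return self.state

def on_second_screen_event(data_interface, operation, value):
    """Forward an interactive operation issued on the second screen's control
    to the first screen's data interface."""
    return data_interface.apply(operation, value)
```

For example, under these assumptions, `on_second_screen_event(iface, "slide_timeline", (20, 60))` would narrow the time window shown on the first screen.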
CN201911090098.3A 2019-11-08 2019-11-08 Display control and interaction control method, device, system and storage medium Pending CN112783398A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911090098.3A CN112783398A (en) 2019-11-08 2019-11-08 Display control and interaction control method, device, system and storage medium


Publications (1)

Publication Number Publication Date
CN112783398A true CN112783398A (en) 2021-05-11

Family

ID=75748556

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911090098.3A Pending CN112783398A (en) 2019-11-08 2019-11-08 Display control and interaction control method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN112783398A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113296655A (en) * 2021-06-10 2021-08-24 Hubei Ecarx Technology Co., Ltd. Control method, device and equipment for display screen
CN113453057A (en) * 2021-06-22 2021-09-28 Hisense Electronic Technology (Shenzhen) Co., Ltd. Display device and playing progress control method
CN113867869A (en) * 2021-09-29 2021-12-31 Shanghai Bilibili Technology Co., Ltd. Interactive data display method and device
CN115079911A (en) * 2022-06-14 2022-09-20 Beijing Zitiao Network Technology Co., Ltd. Data processing method, device, equipment and storage medium
CN115878010A (en) * 2023-03-01 2023-03-31 Southern University of Science and Technology Operation interaction method and device, electronic equipment and computer-readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140013519A (en) * 2012-07-24 2014-02-05 LG Electronics Inc. Mobile terminal having wide screen size and content displaying method thereof
CN104331243A (en) * 2014-10-22 2015-02-04 Integrated Electronic Systems Lab Co., Ltd. Mobile terminal and large screen display interaction control method based on thumbnail reconstruction
US20160092152A1 (en) * 2014-09-25 2016-03-31 Oracle International Corporation Extended screen experience
CN106095919A (en) * 2016-06-12 2016-11-09 Shanghai Jiao Tong University Data variation trend spring visualization system and method towards analysis of central issue
CN106445355A (en) * 2016-12-19 2017-02-22 Guangdong Vtron Technology Co., Ltd. Method and small screen device for showing contents of large screen
CN109062640A (en) * 2018-06-28 2018-12-21 Nubia Technology Co., Ltd. An interface activation method, terminal and computer-readable storage medium
CN110110002A (en) * 2019-05-13 2019-08-09 Jiangnan University Big data virtual interactive interface system


Similar Documents

Publication Publication Date Title
CN112783398A (en) Display control and interaction control method, device, system and storage medium
EP3091426B1 (en) User terminal device providing user interaction and method therefor
EP2487578B1 (en) Method and system for controlling screen of mobile terminal
CN108334371B (en) Method and device for editing object
US11758080B2 (en) DIY effects image modification
KR20220130197A (en) Filming method, apparatus, electronic equipment and storage medium
KR102280620B1 (en) Method for editing media and an electronic device thereof
US20180130503A1 (en) Interface apparatus and recording apparatus
US20230168805A1 (en) Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device
US10901612B2 (en) Alternate video summarization
US20140176600A1 (en) Text-enlargement display method
WO2014019207A1 (en) Widget processing method, device and mobile terminal
US11182073B2 (en) Selection on user interface based on cursor gestures
CN107526505B (en) Data processing method and electronic equipment
CN104951477B (en) Method and apparatus for crossing filter data
CN112995401A (en) Control display method, device, equipment and medium
AU2017330785A1 (en) Electronic apparatus and controlling method thereof
KR20140113826A (en) apparatus and method for common of contents, communication service system
KR20140072737A (en) Display apparatus and Method for providing user menu thereof
CN112784128A (en) Data processing and display method, device, system and storage medium
CN115460448A (en) Media resource editing method and device, electronic equipment and storage medium
CN114090896A (en) Information display method and device and electronic equipment
CN109190097B (en) Method and apparatus for outputting information
CN113360064A (en) Method and device for searching local area of picture, medium and electronic equipment
CN113592983A (en) Image processing method and device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination