WO2021219002A1 - Display device - Google Patents

Display device

Info

Publication number
WO2021219002A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
user
display
menu
global
Prior art date
Application number
PCT/CN2021/090538
Other languages
English (en)
Chinese (zh)
Inventor
华峰
王学磊
张凡文
薛梅
王楷
马志峰
牛峰
Original Assignee
海信视像科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202011016939.9A (external priority, CN112162809B)
Priority claimed from CN202011551199.9A (external priority, CN114296623A)
Application filed by 海信视像科技股份有限公司
Publication of WO2021219002A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This application relates to the technical field of display devices, and specifically to a display device.
  • the embodiment of the present application provides a display device that is used to centrally manage and control various function entries of a smart touch TV, so that users can quickly find the functions they want to use, and improve user experience.
  • a display device including:
  • a display including a touch screen
  • a controller connected to the display and the user interface, respectively, configured to perform:
  • in response to the first input instruction, the display is controlled to display a global menu control, where the first input instruction is input by the user touching the touch screen with a finger.
  • the controller is also used to execute:
  • in response to a second input instruction, the display is controlled to display the global menu control at the position where the user's finger stops moving, where the second input instruction is input by the user pressing the global menu control with a finger and moving on the touch screen.
  • the controller is configured to execute:
  • the position of the global menu control is synchronously updated according to the center point coordinates of all finger touch coordinates, and the third input instruction is input by the user by touching the touch screen with at least four fingers.
  • the controller is also used to execute:
  • the menu display area also includes at least one application or tool control;
  • the menu display area also includes a global return control.
  • the controller is also used to execute:
  • a display device including:
  • a display including a touch screen
  • a controller connected to the display and the user interface, respectively, configured to perform:
  • in response to a first input instruction, the display is controlled to display a global menu control and a global return control, where the first input instruction is input by the user touching the touch screen with a finger.
  • the controller is also used to execute:
  • in response to the second input instruction, the display is controlled to display the global menu control and the global return control at the position where the user's finger stops moving, where the second input instruction is input by the user pressing the global menu control or the global return control with a finger and moving on the touch screen.
  • the controller is further configured to execute: in response to a third input instruction, synchronously updating the positions of the global menu control and the global return control according to the center point coordinates of all finger touch coordinates, where the third input instruction is input by the user touching the touch screen with at least four fingers;
  • the synchronous update of the positions of the global menu control and the global return control according to the center point coordinates of all finger touch coordinates includes:
  • the global menu control and the global return control are displayed at positions corresponding to the coordinates of the center point.
  • the controller is also used to execute:
  • the global menu control and the global return control are displayed at positions corresponding to the display coordinates.
  • the controller is also used to execute:
  • the menu display area also includes at least one application or tool control;
  • adding an application or tool control to the menu display area, or deleting an application or tool control from the menu display area, specifically includes:
  • if the item coordinates are valid coordinates and the user's finger touch time exceeds the first preset time period, obtaining the background image of the application or tool control corresponding to the current item coordinates, and drawing a floating mirror image of the background image;
  • the display position of the application or tool control is updated, and the floating mirror image is hidden.
  • the present application provides a display device, including:
  • a display configured to display a user interface, the user interface including a global menu control and/or a global return control;
  • a controller configured to receive a first input instruction input by a user to select the global menu control
  • in response to the first input instruction, the user interface is controlled to display a menu display area, and the menu display area includes at least one application or tool control.
  • the controller is configured to:
  • the global menu control and/or the global return control are controlled to be displayed on the user interface.
  • the display includes a touch screen, and the second input instruction input by the user is input by the user touching the touch screen.
  • the controller is further configured to, in response to a third input instruction, control the display to display the global menu control and/or the global return control at a position where the user's finger stops moving, where the third input instruction is input by the user pressing the global menu control and/or the global return control with a finger and moving on the touch screen.
  • the controller is further configured to, in response to a fourth input instruction, synchronously update the position of the global menu control according to the center point coordinates of all finger touch coordinates, where the fourth input instruction is input by the user touching the touch screen with multiple fingers.
  • the controller is further configured to:
  • an application or tool control is added to the menu display area, or an application or tool control in the menu display area is deleted.
  • adding an application or tool control to the menu display area, or deleting an application or tool control in the menu display area specifically includes:
  • if the item coordinates are valid coordinates and the user's finger touch time exceeds the first preset time period, obtaining the background image of the application or tool control corresponding to the current item coordinates, and drawing a floating mirror image of the background image;
  • the display position of the application or tool control is updated, and the floating mirror image is hidden.
  • synchronously updating the positions of the global menu control and the global return control according to the center point coordinates of all finger touch coordinates includes:
  • the global menu control and the global return control are displayed at positions corresponding to the coordinates of the center point.
  • it further includes:
  • a rotating component configured to drive the display to rotate
  • the controller is further configured to control the rotation component to drive the display to rotate in response to an instruction to rotate the screen input by the user;
  • the global menu control and/or the global return control are displayed at the position corresponding to the display coordinates.
  • the menu display area further includes a mode selection control
  • the controller is further configured to, in response to a user input instruction triggering the mode selection control, control the display device to call up a mode selection interface, the mode selection interface including a prompt screen for instructing the user to select a mode;
  • in response to the user's selection of the corresponding mode control, the display is controlled to enter the corresponding mode.
  • the menu display area includes a plurality of controls corresponding to a plurality of modes
  • the controller is further configured to, in response to a user input triggering any one of the plurality of controls, control the display to enter the mode corresponding to the triggered control.
  • the menu display area further includes a favorite control
  • the controller is further configured to receive an instruction to select the favorite control input by the user;
  • the user can only see the favorite data in the mode in which the device is located.
  • the touch characteristics of the display device are fully utilized to make up for the limitations of remote control operation.
  • the global menu control and the global return control have also become the unified entrance to the various functions of the smart TV. Centralized management and control of each function allows users to quickly find the functions they want to use and enhance user experience.
  • FIG. 1A shows a usage scenario of a display device according to some embodiments
  • FIG. 1B shows a rear view of a display device according to some embodiments
  • FIG. 2 shows a block diagram of the hardware configuration of the control device 100 according to some embodiments
  • FIG. 3 shows a block diagram of the hardware configuration of the display device 200 according to some embodiments
  • FIG. 4 shows a software configuration diagram in the display device 200 according to some embodiments
  • FIG. 5 shows a schematic diagram of a user interface according to some embodiments
  • Figures 6-8 show schematic diagrams of a user interface according to some embodiments.
  • FIG. 9 shows a schematic diagram of a menu display area according to some embodiments.
  • Figure 13 shows a schematic diagram of a global menu control according to some embodiments.
  • Figures 14-16 show other schematic diagrams of a user interface according to some embodiments.
  • Figures 17-20 show schematic diagrams of another menu display area according to some embodiments.
  • Figures 21-22 show other schematic diagrams of a user interface according to some embodiments.
  • Figures 23-27 show schematic diagrams of a menu editing area according to some embodiments.
  • Fig. 28 is a schematic diagram of two modes according to an embodiment of the present application.
  • Fig. 29 is an interface diagram of an education mode according to an embodiment of the present application.
  • Figure 30 is a standard mode interface diagram according to an embodiment of the present application.
  • FIG. 31 is a schematic diagram of switching between two modes according to an embodiment of the present application.
  • Fig. 32 is an interface diagram of an education mode according to an embodiment of the present application.
  • FIG. 33 is a diagram of a topic details interface according to an embodiment of the present application.
  • FIG. 34 is a diagram of a my collection interface according to an embodiment of the present application.
  • Fig. 35 is an interface in the education mode according to an embodiment of the present application.
  • Fig. 1A is a schematic diagram of a usage scenario of a display device according to an embodiment.
  • the display device 200 also performs data communication with the server 400, and the user can operate the display device 200 through the smart device 300 or the control device 100.
  • the control device 100 may be a remote controller, and the communication between the remote controller and the display device includes at least one of infrared protocol communication or Bluetooth protocol communication, and other short-distance communication methods, and the display is controlled by wireless or wired methods.
  • the smart device 300 may include any one of a mobile terminal, a tablet computer, a computer, a notebook computer, and an AR/VR device.
  • the display device 200 may also be controlled in a manner other than the control device 100 and the smart device 300.
  • the display device 200 also performs data communication with the server 400.
  • the display device 200 may communicate via a local area network (LAN), a wireless local area network (WLAN), or other networks.
  • the server 400 may provide various contents and interactions to the display device 200.
  • the display device can be fixedly placed on a table or wall, or can be placed on a supporting bracket, which can achieve movements such as ascending/descending or rotating, and can also be used as an art display.
  • the display device 200 includes a controller 250, a display 275, a terminal interface 278 protruding from a gap in the back plate, and a rotating component 276 connected to the back plate.
  • the rotating component 276 can rotate the display 275.
  • the rotating component 276 can rotate the screen to the portrait state, that is, the state where the vertical side length of the screen is greater than the horizontal side length, or rotate the screen to the landscape state, that is, the state where the horizontal side length is greater than the vertical side length.
  • FIG. 2 shows a block diagram of the configuration of the control device 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply.
  • the control device 100 can receive an input operation instruction from the user and convert the operation instruction into an instruction that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200.
  • the communication interface 130 is used to communicate with the outside, and includes at least one of a WIFI chip, a Bluetooth module, an NFC module, or other alternative modules.
  • the user input/output interface 140 includes at least one of a microphone, a touch panel, a sensor, a button, or other alternative modules.
  • FIG. 3 shows a block diagram of the hardware configuration of the display device 200 according to an exemplary embodiment.
  • the display device 200 includes a tuner and demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface 280.
  • the controller includes a central processing unit, a video processor, an audio processor, a graphics processor, RAM, ROM, and the first interface to the nth interface for input/output.
  • the display 260 may be at least one of a liquid crystal display, an OLED display, a touch display, and a projection display, and may also be a projection device and a projection screen.
  • the tuner and demodulator 210 receives broadcast television signals through wired or wireless reception, and demodulates audio and video signals, as well as EPG data signals, from multiple wireless or wired broadcast television signals.
  • the detector 230 is used to collect signals from the external environment or interact with the outside.
  • the controller 250 and the tuner and demodulator 210 may be located in different separate devices, that is, the tuner and demodulator 210 may also be in an external device of the main device where the controller 250 is located, such as an external set-top box.
  • the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in the memory.
  • the controller 250 controls the overall operation of the display device 200.
  • the user may input a user command on a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the GUI.
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
  • the "user interface” is a medium interface for interaction and information exchange between an application or operating system and a user, and it realizes the conversion between the internal form of information and the form acceptable to the user.
  • the commonly used form of the user interface is the graphical user interface (GUI), which refers to a user interface related to computer operations that is displayed graphically. It can be an icon, window, control, or other interface element displayed on the display screen of an electronic device.
  • the control can include at least one of visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the system is divided into four layers, from top to bottom: the Applications layer (referred to as the "application layer"), the Application Framework layer (referred to as the "framework layer"), the Android runtime and system library layer (referred to as the "system runtime library layer"), and the kernel layer.
  • the kernel layer contains at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WIFI driver, USB driver, HDMI driver, sensor driver (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), and power driver.
  • the controller controls the display to display global menu controls and global return controls. In some embodiments, after the user touches the screen of the display with a finger, the controller controls the display to display only global menu controls. When the user clicks on the global menu control, the global return control is displayed. In some embodiments, the user invokes the global menu control and the global return control through specific keys on the remote control. In some embodiments, the user can use a specific gesture, for example, the user shows a gesture of spreading five fingers to the camera, and the controller controls the display to display global menu controls and/or global return controls.
  • the steps of the controller controlling the display to display global menu controls and global return controls include:
  • Step S51 Monitor the boot broadcast of the display device to pull up the SoftControlService (soft control service);
  • Step S52 Register InputFilter (input filter) monitoring. In some implementations, the display device pulls up the soft control service after booting, and by registering input filter monitoring it can obtain the gesture instructions or remote control instructions sent by the user.
  • Step S53 Receive the instruction input by the user through the user interface;
  • Step S54 Determine whether the instruction input by the user is a MotionEvent (touch event). After receiving the instruction input by the user, it is necessary to distinguish whether the input instruction is a MotionEvent (touch event) or a KeyEvent (key event). If it is a touch event, step S55 is executed.
  • Step S55 Perform algorithm matching processing on the touch event, and determine whether to display the global menu control and the global return control according to the processing result.
  • the algorithm matching processing includes pointCount (number of touch points), flinger distance (sliding distance), flinger vector (sliding vector), and so on.
  • the display is controlled to display global menu controls and global return controls.
  • when the number of touch points is 1, the display is controlled to display the global menu control and the global return control; when the number of touch points is 2, the Cartesian distance between the two touch points is calculated, and when the distance is greater than 100px, the display is controlled to display the global menu control and the global return control; when the number of touch points is 3, the Cartesian distance between the maximum and minimum coordinates of the three touch points is calculated, and when the distance is greater than 200px, the display is controlled to display the global menu control and the global return control.
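The touch-count and distance checks above can be sketched as follows. This is an illustrative Python sketch, not code from the patent (which targets an Android environment); the function names are hypothetical, while the 100px and 200px thresholds for two and three fingers follow the values quoted in the text.

```python
import math

def cartesian_distance(p1, p2):
    """Straight-line (Euclidean) distance between two (x, y) touch points, in pixels."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def should_show_controls(touch_points):
    """Decide whether to show the global menu and global return controls,
    given a list of (x, y) touch coordinates."""
    n = len(touch_points)
    if n == 1:
        return True
    if n == 2:
        # two fingers: distance between the touch points must exceed 100px
        return cartesian_distance(touch_points[0], touch_points[1]) > 100
    if n == 3:
        # three fingers: distance between the extreme (min, min) and
        # (max, max) coordinates must exceed 200px
        xs = [p[0] for p in touch_points]
        ys = [p[1] for p in touch_points]
        return cartesian_distance((min(xs), min(ys)), (max(xs), max(ys))) > 200
    # four or more fingers are handled by position designation instead
    return False
```

The distance thresholds filter out accidental multi-point touches where the fingers land almost on top of each other.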
  • in an ideal state, when two or three fingers touch the screen of the display at the same time, the global menu control and the global return control are displayed. In practice, however, after the user touches the screen, the fingers often reach it one after another, so the global menu control and the global return control are often displayed directly when the first finger touches the screen. In some embodiments, the global menu control and the global return control are displayed by default, and the custom menu display area is displayed after the global menu control is clicked. The global menu control and the global return control float above all applications and are visible in all scenes.
  • the global menu control and the global return control can be displayed.
  • the global menu control and the global return control are not displayed after the user's finger touches the screen of the display during the ATV search process; in some modes that must be completed by remote control operation, such as factory mode and boot navigation, they are likewise not displayed; when finger touches on the screen are used to complete drawing or text annotation, such as in whiteboard applications and annotation applications, the controls are also not displayed; and since the rotation animation shown during display rotation does not require them, the global menu control and the global return control are not displayed after the user's finger touches the screen while the display is rotating.
  • the user after turning on the display device, the user enters the startup homepage or enters an application through a remote control operation.
  • the display device provides a schematic diagram of the user interface to the display.
  • the user interface of FIG. 6 adds a global menu control 61 and a global return control 62 on the basis of the user interface of FIG. 7.
  • the user interface of FIG. 8 adds a global menu control 61 on the basis of the user interface of FIG. 7.
  • the menu display area 63 displays the global return control 62, as shown in FIG. 9.
  • the global menu control 61 and the global return control 62 can be suspended or embedded in the user interface of FIG. 7.
  • a control refers to a visual object displayed on a user interface of a display device to represent corresponding content such as icons, thumbnails, video clips, links, and so on.
  • the display form of the control is usually diversified.
  • the control may include text content and/or an image for displaying thumbnails related to the text content.
  • the control can be the text and/or icon of the application.
  • the global menu control 61 and the global return control 62 may be simplified graphics; as shown in FIG. 11, the global menu control 61 and the global return control 62 may be text; as shown in FIG. 12, the global menu control 61 and the global return control 62 may be simplified graphics and text; and as shown in FIG. 13, the global menu control 61 may be a character image.
  • the current user interface displays the global menu control and the global return control, but the user does not input any instruction within a second preset time period; the controller then controls the display to display the global menu control and the global return control in a semi-transparent state. If the user still does not input any instruction within a third preset time period, the display no longer displays the global menu control and the global return control.
  • if no user input is received within 5 seconds, the global menu control 61 and the global return control 62 become translucent, as shown in Figure 14; if no user input is received for another 5 seconds, the global menu control 61 and the global return control 62 automatically disappear, as in the user interface shown in Figure 7.
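The two-stage idle timeout described above (translucent after one preset period, hidden after another) can be modeled as a simple state lookup. This is a hypothetical Python sketch; the function name and the 5-second and 10-second defaults follow the example in the text and are assumptions, not the patent's implementation.

```python
def control_display_state(idle_seconds, translucent_after=5.0, hide_after=10.0):
    """Return the display state of the global controls for a given idle time.

    translucent_after and hide_after stand in for the second and third
    preset time periods mentioned in the text.
    """
    if idle_seconds < translucent_after:
        return "opaque"       # controls fully visible
    if idle_seconds < hide_after:
        return "translucent"  # no input for the second preset period
    return "hidden"           # no input for the third preset period either
```

Any user input would reset the idle timer back to zero, returning the controls to the opaque state.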
  • the user uses at least two fingers to rotate and slide on the screen or input an instruction to rotate the screen through a remote control, so as to realize the operation of controlling the rotation component to drive the display to rotate.
  • the rotation animation is displayed and the global menu controls and global return controls are not displayed.
  • the drag coordinates of the global menu control and the global return control are recorded in real time, which helps to determine the actual coordinates of the global menu control and the global return control before the rotation.
  • FIG. 6 when the user inputs a rotation instruction, the rotation component drives the display to rotate, and the rotation animation is displayed during the rotation. After the rotation is completed, the user interface as shown in FIG. 15 is displayed.
  • the drag coordinates of the global menu control and the global return control can be the coordinates of the center point of the global menu control and the global return control, or the coordinates of a certain point on the edge of the global menu control and the global return control, or Pre-set the coordinates of any point in the global menu control and the global return control.
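Restoring the controls after a rotation amounts to remapping the recorded drag coordinates into the rotated screen's coordinate system. The sketch below is an assumption-laden illustration: it assumes a single 90-degree clockwise rotation from a landscape (width x height) screen to a portrait (height x width) screen, which the patent does not specify, and the function name is hypothetical.

```python
def remap_after_rotation(x, y, width, height):
    """Remap a control's recorded drag coordinates (x, y) from a
    width x height screen to the height x width screen produced by an
    assumed 90-degree clockwise rotation."""
    # Under a clockwise rotation, the left edge becomes the top edge:
    # new x comes from the old y (flipped), new y comes from the old x.
    return height - y, x
```

For example, a control recorded at the top-left corner of a 1920x1080 landscape screen would reappear at the top-right edge of the 1080x1920 portrait screen.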
  • the state of the screen is the horizontal state
  • the user presses the global menu control or the global return control with a finger and moves on the touch screen to drive the global menu control and the global return control to move on the screen. Users can select the global menu control and global return control with their fingers, and drag the global menu control and global return control to any position on the screen.
  • it is possible to determine whether the user performs a click operation or a drag operation on the global menu control and the global return control by comparing the moving distance of the user's finger on the screen with the preset pixels. For example, when the moving distance of the user's finger on the screen is greater than 4px, it can be determined that the user is performing a drag operation and move the global menu control and global return control to the position where the finger stops.
  • the user can keep the finger on the screen in order to continue to move the global menu control and the global return control to the next required position.
  • the user can also take his finger off the screen, and the global menu control and global return control will no longer move.
  • if the moving distance of the user's finger on the screen is not greater than 4px, it can be judged that the user is clicking. If the user clicks the global menu control, the operation of displaying the menu display area is executed; if the user clicks the global return control, the instruction to return to the previous step is executed.
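The click-versus-drag decision described above can be sketched as a simple distance comparison. This Python sketch is illustrative only; the 4px threshold mirrors the example in the text, and the function name is hypothetical.

```python
import math

def classify_gesture(press_point, release_point, drag_threshold_px=4):
    """Classify a touch on a global control as a click or a drag by
    comparing the finger's travel distance against a pixel threshold."""
    moved = math.hypot(release_point[0] - press_point[0],
                       release_point[1] - press_point[1])
    # beyond the threshold: move the control to where the finger stopped;
    # within it: treat the touch as a click on the control
    return "drag" if moved > drag_threshold_px else "click"
```

A small nonzero threshold is needed because a finger rarely stays perfectly still between press and release, so a raw zero-movement test would misclassify most clicks as drags.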
  • the embodiment of the application makes full use of the touch characteristics of the display device and supports the global drag and drop function.
  • the global menu control and the global return control can be dragged to any position on the display screen, which is convenient for operation and improves user experience.
  • the user touches the touch screen with at least four fingers, so that the global menu control and the global return control are displayed at the center point coordinates of the user's fingers, realizing multi-finger position designation.
  • the step of specifying multiple positions includes: intercepting MotionEvent (touch event) based on the InputFilter (input filtering) mechanism; by intercepting the touch event, action_down (downward operation) and action_up (upward operation) can be obtained. Control events. Customize GestureDetectManager (gesture management class) based on touch events.
  • the custom gesture management class includes: record a complete action_down (downward operation) and action_up (upward operation) multi-finger touch events, and record the coordinates of each click in the form of an array, that is, the touch of each touch point Control coordinates; determine the maximum and minimum coordinates of all touch coordinates; calculate the Cartesian coordinate distance between the maximum and minimum coordinates; if the Cartesian coordinate distance is greater than the preset pixels, calculate the center point of all touch coordinates Coordinates; display the global menu control and global return control at the position corresponding to the center point coordinates.
  • the number of touch points is 5, and the preset pixel is 300px. If the Cartesian coordinate distance d of the above two points is not greater than 300, it is determined that it is not a valid five-finger operation; if the Cartesian coordinate distance d of the above two points is greater than 300, it can be determined that it is a valid five-finger operation.
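The five-finger validity check above can be sketched like this: a minimal Python model assuming the 300px preset and five touch points from the example, with all names hypothetical.

```python
# Hedged sketch: validate a multi-finger gesture and compute the point
# where the global controls should be displayed.
import math

PRESET_PX = 300   # preset pixel threshold from the example above
NUM_POINTS = 5    # number of touch points in the example

def gesture_center(points, threshold=PRESET_PX, required=NUM_POINTS):
    """Return the center of the touch points for a valid gesture, else None."""
    if len(points) != required:
        return None
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    # Cartesian distance between the maximum and minimum coordinates.
    d = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    if d <= threshold:
        return None  # fingers bunched too closely: not a valid five-finger gesture
    # Center point of all touch coordinates.
    return (sum(xs) / required, sum(ys) / required)
```

The spread check rejects accidental touches (e.g. a palm resting on the screen) while a deliberate five-finger touch naturally exceeds the threshold.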
  • the embodiment of the application makes full use of the touch characteristics of the display device, and can directly change the display position of the global menu control and the global return control on the display screen through five designated positions, without dragging the global menu control and the global return control, and the operation is convenient. Improve user experience.
  • the display displays a menu display area
  • the menu display area includes menu editing controls
  • the menu display area may also include at least one application or tool control. If the global return control is not displayed together with the global menu control, the menu display area may also include the global return control.
  • the global menu control 61 can be displayed with an expand arrow
  • the global return control 62 can be displayed with a return arrow; when the user clicks the global menu control 61, if no tool or application is currently added, as shown in FIG. 17, the menu display area 63 includes the add tool/application control 64 and the menu retract control 66.
  • when the user clicks the add tool/application control 64, the menu editing interface can be entered, as shown in FIG. 22;
  • when the user clicks the menu retract control 66, the menu display area 63 can be retracted, as shown in FIG. 10.
  • the menu display area 63 includes at least one application or tool control, a menu retracting control 66, and a menu editing control 65.
  • when the user clicks the menu editing control 65, the menu editing interface can be entered, as shown in FIG. 22.
  • the display controls in the menu display area 63 can be changed by sliding the finger left and right on the display screen.
  • the position of the menu retracting control 66 is always displayed in the menu display area 63; when the user's finger slides to the right, the menu display area 63 is as shown in FIG. 19.
  • a volume adjustment bar 68 is displayed in the menu display area 63, and the user can change the volume level by sliding a finger left and right on the display screen.
  • the menu display area 63 is automatically retracted without any operation by the user during the custom time.
  • the volume adjustment bar 68 is automatically retracted without any operation by the user during the custom time, as shown in FIG. 19.
  • the user can select the menu display area 63 with a finger and drag the menu display area 63 to any position on the screen.
  • the user can also display the menu display area at the center coordinate point of the fingers on the touch screen by multi-finger position designation.
  • the menu display area 63 can be moved to the left adaptively, so that the contents of the menu display area are fully displayed.
  • the display device provides a user interface to the display, and the menu editing area 71 is included in the user interface.
  • the menu editing area 71 includes a selected menu area 72, a to-be-selected menu area 73, a reset control 74, and a completion control 75.
  • the selected menu area 72 includes applications or tool controls that have been selected to be displayed in the menu display area 63
  • the to-be-selected menu area 73 includes applications or tool controls that have not been selected to be displayed in the menu display area.
  • when the user clicks the reset control 74, the applications or tool controls in the selected menu area 72 are cleared; when the user clicks the completion control 75, it returns to the user interface shown in FIG. 21.
  • the applications or tool controls displayed in the menu display area 63 are those that were in the selected menu area 72 before the completion control 75 was clicked.
  • the relatives and friends circle control 76 moves from the selected menu area 72 to the to-be-selected menu area 73.
  • the application or tool control may be displayed in the first position of the tool or application list.
  • the user can add applications or tool controls to the menu display area by dragging applications or tool controls with a finger in the menu editing area, or delete applications or tool controls in the menu display area.
  • the step of adding an application or tool control to the menu display area, or deleting an application or tool control in the menu display area, specifically includes: detecting whether an action_down event occurs; if an action_down event is detected, obtaining the item coordinates of the touch point through the pointToPosition function, and determining whether the item coordinates are valid.
  • the method for determining whether the item coordinates are valid is to determine whether there is an application or tool control at the item coordinates.
  • if the item coordinate does not fall on any application or tool control, the item coordinate is invalid; if the item coordinate is within the range of a certain application or tool control, the item coordinate is valid. If the item coordinate is invalid, the step of detecting whether an action_down event occurs is executed again; if the item coordinate is valid, a long-press determination message is sent. It is then determined whether the action_down event exceeds the first preset duration, that is, whether the user's finger touch duration exceeds the first preset duration; in some embodiments, the first preset duration is 800 milliseconds. If the action_down event exceeds the first preset duration, the background image of the application or tool control corresponding to the current item coordinate is obtained, and a floating mirror of the background image is drawn through the window manager (window management); the application or tool control is hidden. The action_move (moving operation) event is then detected and the mirror position updated, that is, the position of the floating mirror is updated according to the movement of the user's finger. Finally, it is detected whether an action_up event occurs; if an action_up event is detected,
  • the display position of the application or tool control can be updated to the position where the user's finger leaves the touch screen, or it can be updated to a preset position, such as the end of the selected application or tool control. If the action_up event is not detected, perform the steps of detecting the action_move event and updating the mirror position.
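The long-press drag flow above (action_down → long-press check → action_move mirror updates → action_up) can be modeled as a small state machine. This is a hedged Python sketch; the 800 ms duration comes from the embodiment above, and the Android event and window-manager APIs are replaced by plain method calls with invented names.

```python
# Hedged sketch of the drag flow: a long press on a valid item starts a
# drag, action_move updates the floating-mirror position, and action_up
# ends the drag at the finger's final position.
LONG_PRESS_MS = 800  # first preset duration from the embodiment above

class DragSession:
    def __init__(self):
        self.dragging = False
        self.mirror_pos = None  # stand-in for the floating mirror

    def on_down(self, item_valid, press_duration_ms):
        # Start dragging only if the touch landed on a control and was
        # held past the long-press duration.
        self.dragging = item_valid and press_duration_ms >= LONG_PRESS_MS
        return self.dragging

    def on_move(self, pos):
        if self.dragging:
            self.mirror_pos = pos  # follow the user's finger

    def on_up(self, pos):
        if not self.dragging:
            return None
        self.dragging = False
        return pos  # the control's updated display position
```

The real implementation would draw the mirror through the window manager and hide the original control; here those side effects are reduced to the position bookkeeping.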
  • the user can drag the application or tool control in the to-be-selected menu area 73 to the range of the selected menu area 72.
  • the application or tool control can be added to the end of the selected menu area 72.
  • in the menu editing area shown in FIG. 23, the user selects the relatives and friends circle control 76 in the to-be-selected menu area 73 and, by the gesture drag method, drags the relatives and friends circle control 76 from the to-be-selected menu area 73 to the selected menu area 72, as shown in FIG. 24.
  • the number of applications or tool controls in the selected menu area 72 reaches the preset number, if the user continues to add applications or tool controls, a prompt message "Up to 13 applications or tools can be added" pops up.
  • the user can drag an application or tool control in the to-be-selected menu area 73 to a designated position in the selected menu area 72. If the designated position is located between two selected applications or tool controls, the dragged application or tool control is displayed at the position of the next selected application or tool control, that control is moved to the next position, and the other applications or tool controls move back one position in turn. If the designated position is not occupied by an application or tool control, the dragged application or tool control can be directly added to the designated position or added to the end of the selected menu area 72.
  • the dragged application or tool control will be added to the specified position; and the original application or tool control at this position can be directly moved to the tool or application list in the menu area 73 to be selected.
  • as shown in FIG. 26, when the user drags the annotation control 78 in the to-be-selected menu area 73 to the position of the home page control 79 in the selected menu area 72, that position is already occupied by the home page control 79. As shown in FIG. 27, the annotation control 78 will be added to this position, and the original homepage control 79 at this position can be moved to the next position in sequence, with the other application or tool controls moving to the next position in turn. If the number of applications or tool controls in the selected menu area 72 reaches the preset number, the application or tool control at the end of the original menu can be moved to the to-be-selected menu area 73.
  • the application or tool control may be the first in the list of tools or applications in the to-be-selected menu area 73.
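The reordering rules above (insert at a designated position, shift later controls back, move an overflowing control to the front of the to-be-selected list) can be sketched with plain lists. Illustrative Python only; the limit of 13 comes from the prompt message above, and the function name is invented.

```python
MAX_SELECTED = 13  # preset number from the prompt message above

def insert_control(selected, candidates, control, index):
    """Move `control` from the candidate list into `selected` at `index`,
    shifting later controls back one position; on overflow, the control
    at the end of the menu moves to the front of the candidate list."""
    if control in candidates:
        candidates.remove(control)
    selected.insert(index, control)
    if len(selected) > MAX_SELECTED:
        candidates.insert(0, selected.pop())
    return selected, candidates
```

Dropping onto an occupied slot and dropping onto a free slot both reduce to the same list insertion; only the overflow case differs.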
  • the embodiments of the present application make full use of the touch characteristics of the display device, customize menu options, centrally control various functions, and improve user experience.
  • the touch characteristics of the display device are fully utilized to make up for the insufficient operation of the remote control.
  • the global menu control and the global return control have also become the unified entrance to the various functions of the smart TV. Centralized management and control of each function allows users to quickly find the functions they want to use and enhance user experience.
  • the display device can call up the mode selection interface through the global menu control, including a prompt screen for instructing the user to select the mode.
  • FIG. 28 is a schematic diagram of two modes according to an embodiment of the present application. As shown in FIG. 28, a designated prompt pattern is set on the background of the two controls “Standard Mode” and “Education Mode”, and prompt text is set below the two controls, such as “Please select the mode you want to use. After entering the homepage, you can still switch the TV mode at will”, so as to guide the user to complete the mode selection operation through graphics or text.
  • the focus flag can also be set on a mode control by default according to the operating conditions of the display device.
  • the focus flag can be set on the control corresponding to the education mode by default.
  • the user can control the focus mark to move to other controls according to the operation mode they want to enter.
  • the user uses the "left" button on the control device (remote control) to move the focus mark to the control corresponding to the standard mode.
  • the focus mark can present different display shapes according to different interface styles.
  • the focus mark can be a box that selects the control; it can also be a processing mark for highlighting or changing the color of the control, thereby indicating the position of the focus.
  • the mode selection interface presented by the display device may also change according to the change of the user's control action.
  • the prompt pattern and prompt text on the interface also change, that is, the description interface and description content corresponding to the control mode are converted.
  • the operating system of the display device can further jump to the interface related to the education mode.
  • a mode selection control may be provided in the menu display area under the global menu control, and the above process can be realized through the mode selection control.
  • FIG. 29 is an interface diagram of an education mode according to an embodiment of the present application.
  • the education mode homepage may include multiple display areas, for example, the Header area at the top of the homepage, the content area in the middle, and the switching area (channel Tab) at the bottom of the homepage.
  • the Header area is used to display functional controls, for example, “mode switch”, “voice”, “search”, “user”, “grade”, “points”, “wifi status”, “time”, etc. The user can realize the corresponding function by moving the focus mark to any functional control.
  • FIG. 30 is a standard mode interface diagram according to an embodiment of the present application.
  • the Header area can be set at the top of the entire display screen, including multiple functional controls, such as "search”, “member”, “Message”, “Terminal Connection Status”, “Network Connection Status”, “Weather”, “Time”, etc.
  • Each control can pop up its corresponding interface after being clicked. For example, when the user clicks the “message” control, a message interface can pop up, and various messages published by the server or pushed by applications can be displayed in the message interface to form a message list. Each message has one of two states: read or unread. When there are unread messages in the message list, a prompt pattern can be added to the “message” control.
  • the content area can occupy most of the area of the display screen, including the middle and bottom, so as to display resource cards, setting function cards and other content.
  • FIG. 31 is a schematic diagram of switching between two modes according to an embodiment of the present application.
  • the user can click the related application in the menu display area 63 displayed by the global menu control to switch the TV mode.
  • when the user clicks the “smart whiteboard” application while the current display device is in the standard mode, the “smart whiteboard” application will be opened, and the display device will also switch the current standard mode to the education mode; after exiting the “Smart Whiteboard” application, the display device remains in education mode.
  • the global menu may be displayed over all screen content after being called up, including a “mode switching” control that the user can click directly to switch. For example, in standard mode, the “mode switch” control position displays “education mode”, switching to education mode after clicking; in education mode, the “mode switch” control position displays “standard mode”, switching to standard mode after clicking. Refer to the previous introduction for calling up the global menu.
  • FIG. 32 is an interface diagram of an education mode according to an embodiment of the present application. As shown in FIG. 32, a global return control can also be set in the lower left corner of the education mode homepage. When the user clicks it, the device directly exits the education mode and enters the standard mode.
  • switching can also be achieved through an intelligent voice system.
  • the user can input voice commands, such as "education mode”, “switch to education mode”, “return to education homepage”, etc.
  • the display device can directly execute the switching command after receiving the above voice input, switching from the standard mode to the education mode.
  • the user can collect a number of different types of topics, for example: a certain actor's film and television Topic 1, a certain variety show highlights Topic 2, and a primary bridging course Topic 3, where Topic 1 includes 5 film and television drama media resources, Topic 2 includes 10 variety show media resources, and Topic 3 includes 4 education media resources.
  • the display device can set up a collection interface to display the user's favorite topics. However, when the user has many favorite topics, directly mixing them together makes the collection interface look messy, and finding a particular topic the user wants to watch often takes more time. To solve this problem, this application sets up multiple scene modes for displaying topics, so that users are not interfered with by other types of topics when searching for a topic.
  • a favorite control may also be set in the menu display area, and the favorite control is triggered to jump to the favorite interface.
  • the response method of the display device specifically includes: Step S100: Receive an instruction input by the user to start the favorite interface.
  • the first scene mode is the standard mode. Generally, the user uses this mode more often and may also collect more topics. In order to display comprehensive topic collection data, the display device can control the display to display the collection interface based on the collection data of all topic types, so that all topics collected by the currently logged-in user are displayed on the collection interface.
  • the display device may display the topics that the user favorites according to topic types, so as to facilitate the user to find topics.
  • Step S120: When the display device is in the second scene mode, control the display to display the favorite interface according to the favorite data of the preset topic type.
  • the second scene mode is an education mode. Normally, the user has a strong purpose of educating children when using this mode.
  • the display device may display only the collection data of the preset topic type, such as the education topic type, so that only the education topics collected by the currently logged-in user are displayed on the collection interface.
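The two scene modes above amount to a filter over the user's collection data. A hedged Python sketch; the mode names and the pair representation are assumptions for illustration, not from the source.

```python
def favorites_for_mode(favorites, mode, preset_types=('education',)):
    """favorites: list of (topic_name, topic_type) pairs.
    Standard mode shows every collected topic; the second scene mode
    (education) shows only the preset topic types."""
    if mode == 'standard':
        return list(favorites)  # first scene mode: all topic types
    # second scene mode: only the preset topic types (e.g. education)
    return [f for f in favorites if f[1] in preset_types]
```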
  • FIG. 33 is a diagram of a topic details interface according to an embodiment of the present application, and the display device can generate the topic details interface according to the detailed data of the topic.
  • the topic details interface is the details page of topic A.
  • the topic posters, topic descriptions, topic media resources and topic collection controls can be displayed.
  • the topic collection control can be located in the upper right corner of the topic details page.
  • the topic favorite control can display "Favorite", thereby prompting the user that the topic can be favorited.
  • the topic poster can fill the topic details interface; the topic introduction can include a text description of the topic; and the topic media assets can include multiple items, such as media asset A1, media asset A2, media asset A3, media asset A4, media asset A5, media asset A6, media asset A7 and media asset A8.
  • Receive the topic collection instruction input by the user on the topic details interface; in response to the topic collection instruction, obtain the topic type corresponding to the topic collection instruction, and send a collection request containing the topic type to the server, so that the server generates and stores topic collection data containing the topic type.
  • Receive an instruction to start the collection interface input by the user, and in response to the instruction, control the display to display the collection interface according to the collection data of all topic types.
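The collect-and-display flow above (send a collection request containing the topic type; the server stores it; the interface later reads the data back by type) might look like this. A hedged sketch only; the dict request shape and class names are invented for illustration.

```python
def make_collection_request(user_id, topic_id, topic_type):
    # Assumed request shape: the key point is that the topic type travels
    # with the collection request, so the server can store it per type.
    return {'user': user_id, 'topic': topic_id, 'type': topic_type}

class FavoriteStore:
    """Stand-in for the server-side storage of topic collection data."""
    def __init__(self):
        self.records = []

    def save(self, request):
        self.records.append(request)

    def by_type(self, topic_type):
        # Used by the collection interface to show one topic type's favorites.
        return [r for r in self.records if r['type'] == topic_type]
```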
  • Fig. 34 is a diagram of a my collection interface according to an embodiment of the application.
  • the collection interface can be a collection interface of all themes with a topic navigation bar.
  • the topic navigation bar can include multiple topic type controls, such as “Movie”, “Education”, “K song”, “Actor” and “Special topic”; each topic type control can, in response to being triggered, display the collection data of the corresponding topic type in the collection interface, so that users can find favorites by topic type.
  • the “topic” control can display all the collection data of the topic types on the collection interface, which is convenient for users to view all the collection topics.
  • Each topic type control is triggered in response to selection.
  • the selection method may include the user touching the topic type control on the touch screen of the display device, or the user selecting the topic type control through the direction selection key on the remote control. The user can obtain different types of thematic collection data by switching the selected thematic type controls.
  • the user's interaction with the educational touch-sensitive TV also includes the interaction when operating the topic, such as obtaining the detail page, collecting the topic, and interacting with the display device when searching for the topic in the favorite interface.
  • Step S201: Receive a topic detail instruction input by a user; in response to the topic detail instruction, obtain from the server the detail data of the topic corresponding to the topic detail instruction and the topic type of that topic; control the display to display the topic details interface according to the detail data, and store the topic type.
  • Step S202: Receive the topic collection instruction input by the user on the topic details interface; in response to the topic collection instruction, obtain the type of the topic corresponding to the topic collection instruction, and send a collection request containing the topic type to the server, so that the server generates and stores topic collection data containing the topic type.
  • Step S203: Receive an instruction to start the favorite interface input by the user, and in response to the instruction, control the display to display the favorite interface according to the favorite data of the preset topic type.
  • Fig. 35 is an interface in the education mode according to an embodiment of the application. Referring to Fig. 35, the collection interface of the education mode may be provided with a topic navigation bar.
  • the topic navigation bar may include two topic type controls, such as “education” and “topic”; these two topic type controls can display the collection data of the education topic type on the collection interface in response to being triggered.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed herein is a display device used to enable a user, by means of centralized management and control of the various function entrances of a smart touch television, to quickly find a function the user wants, thereby improving the user experience. The display device comprises: a display, the display comprising a touch screen; a user interface; and a controller, which is respectively connected to the display and the user interface, and is configured to perform the following steps: in response to a first input instruction, controlling the display to display a global menu control, the first input instruction being input by a user by touching the touch screen with a finger.
PCT/CN2021/090538 2020-04-30 2021-04-28 Dispositif d'affichage WO2021219002A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN202010364836 2020-04-30
CN202010364836.5 2020-04-30
CN202011016939.9 2020-09-24
CN202011016939.9A CN112162809B (zh) 2020-09-24 2020-09-24 显示设备及用户收藏显示方法
CN202011551199.9A CN114296623A (zh) 2020-12-24 2020-12-24 一种显示设备
CN202011551199.9 2020-12-24

Publications (1)

Publication Number Publication Date
WO2021219002A1 true WO2021219002A1 (fr) 2021-11-04

Family

ID=78331799

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/090538 WO2021219002A1 (fr) 2020-04-30 2021-04-28 Dispositif d'affichage

Country Status (1)

Country Link
WO (1) WO2021219002A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115022689A (zh) * 2022-05-25 2022-09-06 Vidaa国际控股(荷兰)公司 一种控制装置按键的配置方法及显示设备、控制装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102981768A (zh) * 2012-12-04 2013-03-20 中兴通讯股份有限公司 一种在触屏终端界面实现悬浮式全局按钮的方法及系统
US20140181650A1 (en) * 2012-10-11 2014-06-26 Victoria Isabella Polubinska Self-configuring user interface
CN105487752A (zh) * 2015-11-25 2016-04-13 魅族科技(中国)有限公司 一种应用控制方法及应用该方法的终端
CN105630304A (zh) * 2015-12-18 2016-06-01 北京奇虎科技有限公司 一种操作浏览器的方法及电子设备
CN107831989A (zh) * 2017-11-28 2018-03-23 维沃移动通信有限公司 一种应用程序参数调整方法及移动终端
CN108469965A (zh) * 2018-03-15 2018-08-31 维沃移动通信有限公司 一种应用程序的设置方法及移动终端
US20190324636A1 (en) * 2018-04-20 2019-10-24 Opera Software As Drag menu


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115022689A (zh) * 2022-05-25 2022-09-06 Vidaa国际控股(荷兰)公司 一种控制装置按键的配置方法及显示设备、控制装置
CN115022689B (zh) * 2022-05-25 2023-11-03 Vidaa国际控股(荷兰)公司 一种控制装置按键的配置方法及显示设备、控制装置

Similar Documents

Publication Publication Date Title
US11163425B2 (en) User terminal apparatus and management method of home network thereof
KR102569424B1 (ko) 지능형 인터랙티브 태블릿의 조작 방법, 저장 매체 및 관련 기기
US11592968B2 (en) User terminal apparatus and management method of home network thereof
US10564813B2 (en) User terminal apparatus and management method of home network thereof
JP5233708B2 (ja) 情報処理装置、情報処理方法およびプログラム
US8635544B2 (en) System and method for controlling function of a device
JP6328947B2 (ja) マルチタスキング運用のための画面表示方法及びこれをサポートする端末機
JP5398728B2 (ja) 情報処理装置、情報処理方法、記録媒体、及び集積回路
EP3336672B1 (fr) Procédé et appareil pour la fourniture d'une interface utilisateur graphique dans un terminal mobile
KR102107469B1 (ko) 사용자 단말 장치 및 이의 디스플레이 방법
US11604580B2 (en) Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device
JP2007293849A (ja) 機能アイコンの表示システム及び方法
RU2607272C2 (ru) Способ и устройство для обеспечения графического пользовательского интерфейса в мобильном терминале
WO2014193464A1 (fr) Manipulations de gestes pour configurer des réglages de système
WO2022033104A1 (fr) Procédé d'affichage de page et dispositif d'affichage
WO2021135354A1 (fr) Procédé et appareil de division d'écran sous des applications multiples, et dispositif électronique
US20230244378A1 (en) Split-screen display control method and apparatus, electronic device, and storage medium
JP2011215878A (ja) 端末装置、端末装置の制御方法、通信システム、制御プログラム、及び記録媒体
CN111901646A (zh) 一种显示设备及触控菜单显示方法
WO2021219002A1 (fr) Dispositif d'affichage
CN114115637A (zh) 显示设备及电子画板优化方法
CN114157889B (zh) 一种显示设备及触控协助交互方法
CN113721808A (zh) 一种控制方法及装置
WO2021197078A1 (fr) Procédé et dispositif d'affichage
CN114760513A (zh) 一种显示设备及光标定位方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21796553

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21796553

Country of ref document: EP

Kind code of ref document: A1