CN117806577A - Display apparatus and display apparatus control method - Google Patents

Display apparatus and display apparatus control method

Info

Publication number
CN117806577A
CN117806577A (Application CN202310774474.0A)
Authority
CN
China
Prior art keywords
user
display
control
target
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310774474.0A
Other languages
Chinese (zh)
Inventor
陆兴
凌崇森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Electronic Technology Shenzhen Co ltd
Original Assignee
Hisense Electronic Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Electronic Technology Shenzhen Co ltd filed Critical Hisense Electronic Technology Shenzhen Co ltd
Priority to CN202310774474.0A priority Critical patent/CN117806577A/en
Publication of CN117806577A publication Critical patent/CN117806577A/en
Pending legal-status Critical Current

Abstract

Some embodiments of the present application provide a display apparatus and a display apparatus control method. After the user instructs the display apparatus to be controlled with the eyes, the display apparatus controls the eye tracker to acquire the user's eye data, which includes gaze position information. Based on the gaze position information, the display apparatus determines the target function control selected by the user and executes the operation corresponding to that control. The user can thus control the display apparatus with the eyes alone, without continuously adjusting a focus cursor with a control apparatus, making operation simple and convenient.

Description

Display apparatus and display apparatus control method
Technical Field
The application relates to the technical field of display equipment, in particular to display equipment and a display equipment control method.
Background
The display device is a terminal device capable of outputting specific display pictures. With the rapid development of display devices, their functions have become increasingly rich and their performance increasingly powerful. They can realize bidirectional human-machine interaction and integrate various functions such as video, entertainment, and data, so as to meet users' diversified and personalized needs.
The user may control the display device to perform various functions, such as playing media assets or playing games. The user can control the display device with a control apparatus matched to it. The display device may display a user interface that includes a plurality of function controls, such as media asset controls. The user can use the control apparatus to move a focus cursor in the display to select a control and execute the corresponding operation, so that the display device plays the media asset corresponding to that control. The user can also control the display device with a terminal device: after the terminal device and the display device establish a communication connection, the user can adjust the focus cursor with the terminal device to select a function control.
However, the user interface shown in the display may include many function controls. If the user controls the display device with the control apparatus or the terminal device, the user may need to continuously adjust the position of the focus cursor to select the desired control, which is cumbersome and affects the user experience.
Disclosure of Invention
The present application provides a display device and a display device control method to solve the problem in the related art that, when a user controls the display device with a control apparatus or a terminal device, the desired control can only be selected by continuously adjusting the position of the focus cursor, which makes operation cumbersome and affects the user experience.
In a first aspect, some embodiments of the present application provide a display device including a display, an eye tracker, and a controller. The display is configured to display a user interface that includes a plurality of function controls; the eye tracker is configured to acquire eye data of a user; and the controller is configured to:
control the eye tracker to acquire the user's eye data in response to an instruction to control the display device with the eyes, where the eye data includes gaze position information representing the position at which the user's line of sight maps onto the display;
acquire the target function control selected by the user based on the gaze position information, where the target function control is a function control in a display area determined according to the gaze position information, and the display area is a portion of display content obtained by dividing the user interface; and
execute the operation corresponding to the target function control.
In a second aspect, some embodiments of the present application provide a display device control method, applied to a display device, the method including:
controlling an eye tracker to acquire the user's eye data in response to an instruction to control the display device with the eyes, where the eye data includes gaze position information representing the position at which the user's line of sight maps onto the display;
acquiring the target function control selected by the user based on the gaze position information, where the target function control is a function control in a display area determined according to the gaze position information, and the display area is a portion of display content obtained by dividing the user interface; and
executing the operation corresponding to the target function control.
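The division of the user interface into display areas and the mapping from gaze position to target control can be illustrated with a small sketch. The grid layout, screen resolution, and control names below are all invented for illustration; the patent does not fix any of these details.

```python
# Hypothetical sketch of the claimed flow: the user interface is divided
# into display areas (here a simple uniform grid), the gaze position is
# mapped to one area, and the function control in that area is selected.

def area_for_gaze(x, y, screen_w, screen_h, rows, cols):
    """Return the (row, col) display area containing gaze point (x, y)."""
    col = min(int(x / (screen_w / cols)), cols - 1)
    row = min(int(y / (screen_h / rows)), rows - 1)
    return row, col

def target_control(gaze, controls, screen_w=1920, screen_h=1080, rows=2, cols=3):
    """controls maps a (row, col) area to the function control shown there."""
    area = area_for_gaze(gaze[0], gaze[1], screen_w, screen_h, rows, cols)
    return controls.get(area)

controls = {(0, 0): "watch record", (0, 1): "my favorite", (0, 2): "my application"}
print(target_control((700, 200), controls))  # gaze falls in area (0, 1): my favorite
```

Dividing the interface into areas first, rather than hit-testing each control directly, tolerates gaze-estimation noise: the gaze point only has to land in the right region, not on the exact pixels of a control.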
According to the above technical solutions, some embodiments of the present application provide a display device and a display device control method. After the user instructs the display device to be controlled with the eyes, the display device controls the eye tracker to acquire the user's eye data, which includes gaze position information. Based on the gaze position information, the display device determines the target function control selected by the user and executes the corresponding operation. The user can control the display device with the eyes and no longer needs to continuously adjust the focus cursor with a control apparatus, which makes operation simple and convenient and improves the user experience.
Drawings
In order to illustrate the technical solutions of the present application more clearly, the drawings needed in the embodiments are briefly described below. It will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
fig. 2 shows a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
fig. 3 illustrates a hardware configuration block diagram of a display device 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram in a display device 200 according to some embodiments;
FIG. 5 illustrates a schematic diagram of a user interface in some embodiments;
FIG. 6 illustrates a schematic diagram of an application panel in some embodiments;
FIG. 7 illustrates a schematic diagram of a terminal device connection interface in some embodiments;
FIG. 8 illustrates a schematic diagram of a control interface of a terminal device in some embodiments;
FIG. 9 illustrates a schematic diagram of a control interface of a terminal device in some embodiments;
FIG. 10 is a schematic diagram of an authentication mode set in a display device in some embodiments;
FIG. 11 is a diagram of eye control mode confirmation information in some embodiments;
FIG. 12 illustrates an interactive flow diagram for components of a display device in some embodiments;
FIG. 13 illustrates a schematic diagram of a user interface in some embodiments;
FIG. 14 illustrates a schematic diagram of a region-dividing profile in some embodiments;
FIG. 15 illustrates a schematic diagram of a zone number setting interface in some embodiments;
FIG. 16 shows a schematic diagram of display 260 displaying a target user interface in some embodiments;
FIG. 17 illustrates a schematic diagram of a functionality control interface in some embodiments.
Detailed Description
For purposes of clarity and enablement of the present application, exemplary implementations of the present application are described below clearly and completely with reference to the accompanying drawings, in which those exemplary implementations are illustrated. It is apparent that the described exemplary implementations are only some, but not all, of the examples of the present application.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," "second," "third," and the like in the description, the claims, and the above figures are used to distinguish between similar objects or entities and do not necessarily limit a particular order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The display device provided in the embodiment of the application may have various implementation forms, for example, may be a television, an intelligent television, a laser projection device, a display (monitor), an electronic whiteboard (electronic bulletin board), an electronic desktop (electronic table), and the like. Fig. 1 and 2 are specific embodiments of a display device of the present application.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication modes, and the display device 200 is controlled wirelessly or by wire. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
In some embodiments, a smart device 300 (e.g., mobile terminal, tablet, computer, notebook, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device.
In some embodiments, the display device may receive instructions without using the smart device or control apparatus described above, and may instead be controlled by the user's touch, gestures, or the like.
In some embodiments, the display device 200 may also be controlled in manners other than through the control apparatus 100 and the smart device 300. For example, the user's voice commands may be received directly through a module for acquiring voice commands configured inside the display device 200, or through a voice control device configured outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively coupled through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a configuration block diagram of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive the user's input operation instruction, convert the operation instruction into an instruction that the display device 200 can recognize and respond to, and thus serve as an intermediary for interaction between the user and the display device 200.
As shown in fig. 3, the display apparatus 200 includes at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first through nth interfaces for input/output.
The display 260 includes a display screen component for presenting pictures and a driving component for driving image display. It receives image signals output by the controller and displays video content, image content, menu manipulation interfaces, and user manipulation UI interfaces.
The display 260 may be a liquid crystal display, an OLED display, a projection device, or a projection screen.
The communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example, the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, another network communication protocol chip or near field communication protocol chip, and an infrared receiver. The display device 200 may send and receive control signals and data signals with the external control apparatus 100 or the server 400 through the communicator 220.
A user interface, which may be used to receive control signals from the control device 100 (e.g., an infrared remote control, etc.).
The detector 230 is used to collect signals of the external environment or of interaction with the outside. For example, the detector 230 may include a light receiver, a sensor for capturing the intensity of ambient light; an image collector such as a camera, which may be used to collect external environment scenes, user attributes, or user interaction gestures; or a sound collector such as a microphone, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, or the like. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
The modem 210 receives broadcast television signals by wired or wireless reception and demodulates audio/video signals and EPG data signals from among a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like.
The controller 250 controls the operation of the display device and responds to the user's operations through various software control programs stored on the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command to select a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first through nth interfaces for input/output, a communication bus, and the like.
The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
A "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form for user interfaces is the graphical user interface (Graphical User Interface, GUI), which refers to a graphically displayed user interface associated with computer operations. It may be an interface element such as an icon, a window, or a control displayed in the display screen of an electronic device, where controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and Widgets.
As shown in fig. 4, the system of the display device is divided into three layers, an application layer, a middleware layer, and a hardware layer, from top to bottom.
The application layer mainly comprises common applications on the television and an application framework. The common applications are mainly applications developed based on a browser, such as HTML5 apps, and native applications (native apps).
An application framework is a complete program model with all the basic functions required by standard application software, such as file access and data exchange, together with the interfaces for using these functions (toolbars, status bars, menus, dialog boxes).
Native applications may support online or offline use, message pushing, or local resource access.
The middleware layer includes middleware such as various television protocols, multimedia protocols, and system components. The middleware can use basic services (functions) provided by the system software to connect various parts of the application system or different applications on the network, so that the purposes of resource sharing and function sharing can be achieved.
The hardware layer mainly comprises the HAL interface, hardware, and drivers. The HAL interface is a unified interface with which all television chips interface, with the specific logic implemented by each chip. The drivers mainly include: audio drivers, display drivers, Bluetooth drivers, camera drivers, Wi-Fi drivers, USB drivers, HDMI drivers, sensor drivers (e.g., fingerprint sensor, temperature sensor, pressure sensor), power supply drivers, and the like.
The display device 200 may have various functions such as browsing web pages, playing media assets, entertainment games, projecting screens, etc., thereby providing a wide variety of services to users. The user may control the display device 200 to launch a related application program to thereby launch a corresponding function.
In some embodiments, the controller 250 may control the display 260 to display a user interface when the user controls the display device 200 to power on. The user interface may be a specific target image, for example, various media materials obtained from a network signal source, including video, pictures and the like. The user interface may also be some UI interface of the display device 200, such as a system recommendation page or the like.
FIG. 5 illustrates a schematic diagram of a user interface in some embodiments. The user interface includes a first navigation bar 500, a second navigation bar 510, a function bar 520, and a content display area 530. The function bar 520 includes a plurality of function controls such as "watch record", "my favorite", and "my application". The content displayed in the content display area 530 changes as the selected controls in the first navigation bar 500 and the second navigation bar 510 change. The user interface includes a plurality of function controls; the user can select a control to make the display device execute the corresponding operation, thereby realizing the corresponding function.
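The dependence of the content display area on the currently selected navigation controls can be sketched as a simple lookup; the content mapping below is invented for illustration and is not taken from the patent.

```python
# Toy sketch: the content display area shows different content depending on
# which controls are selected in the first and second navigation bars.
CONTENT = {
    ("recommend", "movies"): ["Movie A", "Movie B"],
    ("recommend", "series"): ["Series C"],
}

def content_for(first_nav, second_nav):
    """Return the content list for the selected navigation controls."""
    return CONTENT.get((first_nav, second_nav), [])

print(content_for("recommend", "movies"))  # ['Movie A', 'Movie B']
```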
A "my application" control may be included in the user interface. The user may select the control using a control apparatus, such as a remote controller, to cause the display device to perform the operation corresponding to the control. The user can use the control apparatus to move the focus cursor onto the control in the display and confirm, thereby inputting a display instruction for the application panel page to trigger entry into the corresponding application panel. It should be noted that the user may also input the selection operation of the function control in other manners to trigger entry into the application panel, for example using a voice control function, a search function, or the like.
Through the application panel, the user can view the applications that the display device 200 has installed, i.e., the functions supported by the display device 200. The user may select one of the applications and open it to use its functionality. It should be noted that an application installed in the display device 200 may be a system application or a third-party application. Fig. 6 illustrates a schematic diagram of an application panel in some embodiments. As shown in fig. 6, the application panel includes three controls: "web media", "cable tv", and "video chat". The user can click the "web media" control to make the display device open the player application and search for media assets. The user may click the "cable tv" control to view media channels, including various media programs offered by the cable provider, using the display device. The user may click the "video chat" control to conduct video chat using the display device.
In some embodiments, a user may control the display device 200 not only with a remote controller but also with a terminal device, such as a mobile phone. The user can establish a communication connection between the terminal device and the display device 200, thereby achieving information interaction between the two.
The terminal device may transmit a communication connection request to the display device 200 to cause the terminal device and the display device 200 to make a communication connection. After the terminal device and the display device 200 are in communication connection, communication interaction can be performed, for example, a user can control the terminal device to download some media resources from the display device 200, or upload media resources in the terminal device to the display device 200 for playing. The user may also send some control instructions to the display device 200 by using the terminal device to control the display device 200 to implement a corresponding function, and at this time, the function of the remote controller may be implemented by using the terminal device.
The display device 200 may display a terminal device connection interface so that the terminal device can send a communication connection request to the display device 200 according to that interface. Fig. 7 illustrates a schematic diagram of a terminal device connection interface in some embodiments. As shown in fig. 7, the terminal device connection interface includes communication connection prompt information and a communication connection control. The prompt information is used to prompt the user to control the display device 200 with the terminal device, and may read "Scan the code to connect to the display device 200 and control the display device 200 with the terminal device". The communication connection control may be a QR code (two-dimensional code) for establishing the communication connection, and the user can control the terminal device to scan it to establish the connection.
After scanning the communication connection control, the terminal device may send a communication connection request to the display device 200. In response to the communication connection request sent by the terminal device, the controller 250 may establish a communication connection between the display device 200 and the terminal device for data interaction. The user may send control instructions, such as instructions to adjust the focus cursor, to the display device 200 using the terminal device. The controller 250 receives the control instruction sent by the terminal device and can execute it.
The terminal device may display a control interface of the display device 200. The control interface of the display device 200 includes a plurality of operation controls, and each operation control may correspond to a control instruction for controlling the display device 200. When the user clicks a certain operation control, the terminal device may send a control instruction corresponding to the operation control to the display device 200. Fig. 8 shows a schematic diagram of a control interface of the display device 200 in the terminal device in some embodiments.
The control interface includes the following operation controls: a power control 801, a program source control 802, a return control 803, a main interface control 804, a keyboard toggle control 805, a signal source control 806, an image recognition control 807, and a volume setting control 808. The power control 801 is used to switch the display device 200 on and off. The program source control 802 is a numeric keypad used to adjust the media channel the display device 200 is currently playing: the user can press a specific numeric key to switch to the corresponding channel, or switch channels back and forth in channel order. The return control 803 is used to return to the user's last operation. The main interface control 804 is used to return to the main interface of the display device 200. The keyboard toggle control 805 is used to toggle the displayed keyboard between a numeric keyboard and a directional keyboard. The signal source control 806 is used to display the list of signal sources of the display device 200, including HDMI, USB, and ATV, for the user to select. The image recognition control 807 is used to recognize images displayed by the display device 200. The volume setting control 808 is used to adjust the volume of the display device 200 to a preset volume value, which may be 50. Under the current control interface, when the user clicks the keyboard toggle control 805, the numeric keyboard in the control interface is switched to a directional keyboard.
Fig. 9 shows a schematic diagram of a control interface of a terminal device in some embodiments. As shown in fig. 9, focus control 809 is a directional keypad with which a user can adjust the position of a focus cursor in display device 200. The user may select a function control in the user interface of the display device 200 by controlling the movement of the focus cursor up, down, left, and right.
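A rough sketch of how a terminal-side control interface like the one described above might dispatch instructions to the display device follows. The control names and instruction codes are invented for this sketch; the patent does not specify an instruction format.

```python
# Illustrative mapping from operation controls on the terminal's control
# interface to control instructions sent to the display device.
INSTRUCTIONS = {
    "power": "KEY_POWER",
    "return": "KEY_BACK",
    "home": "KEY_HOME",
    "up": "KEY_UP", "down": "KEY_DOWN", "left": "KEY_LEFT", "right": "KEY_RIGHT",
}

def on_control_clicked(name, send):
    """Look up the instruction for the clicked control and send it."""
    code = INSTRUCTIONS.get(name)
    if code is None:
        raise ValueError(f"unknown control: {name}")
    send(code)

sent = []                       # stand-in for the communication channel
on_control_clicked("up", sent.append)
print(sent)  # ['KEY_UP']
```

The directional keypad in Fig. 9 would simply map its four keys to the four direction instructions, which the display device translates into focus-cursor movements.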
In some embodiments, when the display device 200 is in the communication mode, an authentication mode may further be set for security. When the authentication mode is off, the display device 200 does not authenticate the terminal device, and the terminal device may connect directly to the display device 200. That is, when the terminal device sends a communication connection request to the display device 200, the display device 200 may establish the communication connection without verifying the request.
When the authentication mode is on, the display device 200 authenticates the terminal device. That is, when the display device 200 receives a communication connection request sent by the terminal device, it verifies the request, and only when verification passes does it allow the communication connection and connect with the terminal device.
Fig. 10 shows a schematic diagram of setting an authentication mode in the display device 200 in some embodiments. As shown in fig. 10, when the user selects the on communication mode, the on authentication mode or the off authentication mode may be further selected.
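The authentication-mode behaviour described above can be sketched as follows. The verification-code scheme is invented for illustration; the patent does not say how a request is verified.

```python
# Sketch of the authentication mode: with the mode off, any connection
# request is accepted; with it on, the request must pass verification
# (here modelled as matching a verification code).
def handle_connection_request(request, auth_mode_on, expected_code="123456"):
    """Return True if the communication connection should be established."""
    if not auth_mode_on:
        return True  # authentication mode off: connect directly
    return request.get("code") == expected_code

print(handle_connection_request({"device": "phone"}, auth_mode_on=False))      # True
print(handle_connection_request({"device": "phone", "code": "000000"}, True))  # False
```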
It should be noted that, in the process of controlling the display device 200 with the control apparatus or the terminal device, the user interface displayed by the display device 200 may include many function controls. If the user wants to move the focus cursor onto a certain function control, the cursor has to be moved to the target control step by step, cell by cell. This process may require continuous adjustment of the focus cursor position, which is cumbersome and affects the user experience.
To solve the above problem, the display device 200 provided in the embodiments of the present application may support control by eye movement, and the display device 200 may provide an eye control mode. When the display device is in the eye control mode, the user can control it with the eyes. The controller 250 may determine where in the user interface the user's eyes are gazing, thereby determining the function control the user is looking at and executing the operation corresponding to that control. This avoids the situation in which the user can select a function control only by continuously adjusting the focus cursor; operation is convenient and the user experience is improved.
In some embodiments, the user may send the eye control mode instruction to the display device 200 by operating a designated key of the remote controller. In practice, the correspondence between the eye control mode instruction and the remote control key is bound in advance. For example, an eye control mode key is set on the remote controller; when the user presses this key, the remote controller sends the eye control mode instruction to the controller 250, and the controller 250 then controls the display device 200 to enter the eye control mode.
An eye control mode option may also be set in the system UI interface of the display apparatus 200, and when the user clicks the option, the display apparatus 200 may be controlled to enter or exit the eye control mode.
In some embodiments, to prevent the user from erroneously triggering the eyeball control mode, when the controller 250 receives the eyeball control mode instruction, the display 260 may be controlled to display eyeball control mode confirmation information, so that the user performs a secondary confirmation of whether to control the display apparatus 200 to enter the eyeball control mode. Fig. 11 is a schematic diagram showing eyeball control mode confirmation information in some embodiments.
FIG. 12 shows an interactive flow diagram of components of display device 200 in some embodiments, including the steps of:
S101, controlling the eyeball tracker to acquire eyeball data of a user in response to an instruction for controlling a display device by utilizing the eyeball; the eyeball data comprises sight line position information which is used for representing position information of mapping the sight line of a user into the display;
S102, acquiring a target function control selected by the user based on the sight line position information; the target function control is a function control in a display area determined according to the sight line position information, and the display area is display content obtained by dividing the user interface;
S103, executing the operation corresponding to the target function control.
In some embodiments, a user may input a power-on instruction to the display device 200 to control the display device 200 to be powered on. In response to the power-on instruction, the controller 250 may turn on the display device 200 and control the display 260 to display a user interface. The user interface includes a plurality of functional controls, and the user selects one of the functional controls to cause the controller 250 to perform an operation corresponding to the functional control. In the embodiment of the application, the function control selected by the user is called a target function control.
FIG. 13 illustrates a schematic diagram of a user interface in some embodiments. As shown in fig. 13, the user interface is an interface for enabling the user to select a media channel, and includes a plurality of media channel controls corresponding to media channel A through media channel L, respectively. The user may select one of the media channel controls, such as the control for media channel F; the controller 250 may then cause the display device 200 to play the media program corresponding to media channel F.
In the eye control mode, the user may utilize the eye to look at a certain functionality control, i.e., a target functionality control, in the user interface. The controller 250 may acquire the target function control selected by the user and perform an operation corresponding to the target function control.
In some embodiments, to determine where the user's eyes are looking in the user interface, the controller 250 may obtain data generated as the user gazes at the user interface, referred to in the embodiments of the present application as eyeball data. The controller 250 may determine the target functionality control selected by the user based on the eyeball data of the user.
The display device 200 is provided with an external device interface 240 for connecting an external device, so that the display device 200 can be connected to an external device such as a human eye data acquisition device, for example an eye tracker. An eye tracker is a device capable of tracking the movement of the human eye; by recording the movement track and gaze point of the eyes, a person's visual behavior can be analyzed, for example the way the user gazes at the user interface. The eye tracker includes an infrared camera and a processor. The infrared camera can acquire a sequence of images of the user's eye region. The processor may process the image sequence of the eye region based on a gaze tracking technique (eye tracking / gaze tracking), for example detecting key points of the eyes, so as to track the user's gaze and thereby obtain eyeball data while the user gazes at the user interface. The eyeball data may include line-of-sight position information, human eye gaze duration, human eye gaze actions, and the like. The line-of-sight position information characterizes the position to which the user's gaze is mapped in the display, which may be coordinate information. The region the user gazes at can be represented by a focus cursor corresponding to the eyeball, and the line-of-sight position information is the coordinate of the eyeball focus cursor in the display interface. The human eye gaze duration characterizes how long the user gazes at a certain area, for example how long the user continues to gaze at a certain functionality control. The human eye gaze action characterizes an eye movement of the user, such as a blink.
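The eyeball data fields described above (line-of-sight position, gaze duration, gaze action) can be sketched as a simple record. This is a minimal illustration only; the field names are assumptions and do not come from any actual eye tracker API.

```python
from dataclasses import dataclass

# Hypothetical container for one eye-tracker sample, mirroring the three
# kinds of eyeball data described above. Field names are illustrative.
@dataclass
class EyeballData:
    gaze_x: float              # line-of-sight position mapped into the display
    gaze_y: float
    gaze_duration_ms: int = 0  # how long the user has fixated the current area
    gaze_action: str = ""      # e.g. "blink_left" or "blink_both"
```

A sample with no fixation yet simply carries the gaze coordinates with the duration and action left at their defaults.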
It should be noted that, the eye tracker may be an external device, and may be externally connected to the display device 200 through the external device interface 240, or may be a detector built into the display device 200.
For an eye tracker built in the display device 200, a lifting function can be supported. The eyeball tracker can be arranged on the lifting mechanism, and when eyeball data acquisition is needed, the lifting mechanism is controlled to move through a specific lifting instruction, so that the eyeball tracker is driven to rise to acquire eyeball data of a user. When the eyeball data is not required to be collected, the lifting mechanism can be controlled to move through a specific lifting instruction, so that the eyeball tracker is driven to be lowered, and the eyeball tracker is hidden.
For an eye tracker external to the display apparatus 200, the eye tracker may be connected to the external device interface 240 of the display apparatus 200. The external eye tracker may be a separate peripheral device connected to the display device 200 via a specific data interface. The display device 200 and the eye tracker are provided with data interfaces of the same specification and function, which may be a High-Definition Multimedia Interface (HDMI). Through the HDMI interface, the display device 200 can receive the user images collected by the eye tracker.
Other means of connection may also be supported for an eye tracker external to the display device 200, such as DVI (Digital Visual Interface), VGA (Video Graphics Array), or USB (Universal Serial Bus); wireless connections, such as wireless local area network, Bluetooth, or infrared, may also be used.
To facilitate the collection of the user's eye data, in some embodiments, an eye tracker external to the display device 200 may be positioned near the display device 200, such as by clamping the eye tracker on top of the display device 200 by a clamping device, or by placing the eye tracker on a table near the display device 200.
It should be noted that, whether the eye tracker is built in the display device 200 or externally connected to the display device 200, the user may start the eye tracker to collect the eye data through a specific interaction instruction or an application program control during the process of using the display device 200, and perform corresponding processing on the collected eye data according to different needs.
In some embodiments, in response to an instruction to control the display device 200 with an eyeball, the controller 250 may control the eye tracker to acquire eyeball data of the user. The eyeball data may include first eyeball data, which is eyeball data of a user while looking at the user interface. Wherein the first eye ball data comprises first gaze location information characterizing a mapping of a user gaze into location information in the user interface. The first gaze location information may be coordinates at which the user gazes at the user interface.
The controller 250 may obtain the user-selected target functionality control based on the first gaze location information.
In some embodiments, the controller 250 may obtain the corresponding functionality control at the first gaze location information. In the user interface, each function control occupies a certain size of user interface area, and each user interface area corresponds to one piece of position information and is used for representing coordinate information of the function control in the user interface. Therefore, the controller 250 can directly obtain the corresponding function control at the first sight line position information, that is, the target function control selected by the user can be obtained.
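As a minimal sketch of the direct lookup described above — assuming each functionality control occupies a known axis-aligned rectangle in the user interface — hit-testing the gaze coordinate against the control layout might look like the following. The control names and layout data are hypothetical.

```python
# Return the name of the control whose rectangle contains the gaze point,
# or None when the gaze falls on no control. Layout data is hypothetical.
def control_at(gaze, controls):
    """gaze: (x, y); controls: mapping of name -> (x, y, width, height)."""
    gx, gy = gaze
    for name, (cx, cy, w, h) in controls.items():
        if cx <= gx < cx + w and cy <= gy < cy + h:
            return name
    return None
```

For example, with a single 50x50 control at (100, 0), a gaze at (120, 10) resolves to that control, while a gaze far outside it resolves to nothing.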
It should be noted that, since the user interface may contain many functionality controls, each functionality control may be small. The first sight line position information collected by the eye tracker may be the coordinate information of a region rather than of a single pixel point. Because the functionality controls are small, the region corresponding to the first sight line position information may contain multiple functionality controls, so the functionality control selected by the user cannot be determined directly. For this reason, the display area of the user interface that the user gazes at may be determined first, and the functionality control selected by the user may then be determined within that display area.
In some embodiments, after the user instructs to control the display apparatus 200 using the eyeballs, the controller 250 may first divide the user interface into a plurality of display areas.
The controller 250 may first acquire the number of display areas corresponding to the user interface, that is, first determine how many display areas the user interface needs to be divided into. The number of display areas may be a preset number, such as 4 or 6.
The controller 250 may generate an area division outline according to the number of display areas. In the embodiments of the present application, the area division outline is used to divide the user interface into a plurality of display areas positively correlated with the number of display areas. FIG. 14 illustrates a schematic diagram of an area division outline in some embodiments. As shown in fig. 14, when the number of display areas is 4, one area division outline in the horizontal direction and one in the vertical direction may be generated, thereby dividing the user interface shown in fig. 13 into 4 display areas. The upper left corner display area includes channel A through channel C, a total of 3 functionality controls. The upper right corner display area includes channel D through channel F, a total of 3 functionality controls. The lower left corner display area includes channel G through channel I, a total of 3 functionality controls. The lower right corner display area includes channel J through channel L, a total of 3 functionality controls.
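Assuming the area division outlines split the screen into an even grid (as in the 2x2 example of fig. 14), mapping a gaze point to its display area could be sketched as below. Equal-size areas and row-major indexing are assumptions made for this illustration.

```python
# Map a gaze point to a display-area index in row-major order. For a 2x2
# split of a 1920x1080 screen, index 0 is the upper-left area and index 1
# the upper-right. Equal-size areas are an assumption of this sketch.
def display_area_index(gaze, screen_w, screen_h, rows, cols):
    gx, gy = gaze
    col = min(int(gx * cols / screen_w), cols - 1)
    row = min(int(gy * rows / screen_h), rows - 1)
    return row * cols + col
```

A gaze at (1500, 200) on a 1920x1080 screen divided 2x2 lands in the upper-right area, matching the channel D-F region of fig. 14.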
The controller 250 may control the display 260 to display the area dividing profile. The user can determine the current display area division condition of the user interface according to the area division outline, and when the user selects the target function control, the user can firstly select the display area where the target function control is located, which is called a target display area in the embodiment of the present application.
In some embodiments, the user may set the number of display areas.
In response to a user-entered instruction to set the number of display regions, the controller 250 may control the display 260 to display a region number setting interface including a plurality of number controls therein. FIG. 15 illustrates a schematic diagram of a zone number setting interface in some embodiments. As shown in fig. 15, the zone number setting interface includes a plurality of number controls, such as control "1", control "2", control "4", control "6", and "custom". The user may select a control to set the number of display areas to a value corresponding to the control. If the user selects the custom control, the controller 250 may control the display 260 to display a number input box, and the user may input a number in the number input box, so as to set the number of display areas by himself, for example, 10.
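Resolving the selected number control to an area count, with the preset values shown in fig. 15, could be sketched as follows. The "custom" branch falls through to the value typed in the number input box; input validation is assumed to happen elsewhere.

```python
# Resolve the chosen number control to a display-area count. The preset
# values follow fig. 15; "custom" uses the value typed in the input box.
def resolve_area_count(selected, custom_value=None):
    presets = {"1": 1, "2": 2, "4": 4, "6": 6}
    if selected == "custom":
        return int(custom_value)
    return presets[selected]
```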
In response to a user selection operation of the target number control, the controller 250 may acquire the target number corresponding to the target number control, and set the number of display areas of the user interface as the target number.
In some embodiments, after controlling the display 260 to display the area dividing outline, the controller 250 may control the eye tracker to acquire the first eye data of the user, and parse the first eye data to obtain the first line-of-sight position information. The controller 250 may obtain the target functionality control selected by the user according to the first gaze location information.
The controller 250 may first determine the gaze focus of the user when looking at the user interface based on the first gaze location information, referred to in the embodiments of the present application as the first user gaze focus. The controller 250 may then obtain the display area selected by the first user gaze focus, referred to in the embodiments of the present application as the target display area.
The user selects the target display area, which means that the user will select a certain function control in the target display area to execute, i.e. the target function control is located in the target display area. The controller 250 may reacquire the target functionality controls selected by the user at the target display area.
The controller 250 may control the display 260 to display a target user interface. In this embodiment of the present application, the target user interface is a display content corresponding to the target display area in the user interface.
In some embodiments, to ensure accuracy of target display area identification, the controller 250 may enable the user to confirm the target display area.
The controller 250 may control the display 260 to highlight the target display area in the user interface, for example by increasing the brightness of the target display area while reducing the brightness of the other display areas.
The user can confirm whether or not the display area selected by himself is the target display area, and input a confirmation instruction to the display apparatus 200 when correct. The user may input a confirmation instruction using the control means, for example by pressing a confirmation key. The user can also input a confirmation instruction by using a voice input mode. The user may also input a confirmation instruction using the eyes, for example, may gaze at the target display area for a preset time, or perform a preset action such as blinking.
In response to a confirmation instruction entered by the user, the controller 250 may control the display 260 to display the target user interface, which may be a full screen display. FIG. 16 shows a schematic diagram of display 260 displaying a target user interface in some embodiments. As shown in fig. 16, when the target display area selected by the user is the upper right corner display area, the controller 250 controls the display 260 to display the contents of the area, including the channel D-channel F, in total, 3 function controls.
In some embodiments, the user may select a target functionality control in the target user interface that needs to be actuated.
After the display 260 displays the target user interface, the controller 250 may control the eye tracker to acquire the eyeball data of the user when the user gazes at the target user interface, which is referred to as second eyeball data in the embodiment of the present application. The second eye data includes second gaze location information, which is used to characterize the location information of the user's gaze mapping into the target user interface, which may be coordinates at which the user gazes at the target user interface.
The controller 250 may first determine a gaze focus of the user when looking at the target user interface based on the second gaze location information, referred to herein in embodiments as the second user gaze focus. The controller 250 may obtain a target function control based on the second user gaze focus, where the target function control is a function control selected by the second user gaze focus.
In some embodiments, the controller 250 may also enable the user to confirm whether the target functionality control is correct.
After acquiring the user-selected target functionality control, the controller 250 may control the display 260 to highlight the target functionality control in the target user interface. The user can confirm whether the target function control is correct or not, and input a confirmation instruction when the target function control is correct. The user may input a confirmation instruction using the control means, for example by pressing a confirmation key. The user can also input a confirmation instruction by using a voice input mode.
The user may also input a confirmation instruction with the eyes. The input method of the confirmation instruction may be set in advance, based on gaze duration or gaze action. For gaze duration, a duration threshold may be preset; when the duration of the user's gaze reaches the preset threshold, a confirmation instruction is considered to have been input. A gaze action may also be preset; in this embodiment, the first action may be blinking the left eye or the right eye a certain number of times (1/2/3 times), or blinking both eyes simultaneously a certain number of times (1/2/3 times).
After the target functionality control is acquired, the controller 250 may control the eye tracker to acquire third eye data. The third eyeball data includes the human eye gazing duration data and/or the human eye gazing action data.
The controller 250 may detect a confirmation instruction input by the user based on the third eyeball data.
The controller 250 may obtain the target duration of the user's gaze at the target functionality control based on the human eye gaze duration data. If the target duration is greater than or equal to a preset duration threshold, a confirmation instruction is generated.
And/or, the controller 250 acquires the human eye gaze action of the user based on the human eye gaze action data, and generates a confirmation instruction if the gaze action is the preset first action.
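The duration-or-action confirmation logic can be sketched as a single predicate over the third eyeball data. The threshold value and the action label are assumptions chosen for illustration, not values from the source.

```python
CONFIRM_THRESHOLD_MS = 3000        # assumed preset duration threshold
FIRST_ACTION = "blink_both_twice"  # assumed label for the preset first action

# A confirmation instruction is generated when the gaze duration reaches the
# threshold, OR when the detected gaze action matches the preset first action.
def is_confirmed(gaze_duration_ms, gaze_action):
    return gaze_duration_ms >= CONFIRM_THRESHOLD_MS or gaze_action == FIRST_ACTION
```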
In response to a confirmation instruction input by the user, the controller 250 may perform an operation corresponding to the target function control.
In some embodiments, considering that the number of display areas may be set to 1, the user interface as a whole is the target user interface, so the user interface need not be divided.
For this purpose, after determining the first user line-of-sight focus, the controller 250 may first acquire the number of display areas corresponding to the user interface, so as to detect whether the display areas need to be divided for the user interface.
If the number of display areas is greater than 1, the controller 250 may perform a step of acquiring a target display area selected by the first user's focus of vision to allow the user to select the target display area by dividing the display area.
If the number of display areas is equal to 1, the controller 250 does not need to divide the display areas, so that the controller 250 can directly obtain the target function control selected by the user based on the first user sight focus, where the target function control is the function control selected by the first user sight focus.
In some embodiments, some special operations contained in the display device 200 cannot be directly displayed in the user interface, such as copy, paste, or screenshot operations. The user may also instruct the display device 200 to perform these operations using the eyeballs.
The controller 250 may control the eye tracker to obtain a user's eye gaze motion.
If the human eye gaze action is the preset second action, the controller 250 may control the display 260 to display a functionality control interface. The functionality control interface includes at least one preset functionality control, which is used to perform at least one of a copy, paste, or screenshot operation on the user interface. FIG. 17 illustrates a schematic diagram of a functionality control interface in some embodiments. As shown in fig. 17, the functionality control interface includes a copy control, a paste control, and a screenshot control.
In response to a user's selection operation of the preset function control, the controller 250 may perform an operation corresponding to the preset function control.
For example, after the user selects the copy control, the user may first hold the gaze focus at the start of the content to be copied for a period of time (which may be 3 seconds). After the controller 250 detects the sustained gaze, the copy start point is confirmed. The user may then move the gaze focus from the copy start point to the copy end point and gaze at the end point for a period of time, so that the controller 250 can detect the copy end point indicated by the user. A copy region is formed between the copy start point and the copy end point, and the controller 250 may perform the copy operation on the copy region to obtain its contents.
After the user selects the paste control, the user can watch for a period of time in the area to be pasted. After the controller 250 detects the action of the user's continuous gaze, a paste operation may be performed.
After the user selects the screenshot control, the user can sequentially select a start point and an end point of the screenshot, where the line between the start point and the end point forms a diagonal of the screenshot image. After the controller 250 determines the start point and the end point, the screenshot operation may be performed to obtain the screenshot image.
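Since the two gaze-selected points form a diagonal of the screenshot image, the capture rectangle can be derived as below, tolerating either ordering of the two points. This is a sketch of the geometry only; coordinate conventions are assumed.

```python
# Build a (left, top, width, height) rectangle from the two diagonal gaze
# points, regardless of which corner the user selected first.
def screenshot_rect(start, end):
    (x1, y1), (x2, y2) = start, end
    return (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))
```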
In some embodiments, the controller 250 may control the eye tracker to acquire eye data in real time in consideration of the user's use experience. The controller 250 may generate a cursor corresponding to the focus of the user's vision in real time from the eyeball data and control the display 260 to display the cursor.
The embodiment of the application also provides a display device control method, which comprises the following steps:
Step 101, controlling the eyeball tracker to acquire eyeball data of a user in response to an instruction for controlling a display device with an eyeball; the eyeball data comprises sight line position information which is used for representing position information of mapping the sight line of a user into the display;
Step 102, acquiring a target function control selected by the user based on the sight line position information; the target function control is a function control in a display area determined according to the sight line position information, and the display area is display content obtained by dividing the user interface;
Step 103, executing the operation corresponding to the target function control.
In some embodiments, the eye data includes first eye data including first gaze location information therein, the first gaze location information being used to characterize a user gaze mapping to location information in the user interface.
Acquiring the target function control selected by the user based on the sight line position information, and further comprising:
a first user gaze focus is determined based on the first gaze location information. Acquiring a target display area selected by a first user sight focus; the user interface includes a number of display areas. Controlling the display to display a target user interface; the target user interface is the display content corresponding to the target display area in the user interface.
Controlling an eyeball tracker to acquire second eyeball data; the second eye data includes second gaze location information therein for characterizing a user gaze mapping to location information in the target user interface. A second user gaze focus is determined based on the second gaze location information. And acquiring a target function control selected by the user based on the second user sight focus, wherein the target function control is the function control selected by the second user sight focus.
In some embodiments, before controlling the eye tracker to acquire the eye data of the user, further comprises:
And acquiring the number of display areas corresponding to the user interface. An area division outline is generated for dividing the user interface into a number of display areas positively correlated with the number of display areas. And controlling the display to display the area division outline.
In some embodiments, controlling the display to display the target user interface further comprises:
the control display highlights the target display area in the user interface. And responding to the confirmation instruction input by the user, and controlling the display to display the target user interface in a full screen mode.
In some embodiments, obtaining the number of display areas corresponding to the user interface further includes:
Responding to an instruction input by a user for setting the number of display areas, and controlling the display to display an area number setting interface; the area number setting interface comprises a plurality of number controls. And responding to the selection operation of the user on the target number control, acquiring the target number corresponding to the target number control, and setting the number of display areas of the user interface as the target number.
In some embodiments, after determining the first user gaze focus based on the first gaze location information, further comprises:
And acquiring the number of display areas corresponding to the user interface. And if the number of the display areas is greater than 1, acquiring a target display area selected by the first user sight focus. And if the number of the display areas is equal to 1, acquiring a target function control selected by the user based on the first user sight focus, wherein the target function control is the function control selected by the first user sight focus.
In some embodiments, executing the operation corresponding to the target function control further includes:
the control display highlights the target functionality control in the target user interface. And controlling the eyeball tracker to acquire third eyeball data. A confirmation instruction input by a user is detected based on the third eyeball data. And responding to the confirmation instruction, and executing the operation corresponding to the target function control.
In some embodiments, the third eyeball data includes therein human eye gazing duration data and/or human eye gazing motion data. Detecting a confirmation instruction input by a user based on the third eyeball data, further comprising:
Acquiring the target duration of the user gazing at the target function control based on the human eye gazing duration data; and if the target duration is greater than or equal to a preset duration threshold, generating a confirmation instruction. And/or acquiring the human eye gazing action of the user based on the human eye gazing action data, and generating a confirmation instruction if the gazing action is a preset first action.
In some embodiments, further comprising:
And controlling the eyeball tracker to acquire the human eye gazing action of the user. If the gazing action is a preset second action, controlling the display to display a function control interface; the function control interface comprises at least one preset function control, and the preset function control is used for executing at least one of a copy, paste, or screenshot operation on the user interface. And responding to the selection operation of the user on the preset function control, executing the operation corresponding to the preset function control.
The same and similar parts of the embodiments in this specification are referred to each other, and are not described herein.
It will be apparent to those skilled in the art that the techniques of embodiments of the present invention may be implemented in software plus a necessary general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be embodied essentially or in parts contributing to the prior art in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method of the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the corresponding technical solutions from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display configured to display a user interface including a plurality of function controls;
an eye tracker configured to acquire eyeball data of a user; and
a controller configured to:
control the eye tracker to acquire eyeball data of the user in response to an instruction to control the display device with the eyes, wherein the eyeball data comprises gaze position information representing the position in the display to which the user's line of sight is mapped;
acquire a target function control selected by the user based on the gaze position information, wherein the target function control is a function control in a display area determined according to the gaze position information, and the display area is a portion of display content obtained by dividing the user interface; and
execute an operation corresponding to the target function control.
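For illustration, the selection logic of claim 1 — mapping a gaze point to a display area and then to a function control inside it — can be sketched roughly as follows. All class and function names here are hypothetical and not part of the claims; a real implementation would receive the gaze point from the eye tracker's calibration pipeline.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

@dataclass
class FunctionControl:
    name: str
    bounds: Rect

@dataclass
class DisplayRegion:
    bounds: Rect
    controls: List[FunctionControl]

def select_target_control(regions: List[DisplayRegion],
                          gaze_x: int, gaze_y: int) -> Optional[FunctionControl]:
    """Map the gaze point first to a display region, then to a control in it."""
    for region in regions:
        if region.bounds.contains(gaze_x, gaze_y):
            for control in region.controls:
                if control.bounds.contains(gaze_x, gaze_y):
                    return control
    return None  # gaze did not land on any control
```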
2. The display device according to claim 1, wherein the eyeball data comprises first eyeball data, the first eyeball data comprising first gaze position information representing the position in the user interface to which the user's line of sight is mapped; and wherein, in acquiring the target function control selected by the user based on the gaze position information, the controller is further configured to:
determine a first user gaze focus based on the first gaze position information;
acquire a target display area selected by the first user gaze focus, wherein the user interface comprises a plurality of display areas;
control the display to display a target user interface, the target user interface being the display content corresponding to the target display area in the user interface;
control the eye tracker to acquire second eyeball data, the second eyeball data comprising second gaze position information representing the position in the target user interface to which the user's line of sight is mapped;
determine a second user gaze focus based on the second gaze position information; and
acquire the target function control selected by the user based on the second user gaze focus, the target function control being the function control selected by the second user gaze focus.
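The coarse-to-fine flow of claim 2 can be sketched as below. The key assumption (illustrative, not stated in the claim) is that when the target display area is shown full screen, the second gaze point arrives in full-screen coordinates and must be mapped back into the original region's coordinate space; all names are hypothetical.

```python
def _contains(rect, point):
    """rect: (x, y, w, h); point: (px, py)."""
    x, y, w, h = rect
    px, py = point
    return x <= px < x + w and y <= py < y + h

def two_stage_select(screen_w, screen_h, regions, first_gaze, second_gaze):
    """regions: list of (region_rect, controls); controls: list of
    (name, control_rect), all in full-screen coordinates.
    Stage 1: first_gaze picks a display region.
    Stage 2: the region is shown full screen, and second_gaze (in the
    enlarged view's coordinates) is mapped back to pick a control."""
    region = next((r for r in regions if _contains(r[0], first_gaze)), None)
    if region is None:
        return None
    rx, ry, rw, rh = region[0]
    # Scale the second gaze point from the enlarged view back into the
    # original region's coordinate space.
    ox = rx + second_gaze[0] * rw / screen_w
    oy = ry + second_gaze[1] * rh / screen_h
    for name, rect in region[1]:
        if _contains(rect, (ox, oy)):
            return name
    return None
```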
3. The display device according to claim 2, wherein, before controlling the eye tracker to acquire the eyeball data of the user, the controller is further configured to:
acquire the number of display areas corresponding to the user interface;
generate an area division outline, the area division outline being used to divide the user interface into a plurality of display areas corresponding to the acquired number of display areas; and
control the display to display the area division outline.
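One plausible way to generate the division outline of claim 3 is a near-square grid, sketched below; the claim only requires that the outline yield a number of areas matching the configured count, so the grid layout is an assumption of this sketch.

```python
import math

def divide_into_regions(width, height, n):
    """Divide a width x height interface into n rectangular display areas
    laid out in a near-square grid; returns a list of (x, y, w, h) tuples
    that an outline renderer could draw over the user interface."""
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    cell_w, cell_h = width // cols, height // rows
    return [((i % cols) * cell_w, (i // cols) * cell_h, cell_w, cell_h)
            for i in range(n)]
```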
4. The display device according to claim 2, wherein, in controlling the display to display the target user interface, the controller is further configured to:
control the display to highlight the target display area in the user interface; and
control the display to display the target user interface in full screen in response to a confirmation instruction input by the user.
5. The display device according to claim 3, wherein, in acquiring the number of display areas corresponding to the user interface, the controller is further configured to:
control the display to display an area number setting interface in response to an instruction input by the user to set the number of display areas, the area number setting interface comprising a plurality of number controls; and
in response to a selection operation by the user on a target number control, acquire the target number corresponding to the target number control and set the number of display areas of the user interface to the target number.
6. The display device according to claim 2, wherein, in determining the first user gaze focus based on the first gaze position information, the controller is further configured to:
acquire the number of display areas corresponding to the user interface;
if the number of display areas is greater than 1, execute the step of acquiring the target display area selected by the first user gaze focus; and
if the number of display areas is equal to 1, acquire the target function control selected by the user based on the first user gaze focus, the target function control being the function control selected by the first user gaze focus.
7. The display device according to claim 2, wherein, in executing the operation corresponding to the target function control, the controller is further configured to:
control the display to highlight the target function control in the target user interface;
control the eye tracker to acquire third eyeball data;
detect a confirmation instruction input by the user based on the third eyeball data; and
execute the operation corresponding to the target function control in response to the confirmation instruction.
8. The display device according to claim 7, wherein the third eyeball data comprises gaze duration data and/or eye gaze action data; and wherein, in detecting the confirmation instruction input by the user based on the third eyeball data, the controller is further configured to:
acquire, based on the gaze duration data, the target duration for which the user gazes at the target function control, and generate a confirmation instruction if the target duration is greater than or equal to a preset duration threshold;
and/or
acquire the eye gaze action of the user based on the eye gaze action data, and generate a confirmation instruction if the eye gaze action is a preset first action.
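The duration-based confirmation branch of claim 8 amounts to a dwell-time check. A minimal sketch follows; the function and parameter names are illustrative, and the sample format and any concrete threshold value are assumptions — the claim only specifies a preset duration threshold.

```python
def confirm_by_dwell(samples, target, threshold_ms):
    """samples: (timestamp_ms, x, y) gaze samples in arrival order;
    target: (x, y, w, h) bounds of the highlighted function control.
    Returns True once the gaze has stayed on the target continuously
    for at least threshold_ms; leaving the target resets the timer."""
    tx, ty, tw, th = target
    dwell_start = None
    for t, x, y in samples:
        if tx <= x < tx + tw and ty <= y < ty + th:
            if dwell_start is None:
                dwell_start = t  # gaze just landed on the target
            if t - dwell_start >= threshold_ms:
                return True      # dwell long enough: confirmation instruction
        else:
            dwell_start = None   # gaze left the target: reset
    return False
```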
9. The display device according to claim 1, wherein the controller is further configured to:
control the eye tracker to acquire the eye gaze action of the user;
if the eye gaze action is a preset second action, control the display to display a function control interface, the function control interface comprising at least one preset function control used to execute at least one of a copy, paste, or screenshot operation on the user interface; and
execute the operation corresponding to the preset function control in response to a selection operation by the user on the preset function control.
10. A display device control method, applied to a display device, the method comprising:
controlling an eye tracker to acquire eyeball data of a user in response to an instruction to control the display device with the eyes, wherein the eyeball data comprises gaze position information representing the position in a display to which the user's line of sight is mapped;
acquiring a target function control selected by the user based on the gaze position information, wherein the target function control is a function control in a display area determined according to the gaze position information, and the display area is a portion of display content obtained by dividing a user interface; and
executing an operation corresponding to the target function control.
CN202310774474.0A 2023-06-27 2023-06-27 Display apparatus and display apparatus control method Pending CN117806577A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310774474.0A CN117806577A (en) 2023-06-27 2023-06-27 Display apparatus and display apparatus control method

Publications (1)

Publication Number Publication Date
CN117806577A (en) 2024-04-02

Family

ID=90430647



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination