CN114760513A - Display device and cursor positioning method - Google Patents


Info

Publication number
CN114760513A
Authority
CN
China
Prior art keywords
coordinate
cursor
coordinates
mapping point
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210429564.1A
Other languages
Chinese (zh)
Inventor
范颜岩
Current Assignee
Hisense Electronic Technology Wuhan Co., Ltd.
Original Assignee
Hisense Electronic Technology Wuhan Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Hisense Electronic Technology Wuhan Co., Ltd.
Priority to CN202210429564.1A
Publication of CN114760513A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42225User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details characterized by types of remote control, e.g. universal remote control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

When a control instruction input by a user to indicate cursor movement is received, the display device responds to the instruction by detecting the input coordinates of a control device in a reference coordinate system of a sensing area and acquiring mapping point coordinates from the input coordinates, so as to calculate the distance between the mapping point coordinates and the center coordinates of an operation control. If the distance between the mapping point coordinates and the center coordinates is less than or equal to a first threshold, the cursor is controlled to move to the position of the center coordinates; if the distance is greater than the first threshold, the cursor is controlled to move to the position of the mapping point coordinates. This solves the problems of cursor position drift and low positioning accuracy caused by limited sampling precision, hand shake of the user, and the like.

Description

Display device and cursor positioning method
Technical Field
The present application relates to the technical field of smart-television drawing boards, and in particular to a display device and a cursor positioning method.
Background
A display device is a terminal device capable of outputting a specific display picture, such as a smart television, a mobile terminal, a smart advertising screen, or a projector. Taking a smart television as an example: based on Internet application technology, it is equipped with an open operating system and chip and with an open application platform, and can realize bidirectional human-computer interaction. When interacting with the smart television, a user can send control instructions to it directly by voice, or through a control device; for example, the user can press the direction keys on a remote controller to move a selection box on the television up, down, left, and right in order to select a function on the television. However, issuing control instructions to the smart television through designated keys on the remote controller is inconvenient. For example, when the selection box controlled by the remote controller is on item A, moving it to item B, which is far from item A with several items in between, requires pressing the remote controller keys many times.
In view of this problem, the related art has derived a scheme similar to a mouse controlling a cursor on a computer display screen: a cursor may likewise be displayed in the user interface of a display device, and the user may move the control device to control the cursor on the user interface.
However, unlike a mouse on a computer display screen, the control device is mostly moved in the air when the user operates it, and the system often relies on an acceleration sensor to accumulate spatial-position calculations and thereby determine the relative displacement of the control device that drives the cursor. After the user moves the device quickly or uses it for a long time, position drift occurs due to limited sampling precision and similar causes, and the accuracy of cursor positioning decreases.
Disclosure of Invention
The present application provides a display device and a cursor positioning method, aiming to solve the problem of low precision when a control device controls a cursor on the display device.
In one aspect, the present application provides a display device, including:
the display is used for displaying a user interface, and the user interface comprises a cursor and an operation control;
the communicator is used for receiving a control signal transmitted by a user through the control device in the sensing area;
A controller configured to:
receiving a control instruction which is input by a user and indicates the movement of a cursor;
detecting input coordinates of the control device in a reference coordinate system of the sensing area in response to the control command;
acquiring mapping point coordinates according to the input coordinates, wherein the mapping point coordinates are the input coordinates converted into a reference coordinate system of the user interface;
calculating the distance between the mapping point coordinates and the center coordinates of the operation control;
if the distance between the mapping point coordinate and the center coordinate is smaller than or equal to a first threshold value, controlling the cursor to move to the position of the center coordinate;
and if the distance between the mapping point coordinates and the center coordinates is greater than the first threshold value, controlling the cursor to move to the position of the mapping point coordinates.
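The positioning logic above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function and variable names, the linear sensing-to-UI mapping, and the use of Euclidean distance are all assumptions, since the application does not prescribe concrete formulas.

```python
import math

def map_to_ui(input_xy, sensing_size, ui_size):
    """Convert input coordinates in the sensing-area reference frame into
    mapping-point coordinates in the user-interface frame.
    A simple proportional (linear) mapping is assumed here."""
    x, y = input_xy
    sw, sh = sensing_size
    uw, uh = ui_size
    return (x / sw * uw, y / sh * uh)

def position_cursor(input_xy, control_center, sensing_size, ui_size, first_threshold):
    """Return the coordinates the cursor should move to.

    If the mapping point lies within `first_threshold` of the operation
    control's center, snap the cursor to that center; otherwise move the
    cursor to the mapping point itself."""
    mx, my = map_to_ui(input_xy, sensing_size, ui_size)
    cx, cy = control_center
    distance = math.hypot(mx - cx, my - cy)
    if distance <= first_threshold:
        return (cx, cy)   # snap to center: absorbs drift and hand shake
    return (mx, my)       # follow the mapping point directly
```

For example, with a 1920x1080 user interface, an operation control centered at (960, 540), and a first threshold of 50, an input whose mapping point lands at roughly (980, 550) is snapped to (960, 540), while a mapping point far from any control is followed exactly.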
In another aspect, the present application provides a cursor positioning method, including:
receiving a control instruction which is input by a user and indicates the movement of a cursor;
in response to the control instruction, detecting input coordinates of the control device in a reference coordinate system of the sensing area;
acquiring mapping point coordinates according to the input coordinates, wherein the mapping point coordinates are the input coordinates converted into a reference coordinate system of the user interface;
calculating the distance between the mapping point coordinates and the center coordinates of the operation control;
if the distance between the mapping point coordinate and the center coordinate is smaller than or equal to a first threshold value, controlling the cursor to move to the position of the center coordinate;
and if the distance between the mapping point coordinates and the center coordinates is greater than the first threshold value, controlling the cursor to move to the position of the mapping point coordinates.
According to the above technical solutions, when a control instruction input by a user to indicate cursor movement is received, the display device and the cursor positioning method provided by the present application respond to the instruction by detecting the input coordinates of the control device in the reference coordinate system of the sensing area and acquiring mapping point coordinates from them, so as to calculate the distance between the mapping point coordinates and the center coordinates of the operation control. If the distance is less than or equal to a first threshold, the cursor is controlled to move to the position of the center coordinates; if it is greater than the first threshold, the cursor is controlled to move to the position of the mapping point coordinates. This solves the problems of cursor position drift and low positioning accuracy caused by limited sampling precision, hand shake of the user, and the like.
Drawings
To explain the technical solutions of the present application more clearly, the drawings needed in the embodiments are briefly described below. It is obvious that those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a usage scenario of a display device in some embodiments;
FIG. 2 is a block diagram of a hardware configuration of a display device in some embodiments;
FIG. 3 is a block diagram of a hardware configuration of a display device in some embodiments;
FIG. 4 is a diagram of software configuration in a display device in some embodiments;
FIG. 5 is a diagram of a user interface displayed by the display device in some embodiments;
FIG. 6 is a diagram of a user interface displayed by the display device in cursor mode in some embodiments;
FIG. 7 is a flow diagram illustrating cursor positioning in some embodiments;
FIG. 8 is a diagram of a user interface displayed by the display device in some embodiments;
FIG. 9 is a diagram of a user interface displayed by the display device in some embodiments;
FIG. 10 is a diagram of a user interface displayed by a display device in some embodiments;
FIG. 11 is a diagram of a user interface displayed by a display device in some embodiments;
FIG. 12 is a diagram of a user interface displayed by the display device in some embodiments;
FIG. 13 is a diagram of a user interface displayed by the display device in some embodiments;
FIG. 14 is a diagram of a user interface displayed by the display device in some embodiments;
FIG. 15 is a diagram of a user interface displayed by the display device in some embodiments;
FIG. 16 is a diagram of a user interface displayed by the display device in some embodiments;
FIG. 17 is a schematic flowchart of a cursor positioning method provided in the present application.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings, in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the present application; they are merely examples of systems and methods consistent with certain aspects of the application, as detailed in the claims.
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The display device provided by the embodiment of the present application may have various implementation forms, and for example, the display device may be a television, a smart television, a laser projection device, a display (monitor), an electronic whiteboard (electronic whiteboard), an electronic desktop (electronic table), and the like. Fig. 1 and 2 are specific embodiments of a display device of the present application.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-distance communication methods, and the display device 200 is controlled in a wireless or wired manner. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display device 200.
In some embodiments, the smart device 300 (e.g., mobile terminal, tablet, computer, laptop, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the display device may not receive instructions using the smart device or control device described above, but rather receive user control through touch or gestures, or the like.
In some embodiments, the display device 200 may also be controlled in a manner other than the control apparatus 100 and the smart device 300, for example, the voice command control of the user may be directly received by a module configured inside the display device 200 to obtain a voice command, or may be received by a voice control device provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various content and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive a user's input operation instruction and convert it into an instruction that the display device 200 can recognize and respond to, acting as an intermediary between the user and the display device 200. The control apparatus 100 may be a remote controller, a handle, a mobile phone, a tablet computer, or the like; the application is not limited thereto. Taking the control apparatus 100 as a remote controller as an example, a plurality of function keys are distributed on it, and the user can generate a corresponding control instruction by pressing a key, which the remote controller sends to the display device for execution. For example, the control apparatus may be provided with a function key for instructing the display device to enter a cursor mode: the user selects this key to send a mode selection instruction indicating entry into the cursor mode, the display device enters the cursor mode and establishes a mapping between the remote controller and the cursor it displays, and the displayed cursor can then be controlled to move by moving the remote controller.
As shown in fig. 3, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments the controller comprises a processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
The display 260 includes a display screen component for presenting pictures and a driving component for driving image display; it receives image signals output from the controller and displays video content, image content, a menu manipulation interface, and a user manipulation UI.
The display 260 may be a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
The communicator 220 is a component for communicating with an external device or a server according to various communication protocol types. For example, the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display device 200 may send and receive control signals and data signals with the external control apparatus 100 or the server 400 through the communicator 220.
A user interface is used for receiving control signals from the control apparatus 100 (e.g., an infrared remote controller).
The detector 230 is used to collect signals of the external environment or of interaction with the outside. For example, the detector 230 includes a light receiver, a sensor for collecting the intensity of ambient light; alternatively, the detector 230 includes an image collector, such as a camera, which can be used to collect external environment scenes, attributes of the user, or user interaction gestures; alternatively, the detector 230 includes a sound collector, such as a microphone, for receiving external sound; alternatively, the detector 230 includes an infrared collector for collecting pose information of the user and/or the control apparatus; alternatively, the detector 230 includes a cursor collector for collecting position information of the cursor that has established a mapping relationship with the control apparatus.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
The tuner demodulator 210 receives broadcast television signals in a wired or wireless manner and demodulates audio/video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box.
The controller 250 controls the operation of the display device and responds to the user's operation through various software control programs stored in the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the controller includes at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
A "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A common presentation form of a User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer from top to bottom.
In some embodiments, at least one application program runs in the application program layer, and the application programs may be windows (Window) programs carried by an operating system, system setting programs, clock programs or the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for applications. The application framework layer includes a number of predefined functions and acts as a processing center that decides how the applications in the application layer act. Through the API, an application can access system resources and obtain system services during execution.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications and the usual navigation fallback functions, such as controlling the exit, opening, and fallback of applications. The window manager is configured to manage all window programs, for example, to obtain the size of the display screen, determine whether a status bar exists, lock the screen, capture the screen, control changes of the display window (for example, reducing the window, dithering display, distortion display, and the like), detect whether the cursor mode is started, and, after the cursor mode is started, control the cursor to move to the corresponding position.
In some embodiments, the system runtime library layer provides support for the upper framework layer: when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a Wi-Fi driver, a USB driver, an HDMI driver, sensor drivers (e.g., fingerprint sensor, temperature sensor, pressure sensor), a power driver, and the like.
The above embodiments describe the hardware/software architecture and functional implementation of the display device. In some application scenarios, after the display device is started, it may directly enter the preset user interface shown in fig. 5. This user interface includes at least one operation control; a program in the application layer may be displayed through a specific operation control, or displayed further after the operation control is selected. In addition, the user interface includes a selector 510 indicating that an operation control is selected, and the position of the selector 510 can be moved according to an instruction input by the user to select a different operation control.
A user may interact with the display device through the control apparatus 100. For example, if the control apparatus 100 is a remote controller, the user can generate a corresponding control instruction by pressing a key on the remote controller, and the remote controller sends the instruction to the display device for execution. Referring further to fig. 5, when the selector 510 is currently at the position of operation control A and the user wants to select operation control D, the user needs to press the "right" direction key on the remote controller three times in succession, and the display device controls the selector 510 to move to the position of operation control D in response. Thus, selecting an operation control through the remote controller keys often requires multiple operations, that is, pressing the keys many times before the selection box reaches the specified operation control, which is inconvenient.
In some embodiments, a scheme similar to a mouse controlling a cursor on a computer display screen is derived: a cursor may also be displayed in the user interface of the display device, and the user can control the cursor to move on the user interface by moving the control device. However, unlike a mouse, the control device is mostly moved in the air, and the system often relies on an acceleration sensor to accumulate spatial-position calculations and determine the relative displacement of the control device that drives the cursor. After rapid movement or long-term use, position drift occurs due to sampling precision and other causes, and the accuracy of cursor positioning decreases, so the user has to adjust the position of the control device many times to move the cursor onto the specified operation control, which is not conducive to user operation.
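The drift described above can be illustrated numerically: a small constant accelerometer bias, double-integrated into a position estimate, grows quadratically with time. The sampling rate and bias magnitude below are assumed values chosen only for illustration; they do not come from the application.

```python
def integrated_position(samples, dt):
    """Double-integrate acceleration samples into a displacement estimate,
    as a dead-reckoning cursor scheme effectively does."""
    velocity, position = 0.0, 0.0
    for a in samples:
        velocity += a * dt
        position += velocity * dt
    return position

dt = 0.01                      # assumed 100 Hz sampling
bias = 0.05                    # assumed constant sensor bias, m/s^2
true_accel = [0.0] * 1000      # controller actually held still for 10 s
measured = [a + bias for a in true_accel]

# Even with the controller motionless, the estimate drifts by roughly
# 0.5 * bias * t^2, i.e. about 2.5 m of spurious displacement after 10 s.
drift = integrated_position(measured, dt)
```

This quadratic growth is why the application snaps the cursor to a nearby control center instead of trusting the accumulated position alone.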
Therefore, in order to solve the above technical problem and improve the user experience, an embodiment of the present application provides a display device that includes at least a display, a communicator, and a controller. The display is used for displaying a user interface; the communicator receives control signals sent by the user through the control device in the sensing area; and the controller executes the corresponding application program according to the control signals received by the communicator. The user may send the controller a mode selection instruction indicating entry into a cursor mode. After receiving the instruction, the controller starts the cursor mode and displays the cursor 610 shown in fig. 6 in the user interface. The cursor mode is a mode in which the user can control the cursor to move in the user interface by moving the control device.
In some embodiments, the user may send the mode selection instruction indicating entry into the cursor mode by operating a designated key of the control device. Taking the control device as a remote controller as an example, a correspondence between the cursor mode and a remote controller key may be bound in advance: when the user presses the key bound to the cursor mode, the control device sends a mode selection instruction indicating entry into the cursor mode to the controller, and the controller opens the cursor mode upon receiving it; when the user presses the key again, the controller may close the cursor mode.
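The key-toggle behavior described above can be sketched as a minimal state handler. The class and key names are hypothetical; the application does not specify an implementation.

```python
class CursorModeController:
    """Toggles cursor mode each time the pre-bound remote key is pressed."""

    def __init__(self):
        self.cursor_mode = False

    def on_key(self, key):
        # Only the key bound in advance to the cursor mode toggles the
        # state; "CURSOR_MODE_KEY" is an assumed identifier.
        if key == "CURSOR_MODE_KEY":
            self.cursor_mode = not self.cursor_mode
        return self.cursor_mode
```

Pressing the bound key once opens the cursor mode, pressing it again closes it, and all other keys leave the mode unchanged.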
In some embodiments, a user may use a sound collector of the display device, such as a microphone, to send a mode selection instruction indicating entry into the cursor mode to the display device by way of voice input, so that the display device enters the cursor mode.
In some embodiments, the user may also send a mode selection instruction to the display device indicating entry into the cursor mode through a preset gesture or action. The display device can detect the user's behavior in real time through the image collector. When detecting that the user makes a preset gesture or action, the controller may control to open the cursor mode.
In some embodiments, when a user controls the display device using a smart device such as a mobile phone, the mode selection instruction indicating entry into the cursor mode may also be sent from that device. In practical applications, a corresponding control can be set in the mobile phone, and the user selects whether to enter the cursor mode through that control, thereby sending the mode selection instruction to the display device.
In some embodiments, a cursor mode option may be set in the user interface of the display device; when the user clicks the option, the display device may be controlled to enter or exit the cursor mode.
In some embodiments, after the controller starts the cursor mode, the user may send a control instruction indicating cursor movement to the controller by moving the control device. After receiving the control instruction, the controller may detect the input coordinates of the control device in the reference coordinate system of the sensing area, so as to control the position of the cursor in the user interface according to the input coordinates.
The sensing area is a region of real physical space within a certain range around the display device. For example, it may be the region in front of the display device, bounded by a cuboid of given length, width, and height.
For convenience of description, in the embodiments of the present application the reference coordinate system of the sensing area is referred to as the first coordinate system, and the reference coordinate system of the user interface/display device is referred to as the second coordinate system. The first coordinate system is a three-dimensional coordinate system comprising an X-axis, a Y-axis, and a Z-axis; the second coordinate system is a two-dimensional coordinate system comprising an x-axis and a y-axis, which may lie along two adjacent boundaries of the user interface. The X-axis points in the same direction as the x-axis, the Y-axis points in the same direction as the y-axis, and the Z-axis is perpendicular to the user interface, that is, perpendicular to the plane formed by the X-axis and the Y-axis.
In some embodiments, the control device is a device capable of emitting a radio-frequency signal. When, within the sensing area, the user sends through the control device a radio-frequency signal indicating movement of the cursor in the user interface, the display device receives the signal and obtains the input coordinates of the control point's position in the first coordinate system. The control point is a point on the control device. Specifically, the position of the signal source from which the control device emits the signal may serve as the control point. Alternatively, an image acquisition device may capture the pose of the control device in the sensing area to obtain the pose coordinates of points on the control device's housing in the first coordinate system; the coordinates of the device's center point can be calculated from these pose coordinates, and that center point used as the control point. The input coordinates may also be calculated from the signal source coordinates together with the pose coordinates: for example, the midpoint between the signal source position and the pose-coordinate position farthest from it may be taken as the control point, and the control point's coordinates in the first coordinate system are the input coordinates.
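The last option above — the control point as the midpoint between the signal source and the farthest pose point on the housing — can be sketched as follows. This is a minimal illustration only; the function name and data shapes are assumptions, not part of the patent.

```python
import math

def control_point(signal_source, pose_points):
    """Control point in the first (3-D) coordinate system, taken as the
    midpoint between the signal-source position and the pose point on the
    control device's housing that is farthest from the signal source."""
    farthest = max(pose_points, key=lambda p: math.dist(p, signal_source))
    return tuple((s + f) / 2 for s, f in zip(signal_source, farthest))
```

The returned tuple is the input coordinate used by the subsequent mapping step.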
In some embodiments, the controller may convert the input coordinates into coordinates in the second coordinate system using a conversion coefficient. The conversion coefficient is the ratio of the maximum value on each coordinate axis of the first coordinate system to the maximum value on the corresponding axis of the second coordinate system, where the maxima of the first coordinate system are the values corresponding to the length, width, and height of the sensing area, and the maxima of the second coordinate system are the values corresponding to the length and width of the user interface. For example, if the length, width, and height of the sensing area correspond to the values Xmax, Ymax, and Zmax, the set Pmax of axis maxima of the first coordinate system can be expressed as:

Pmax = (Xmax, Ymax, Zmax)

If the length and width of the user interface correspond to the values xmax and ymax, the set Qmax of axis maxima of the second coordinate system can be expressed as:

Qmax = (xmax, ymax)

The conversion coefficient δ can therefore be expressed as:

δ = (δx, δy) = (Xmax/xmax, Ymax/ymax)

For an input coordinate P1 = (X1, Y1, Z1), the coordinates p1 of its mapping point in the second coordinate system follow from the conversion coefficient:

p1 = (X1/δx, Y1/δy)
In some embodiments, a first layer and a second layer may be displayed on the user interface. Operation controls may be located at any position of the first layer, and each operation control occupies the area of one rectangular element frame on it. The cursor is located on the second layer and can move freely across it. The second layer floats above the first layer, has an area greater than or equal to that of the first layer, and is transparent, so that the user interface shows the cursor moving freely among the operation controls. The shape, color, and display mode of the cursor may be set by default by a corresponding control program inside the display device, or may be set by the user; this is not limited in the embodiment of the present invention. For example, the cursor may be a black cross pattern.
Fig. 7 is a schematic flowchart of cursor positioning in some embodiments. As shown in fig. 7, each operation control corresponds to its own core area. In S701, after the controller acquires the mapping point coordinates, it may detect, from those coordinates, whether the mapping point is within a core area. If the mapping point is within a core area, the cursor is controlled to move to the center coordinate position of the operation control corresponding to that core area; if the mapping point is outside every core area, the cursor is controlled to move to the mapping point coordinate position. It should be understood that the mapping point itself is not displayed in the user interface.
In some embodiments, the boundary coordinates of each operation control's boundary in the second coordinate system may be detected to obtain the control's center coordinates. The center coordinates of the element frame may be obtained by detecting the median coordinates of the control's boundaries in the X-axis and Y-axis directions of the user interface. For example, select any two adjacent boundaries of the operation control; if the median coordinate on the boundary parallel to the X-axis is (xa, ya) and the median coordinate on the boundary parallel to the Y-axis is (xb, yb), then the center coordinate of the operation control is (xa, yb).
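For a rectangular element frame this reduces to averaging opposite boundary positions. A minimal sketch, with an assumed (left, top, right, bottom) representation of the boundary coordinates:

```python
def center_from_bounds(left, top, right, bottom):
    """Center of a rectangular element frame: the median coordinate of the
    boundary parallel to the X-axis gives xa, the median coordinate of the
    adjacent boundary parallel to the Y-axis gives yb; the center is (xa, yb)."""
    xa = (left + right) / 2   # median along the X direction
    yb = (top + bottom) / 2   # median along the Y direction
    return (xa, yb)
```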
In some embodiments, the core area may be the area whose distance from the center of the corresponding operation control is smaller than a first threshold. After the controller acquires the mapping point coordinates, it may calculate the distance between the mapping point coordinates and the center coordinates of each operation control. If the distance to the center of some operation control is smaller than or equal to the first threshold, the mapping point is within that control's core area, and the cursor is controlled to move to the control's center coordinate position. If the distance to every control's center is larger than the first threshold, the mapping point is not in any core area, and the cursor is controlled to move to the mapping point coordinate position; again, the mapping point is not displayed in the user interface. For example, referring to fig. 8, the user interface displayed by the display device in some embodiments includes operation controls 810, 820, and 830, corresponding to core areas W1, W2, and W3 respectively. The position of the mapping point in the user interface is obtained from its coordinates: if the mapping point is at point A1 or A2, the cursor is controlled to move to point O1 (the center coordinate position of the operation control 810); if the mapping point is at point B1, the cursor is controlled to move to point B1.
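The distance-based snap can be sketched as below. The function name and arguments are illustrative; snapping to the nearest qualifying center also covers the case where two core areas overlap.

```python
import math

def position_cursor(mapping_point, centers, first_threshold):
    """Snap the cursor to the nearest control center whose core area
    (distance <= first_threshold) contains the mapping point; otherwise
    leave the cursor at the mapping point."""
    in_range = [c for c in centers
                if math.dist(mapping_point, c) <= first_threshold]
    if in_range:
        return min(in_range, key=lambda c: math.dist(mapping_point, c))
    return mapping_point
```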
In some embodiments, if the mapping point lies within two core areas at once, the distances from the mapping point to the center coordinate positions of the two corresponding operation controls are obtained, and the cursor is moved to the center coordinate position of the nearer operation control. For example, further referring to fig. 8, if the mapping point is at point C1, the distance d1 between C1 and O2 and the distance d2 between C1 and O3 are obtained; since d1 < d2, the cursor is controlled to move to point O2.
In some embodiments, the core area may be the area within the boundary of the corresponding operation control. After the controller acquires the mapping point coordinates, it may acquire the boundary coordinates of each operation control in the second coordinate system and, by comparing the boundary coordinates with the mapping point coordinates, determine whether the mapping point falls within the boundary of any operation control. If it does, the mapping point is within that control's core area, and the cursor is controlled to move to the control's center coordinate position; if it falls within no control's boundary, the mapping point is not in any core area, and the cursor is controlled to move to the mapping point coordinate position. For example, referring to fig. 9, in the user interface displayed by the display device in some embodiments, the operation control 810 corresponds to the core area W4, whose boundary coincides with the boundary of the control. After acquiring the mapping point coordinates, the controller may obtain the boundary coordinates of each operation control in the user interface, yielding as many core areas as there are operation controls, and then locate the mapping point in the user interface: if the mapping point is at position A3, the cursor is controlled to move to point O1 (the center coordinate position of the operation control 810); if it is at position B2, the cursor is controlled to move to position B2.
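This boundary-containment variant is a simple rectangle test. A minimal sketch under an assumed (left, top, right, bottom) boundary representation:

```python
def snap_inside_bounds(mapping_point, controls):
    """controls: list of (bounds, center) pairs, bounds = (left, top,
    right, bottom) in the second coordinate system.  The core area
    coincides with the control's boundary: a mapping point inside it
    snaps the cursor to the center; otherwise the cursor follows the
    mapping point."""
    x, y = mapping_point
    for (left, top, right, bottom), center in controls:
        if left <= x <= right and top <= y <= bottom:
            return center
    return mapping_point
```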
In some embodiments, the core area may be bounded by the points that lie outside the boundary of the operation control at a distance from that boundary equal to a second threshold. After the controller acquires the mapping point coordinates, if they are not within the boundary of any operation control, the controller may acquire the set of buffer point coordinates of each core area in the user interface and detect whether the mapping point coordinates equal any buffer point coordinates. If they equal some buffer point's coordinates, the mapping point is within the core area corresponding to that operation control; the boundary coordinates of the core area containing the buffer point are acquired, the center coordinates of the operation control within that core area are obtained, and the cursor is controlled to move to that center coordinate position. If the mapping point coordinates equal no buffer point's coordinates and also lie within no operation control's boundary, the mapping point is not in any core area, and the cursor is controlled to move to the mapping point coordinate position. For example, referring to fig. 10, in the user interface displayed by the display device in some embodiments, a buffer area surrounds each operation control; that is, each core area is the union of the control's own area and its surrounding buffer area. In fig. 10, the operation control 810 corresponds to the core area W5. After the controller acquires the mapping point coordinates and locates the mapping point in the user interface, if the mapping point is at position A4 or A5, the cursor is controlled to move to position O1 (the center coordinate position of the operation control 810); if it is at position B3, the cursor is controlled to move to position B3.
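Rather than enumerating buffer points, the same "rectangle plus surrounding buffer" core area can be tested with a point-to-rectangle distance. This is an equivalent continuous formulation assumed for illustration, not the patent's enumeration of buffer point coordinates:

```python
def snap_with_buffer(mapping_point, controls, second_threshold):
    """Core area = the control's rectangle plus a buffer of width
    second_threshold around it; a mapping point within that distance of
    the rectangle snaps the cursor to the control's center."""
    x, y = mapping_point
    for (left, top, right, bottom), center in controls:
        dx = max(left - x, 0, x - right)    # horizontal gap to rectangle
        dy = max(top - y, 0, y - bottom)    # vertical gap to rectangle
        if (dx * dx + dy * dy) ** 0.5 <= second_threshold:
            return center
    return mapping_point
```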
In some embodiments, after the mapping point enters a core area and the controller moves the cursor to the center of the operation control, the controller keeps recalculating whether the mapping point remains in the core area as the user continues to move the control device. While the mapping point stays inside the core area, the cursor is held at the control's center coordinate position; once the mapping point leaves the core area, the cursor is moved to the mapping point's coordinate position. For example, referring to fig. 11, take the core area to be the area within the first threshold of the corresponding control's center, i.e. the core area of the operation control 830 is W3, and let the mapping point travel from the start position A1' through B1', C1', D1', and E1' in sequence. While the mapping point is in the segment A1'-B1', the cursor stays at O3 (the center coordinate position of the operation control 830). While it is in the segment B1'-C1'-D1', the cursor coincides with the mapping point: when the mapping point is at C1', the cursor is at C1'. When the mapping point reaches D1', the cursor moves back to O3, and it remains at O3 while the mapping point moves within the segment D1'-E1'.
In some embodiments, when the mapping point is within a core area and the controller moves the cursor to the center of the operation control, the controller may add a focus mark to the operation control and extract focus rendering parameters, so as to render the control according to those parameters and display the marked control in the user interface. The focus rendering parameters may be set by default by a corresponding control program inside the display device, or may be set by the user in advance; this is not limited in the embodiments of the present invention. For example, referring to fig. 12, when the cursor is at the center coordinate position of the operation control 910, the focus rendering parameters are extracted. They may include a color rendering parameter, in which case the focus-marked operation control 910 is rendered filled with a preset color in the user interface. Alternatively, referring to fig. 13, they may include a size rendering parameter, in which case the focus-marked operation control 910 is displayed enlarged. Alternatively, referring to fig. 14, they may include a line rendering parameter, in which case the operation control 910 is displayed with a thickened boundary line. The color, size, and line rendering parameters above are only examples provided by the present application; in practice the focus rendering parameters should include at least one of them, and may further include other types of rendering parameters, which the present application does not limit.
In some embodiments, when the mapping point moves from outside a core area to inside it, the controller moves the cursor to the center coordinate position of the operation control and renders the control; at the same time, the controller retrieves the cursor's transparency parameter, adjusts it to 100%, and renders the cursor with the adjusted transparency, so that only the marked operation control is displayed in the user interface and the cursor is not displayed.
In some embodiments, when the mapping point moves from inside a core area to outside it, the controller moves the cursor from the center coordinate position of the operation control to the mapping point coordinate position and removes the mark added to the control; at the same time, the controller retrieves the cursor's transparency parameter, adjusts it to the preset transparency, and renders the cursor accordingly, so that the user interface displays the cursor with the preset transparency at the mapping point coordinate position and the operation control with the focus-mark effect removed.
For example, referring to fig. 15, when the mapping point moves from position P to position Q, the controller moves the cursor to the center coordinate position of the operation control 910 and adds a mark to it; by extracting the focus rendering parameters it obtains the color, size, and line rendering parameters and renders the operation control 910 accordingly. At the same time, the controller retrieves the cursor's transparency parameter, adjusts it to 100%, and renders the cursor with the adjusted transparency, so that only the marked operation control is displayed in the user interface and the cursor is not displayed.
Referring to fig. 16, when the mapping point moves from position Q back to position P, the controller moves the cursor from the center coordinate position of the operation control D to position P and cancels the mark added to the control; at the same time, the controller retrieves the cursor's transparency parameter, adjusts it to 0%, and renders the cursor accordingly, so that the user interface displays the cursor with 0% transparency at position P and the operation control with the focus-mark effect removed.
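The enter/leave transitions described above amount to a small state update. A minimal sketch, using plain dicts as stand-ins for the control and cursor UI objects (an assumption for illustration):

```python
def on_mapping_point_update(in_core, control, cursor):
    """Entering a core area adds the focus mark and hides the cursor
    (transparency 100%); leaving removes the mark and restores the
    cursor's preset transparency."""
    if in_core:
        control["focused"] = True
        cursor["transparency"] = 1.0   # fully transparent: not displayed
    else:
        control["focused"] = False
        cursor["transparency"] = cursor.get("preset", 0.0)
    return control, cursor
```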
In some embodiments, the memory stores a sound effect program. When the mapping point moves from outside a core area to inside it, the controller moves the cursor to the center coordinate position of the operation control and at the same time starts the sound effect program, playing a preset sound effect through the loudspeaker to remind the user that the cursor has moved to the control's center coordinate position.
In some embodiments, the control device may be provided with a vibration module. When the mapping point moves from outside a core area to inside it, the controller moves the cursor to the center coordinate position of the operation control and at the same time sends the control device feedback information indicating that the cursor has moved there. After receiving the feedback information, the control device starts the vibration module to generate vibration of a preset frequency and duration, reminding the user that the cursor has moved to the control's center coordinate position.
Based on the above embodiments, an embodiment of the present application further provides a cursor positioning method. Fig. 17 is a schematic flowchart of the cursor positioning method provided by the present application; as shown in fig. 17, the method may be executed on the display device side, with the controller in the display device as the execution subject. The cursor positioning method includes:
s101: receiving a control instruction which is input by a user and indicates the movement of a cursor;
In some embodiments, the receiving a control instruction indicating cursor movement input by a user further comprises: receiving a mode selection instruction which is input by a user and indicates that a cursor mode is entered, wherein the cursor mode refers to a mode in which the control device controls the cursor to move in the user interface; responding to the mode selection instruction, and starting the cursor mode; displaying the cursor in the user interface.
In some embodiments, said detecting input coordinates of said control device in a reference coordinate system of said sensing region further comprises: acquiring signal source coordinates of the signal source position of the control instruction in a reference coordinate system of the induction area; acquiring a pose coordinate of the pose of the control device in a reference coordinate system of the induction area; and calculating the input coordinate according to the signal source coordinate and the pose coordinate.
In some embodiments, the obtaining mapping point coordinates further comprises: acquiring a conversion coefficient of the input coordinate and the mapping point coordinate, wherein the conversion coefficient is a ratio of a maximum value on each coordinate axis in a reference coordinate system of the sensing area to a maximum value on each coordinate axis in a reference coordinate system of the user interface; and calculating the coordinates of the input coordinates in a reference coordinate system of the user interface according to the conversion coefficient.
In some embodiments, the calculating a distance between the mapping point coordinates and the center coordinates of the operational control further comprises: acquiring boundary coordinates of the boundary position of the operation control in a reference coordinate system of the user interface; acquiring a median coordinate of the boundary coordinate on each edge boundary of the operation control; and obtaining the central coordinate according to the median coordinate.
In some embodiments, said controlling said cursor to move to said center coordinate position further comprises: adding a focus mark for the operation control; extracting focus rendering parameters; and displaying the marked operation control in the user interface according to the focus rendering parameter.
S102: detecting input coordinates of the control device in a reference coordinate system of the sensing area in response to the control command;
S103: acquiring mapping point coordinates according to the input coordinates, wherein the mapping point coordinates are coordinates converted from the input coordinates into a reference coordinate system of the user interface;
s104: calculating the distance between the mapping point coordinates and the center coordinates of the operation control;
S105: if the distance between the mapping point coordinate and the center coordinate is smaller than or equal to a first threshold value, controlling the cursor to move to the position of the center coordinate;
S106: and if the distance between the mapping point coordinate and the center coordinate is larger than a first threshold value, controlling the cursor to move to the mapping point coordinate position.
In some embodiments, the method further comprises: comparing the boundary coordinates and the mapping point coordinates; if the mapping point coordinate is within the range of the boundary coordinate, controlling the cursor to move to the central coordinate position; and if the mapping point coordinate is out of the range of the boundary coordinate, controlling the cursor to move to the mapping point coordinate position.
In some embodiments, said controlling said cursor to move to said mapped point coordinate location further comprises: obtaining a buffer point coordinate set in a reference coordinate system of the user interface, wherein the buffer point set comprises a plurality of buffer points, the buffer points are located outside the boundary coordinate range, and the distance between each buffer point and the boundary coordinate is smaller than a second threshold; if the mapping point coordinate is equal to any one of the buffer point coordinates, controlling the cursor to move to the central coordinate position of the operation control; and if the mapping point coordinate is not equal to any one of the buffer point coordinates, controlling the cursor to move to the mapping point coordinate position.
In some embodiments, the method further comprises: acquiring a moving instruction input by a user; calculating a difference value between the mapping point coordinates and the center coordinates after the movement in response to the movement instruction; if the difference value is smaller than or equal to the first threshold value, adding a focus mark to the operation control; and if the difference value is larger than the first threshold value, removing the focus mark from the operation control.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program that, when executed, may perform some or all of the steps of each embodiment of the cursor positioning method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts among the various embodiments in this specification may be referred to each other. In particular, for the embodiment of the display device, since it is substantially similar to the embodiment of the method, the description is simple, and for the relevant points, refer to the description in the embodiment of the method.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.

Claims (10)

1. A display device, comprising:
the display is used for displaying a user interface, and the user interface comprises a cursor and an operation control;
the communicator is used for receiving a control signal transmitted by a user through the control device in the sensing area;
a controller configured to:
receiving a control instruction which is input by a user and indicates the movement of a cursor;
in response to the control instruction, detecting input coordinates of the control device in a reference coordinate system of the sensing area;
acquiring mapping point coordinates according to the input coordinates, wherein the mapping point coordinates are coordinates converted from the input coordinates into a reference coordinate system of the user interface;
calculating the distance between the mapping point coordinate and the central coordinate of the operation control;
if the distance between the mapping point coordinate and the center coordinate is smaller than or equal to a first threshold value, controlling the cursor to move to the position of the center coordinate;
And if the distance between the mapping point coordinate and the center coordinate is larger than a first threshold value, controlling the cursor to move to the mapping point coordinate position.
2. The display device of claim 1, wherein prior to the receiving a user input of a control instruction indicating cursor movement, the controller is further configured to:
receiving a mode selection instruction which is input by a user and indicates that a cursor mode is entered, wherein the cursor mode refers to a mode that the cursor is controlled to move in the user interface through the control device;
responding to the mode selection instruction, and starting the cursor mode;
displaying the cursor in the user interface.
3. The display device of claim 1, wherein, in detecting the input coordinates of the control device in the reference coordinate system of the sensing area, the controller is further configured to:
acquire signal source coordinates of the signal source position of the control instruction in the reference coordinate system of the sensing area;
acquire pose coordinates of the pose of the control device in the reference coordinate system of the sensing area; and
calculate the input coordinates according to the signal source coordinates and the pose coordinates.
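The claim does not specify how the signal source coordinates and pose coordinates are combined. One plausible reconstruction, offered purely as an assumption, treats the pose as a pointing direction and intersects that ray with the sensing plane (taken here as the plane z = 0); the function name and plane choice are hypothetical:

```python
def input_coordinates(source, direction):
    """Hypothetical reconstruction of claim 3: cast a ray from the signal
    source along the device's pointing direction and take its intersection
    with the sensing plane z == 0 as the input point."""
    sx, sy, sz = source       # 3D position of the signal source
    dx, dy, dz = direction    # pointing direction derived from the pose
    if dz == 0:
        raise ValueError("pointing direction is parallel to the sensing plane")
    t = -sz / dz                       # ray parameter where the ray meets z == 0
    return (sx + t * dx, sy + t * dy)  # 2D input coordinates on the plane
```

A source 2 units above the plane pointing down and forward along x, for instance, yields an input point 2 units ahead of the source's footprint.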
4. The display device of claim 1, wherein, in acquiring the mapping point coordinates, the controller is further configured to:
acquire a conversion coefficient between the input coordinates and the mapping point coordinates, wherein the conversion coefficient is, for each coordinate axis, the ratio of the maximum value on that axis in the reference coordinate system of the sensing area to the maximum value on that axis in the reference coordinate system of the user interface; and
calculate the coordinates of the input coordinates in the reference coordinate system of the user interface according to the conversion coefficient.
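The per-axis scaling of claim 4 can be sketched as below. This is a minimal illustration under the claim's own definition of the conversion coefficient (sensing-area maximum divided by UI maximum on each axis); the function and parameter names are illustrative:

```python
def to_ui_coordinates(input_xy, sensor_max, ui_max):
    """Map sensing-area input coordinates into the user interface's
    reference coordinate system using the per-axis conversion coefficient
    defined in claim 4."""
    coeff_x = sensor_max[0] / ui_max[0]  # conversion coefficient, x axis
    coeff_y = sensor_max[1] / ui_max[1]  # conversion coefficient, y axis
    # dividing the input coordinate by the coefficient yields the mapping point
    return (input_xy[0] / coeff_x, input_xy[1] / coeff_y)
```

For a 3840×2160 sensing area and a 1920×1080 interface, the coefficient is 2 on both axes, so an input at (1920, 1080) maps to the UI point (960, 540).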
5. The display device of claim 1, wherein, in calculating the distance between the mapping point coordinates and the center coordinates of the operation control, the controller is further configured to:
acquire boundary coordinates of the boundary position of the operation control in the reference coordinate system of the user interface;
acquire the median coordinate of the boundary coordinates on each boundary edge of the operation control; and
obtain the center coordinates from the median coordinates.
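Deriving the center from boundary coordinates, as in claim 5, can be sketched for the common case of an axis-aligned rectangular control, where the median coordinate on each edge reduces to the midpoint of the control's extents. The claim does not restrict the control's shape, so this is a simplifying assumption:

```python
def control_center(boundary_points):
    """Approximate claim 5 for a rectangular control: the median coordinate
    on each boundary edge is the midpoint of that edge, so the center is
    the midpoint of the x and y extents of the boundary coordinates."""
    xs = sorted(p[0] for p in boundary_points)
    ys = sorted(p[1] for p in boundary_points)
    cx = (xs[0] + xs[-1]) / 2  # midpoint of the horizontal extent
    cy = (ys[0] + ys[-1]) / 2  # midpoint of the vertical extent
    return (cx, cy)
```

Given the four corners of a 10×4 control anchored at the origin, this returns (5.0, 2.0), the midpoint a cursor would snap to.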
6. The display device of claim 5, wherein the controller is further configured to:
compare the boundary coordinates with the mapping point coordinates;
if the mapping point coordinates are within the range of the boundary coordinates, control the cursor to move to the position of the center coordinates; and
if the mapping point coordinates are outside the range of the boundary coordinates, control the cursor to move to the position of the mapping point coordinates.
7. The display device of claim 6, wherein, in controlling the cursor to move to the position of the mapping point coordinates, the controller is further configured to:
obtain a buffer point coordinate set in the reference coordinate system of the user interface, wherein the buffer point coordinate set comprises a plurality of buffer point coordinates, each buffer point coordinate lying outside the range of the boundary coordinates at a distance from the boundary coordinates less than a second threshold;
if the mapping point coordinates are equal to any one of the buffer point coordinates, control the cursor to move to the position of the center coordinates of the operation control; and
if the mapping point coordinates are not equal to any one of the buffer point coordinates, control the cursor to move to the position of the mapping point coordinates.
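The buffer-zone check of claim 7 can be sketched as a membership test against the precomputed buffer point set: points just outside the control's boundary still attract the cursor to the center. Names are illustrative, and representing the set as a Python `set` of coordinate tuples is an assumption:

```python
def resolve_with_buffer(mapping_point, center, buffer_points):
    """Sketch of claim 7: if the mapped point coincides with any buffer
    point (a point just outside the control's boundary, within the second
    threshold), snap the cursor to the control's center; otherwise leave
    it at the mapped point."""
    if mapping_point in buffer_points:  # buffer-zone hit near the boundary
        return center                   # snap onto the control anyway
    return mapping_point                # genuinely outside: free movement
```

This gives the control an effective "magnetic" margin: a mapped point one pixel outside the boundary but inside the buffer set still lands on the control's center.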
8. The display device of claim 1, wherein, in controlling the cursor to move to the position of the center coordinates, the controller is further configured to:
add a focus mark to the operation control;
extract focus rendering parameters; and
display the marked operation control in the user interface according to the focus rendering parameters.
9. The display device of claim 1, wherein the controller is further configured to:
acquire a movement instruction input by a user;
in response to the movement instruction, calculate the difference between the mapping point coordinates after the movement and the center coordinates;
if the difference is less than or equal to the first threshold, add the focus mark to the operation control; and
if the difference is greater than the first threshold, remove the focus mark from the operation control.
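The focus-tracking behavior of claim 9 can be sketched as re-evaluating the distance after each move and toggling the focus mark accordingly; the boolean return stands in for the add/remove-focus-mark actions, and all names are illustrative:

```python
import math

def update_focus(mapping_point, center, first_threshold):
    """Sketch of claim 9: after a movement instruction, recompute the
    distance between the moved mapping point and the control's center.
    Return True to add (or keep) the focus mark, False to remove it."""
    distance = math.hypot(mapping_point[0] - center[0],
                          mapping_point[1] - center[1])
    if distance <= first_threshold:
        return True   # within threshold: the control gains/keeps focus
    return False      # beyond threshold: the focus mark is removed
```

Calling this on every cursor update makes the focus mark follow the cursor: the control is highlighted while the cursor hovers near its center and released as the cursor moves away.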
10. A cursor positioning method, comprising:
receiving a control instruction, input by a user, indicating movement of a cursor;
in response to the control instruction, detecting input coordinates of a control device in a reference coordinate system of a sensing area;
acquiring mapping point coordinates according to the input coordinates, wherein the mapping point coordinates are the input coordinates converted into a reference coordinate system of a user interface;
calculating a distance between the mapping point coordinates and center coordinates of an operation control;
if the distance between the mapping point coordinates and the center coordinates is less than or equal to a first threshold, controlling the cursor to move to the position of the center coordinates; and
if the distance between the mapping point coordinates and the center coordinates is greater than the first threshold, controlling the cursor to move to the position of the mapping point coordinates.
CN202210429564.1A 2022-04-22 2022-04-22 Display device and cursor positioning method Pending CN114760513A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210429564.1A CN114760513A (en) 2022-04-22 2022-04-22 Display device and cursor positioning method

Publications (1)

Publication Number Publication Date
CN114760513A true CN114760513A (en) 2022-07-15

Family

ID=82331215

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210429564.1A Pending CN114760513A (en) 2022-04-22 2022-04-22 Display device and cursor positioning method

Country Status (1)

Country Link
CN (1) CN114760513A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5805165A (en) * 1995-08-31 1998-09-08 Microsoft Corporation Method of selecting a displayed control item
CN101568896A (en) * 2007-06-08 2009-10-28 索尼株式会社 Information processing apparatus, input device, information processing system, information processing method, and program
US20110169734A1 (en) * 2010-01-12 2011-07-14 Cho Sanghyun Display device and control method thereof
CN103914156A (en) * 2013-01-02 2014-07-09 三星电子株式会社 Method For Compensating Coordinates By Using Display Apparatus And Input Apparatus
CN103997668A (en) * 2014-02-25 2014-08-20 华为技术有限公司 Method for displaying selection of mobile equipment, and terminal equipment
CN104331212A (en) * 2013-07-22 2015-02-04 原相科技股份有限公司 Cursor positioning method of handheld pointing device
CN104363495A (en) * 2014-11-27 2015-02-18 北京奇艺世纪科技有限公司 Method and device for conducting focus switching control through remote control of terminal device
CN104869470A (en) * 2015-05-25 2015-08-26 广州创维平面显示科技有限公司 Realization method of automatically capturing UI focal point according to remote control cursor position and system thereof
US20160139692A1 (en) * 2014-11-19 2016-05-19 Screenovate Technologies Ltd. Method and system for mouse control over multiple screens
CN106162276A (en) * 2015-03-30 2016-11-23 腾讯科技(深圳)有限公司 Intelligent television system input method and device, terminal auxiliary input method and device
CN108255317A (en) * 2018-02-08 2018-07-06 北京硬壳科技有限公司 Method and device for controlling cursor
CN108886634A (en) * 2016-04-25 2018-11-23 Lg电子株式会社 Display device and the method for operating display device
CN112799576A (en) * 2021-02-22 2021-05-14 Vidaa美国公司 Virtual mouse moving method and display device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116048313A (en) * 2022-08-25 2023-05-02 荣耀终端有限公司 Cursor control method, cursor control device and storage medium
CN116048313B (en) * 2022-08-25 2024-04-16 荣耀终端有限公司 Cursor control method, cursor control device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination