CN107835461B - Focus movement control method, smart television and computer-readable storage medium - Google Patents


Info

Publication number
CN107835461B
CN107835461B (application CN201711066952.3A)
Authority
CN
China
Prior art keywords
focus
sub
position information
current
direction key
Prior art date
Legal status
Active
Application number
CN201711066952.3A
Other languages
Chinese (zh)
Other versions
CN107835461A (en)
Inventor
李贞贞
Current Assignee
Shenzhen leynew Network Media Co Ltd
Original Assignee
Shenzhen Leiniao Network Media Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Leiniao Network Media Co ltd
Priority to CN201711066952.3A
Publication of CN107835461A
Application granted
Publication of CN107835461B

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438Window management, e.g. event handling following interaction with the user interface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a focus movement control method comprising the following steps: acquiring position information of the sub-control corresponding to the current focus, and determining the position information of the corresponding focus frame according to it; receiving direction key information triggered by a user, and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame; judging whether the position information of the sub-control corresponding to the next focus exceeds a preset focus movement boundary; if so, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, while keeping the position of the focus frame unchanged; if not, controlling the focus frame to move in the corresponding direction according to the direction key information. The invention also discloses a smart television and a computer-readable storage medium. The invention saves system resources without destroying the original system structure.

Description

Focus movement control method, smart television and computer-readable storage medium
Technical Field
The invention relates to the technical field of focus movement control, in particular to a focus movement control method, an intelligent television and a computer readable storage medium.
Background
In Android platform development, focus movement and interface content scrolling generally adopt a smooth-movement mode so that the user has a better experience. At present, there are two main methods for implementing the smooth-movement effect: 1) implementation in the application layer, where all the sub-controls and the focus frame in the display interface are usually redrawn frame by frame through a copy-drawing function; although the system itself is not modified, redrawing when neither the sub-controls nor the focus frame has changed wastes system resources; 2) implementation in the framework layer, which modifies the original controls of the Android system layer; this damages the structure of the original Android system to varying degrees, resulting in poor system stability.
Disclosure of Invention
The invention mainly aims to provide a focus movement control method, a smart television and a computer-readable storage medium, with the goal of saving system resources without damaging the original system structure.
To achieve the above object, the present invention provides a focus movement control method including:
acquiring position information of a sub-control corresponding to a current focus, and determining the position information of a corresponding focus frame according to the position information of the sub-control corresponding to the current focus;
receiving direction key information triggered by a user, and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame;
judging whether the position information of the sub-control corresponding to the next focus exceeds a preset focus moving boundary or not;
if so, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and keeping the position of the focus frame unchanged;
if not, controlling the focus frame to move in the corresponding direction according to the direction key information.
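The five steps above amount to one branch decision per key press. The following minimal Java sketch illustrates that branch under assumed names: `FocusMover`, the 1920-pixel boundary and the step width are illustrative only and do not come from the patent, which delegates the actual scrolling to the system's native interface.

```java
// Hypothetical sketch of the decision branch; names and constants are
// illustrative, not from the patent.
public class FocusMover {
    // Screen region within which the focus frame itself may move (assumed 1920 px wide).
    static final int BOUNDARY_LEFT = 0, BOUNDARY_RIGHT = 1920;

    /** Returns true if the focus frame moves, false if the content scrolls instead. */
    public static boolean onDirectionKey(int frameX, int childWidth, boolean rightKey) {
        // Position of the sub-control corresponding to the next focus.
        int nextX = rightKey ? frameX + childWidth : frameX - childWidth;
        boolean insideBoundary = nextX >= BOUNDARY_LEFT && nextX + childWidth <= BOUNDARY_RIGHT;
        if (insideBoundary) {
            // Next sub-control is on screen: only the focus frame is redrawn.
            return true;
        }
        // Next sub-control exceeds the boundary: scroll the sub-controls via the
        // native interface and leave the focus frame where it is.
        return false;
    }
}
```

When the frame stays inside the boundary only the frame is redrawn, which is the source of the resource saving claimed above.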
Optionally, the step of, if so, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information and keeping the position of the focus frame unchanged includes:
if so, determining whether the row/column of the sub-control corresponding to the next focus is an edge row/column according to the position information of the sub-control corresponding to the next focus;
when the row/column of the sub-control corresponding to the next focus is not an edge row/column, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and keeping the position of the focus frame unchanged;
when the row/column of the sub-control corresponding to the next focus is an edge row/column, supplementing a row/column of blank sub-controls beyond the edge row/column, then calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and keeping the position of the focus frame unchanged.
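The edge-row/column handling above can be sketched as follows. This is a hypothetical, index-based illustration: `padToFullRows` appends blank placeholders so the last row is complete, and `isEdgeRow` detects whether the next focus lands on that row; the patent itself works with position information rather than indices.

```java
import java.util.List;

// Illustrative sketch of the blank-padding step: when the next focus lands on
// the edge row of an incomplete grid, blank sub-controls are appended so the
// native scroll still moves by a full row. Names are hypothetical.
public class EdgePadding {
    /** Pads the child list to a whole number of rows and returns the padded size. */
    public static int padToFullRows(List<String> children, int columns) {
        int remainder = children.size() % columns;
        if (remainder != 0) {
            for (int i = remainder; i < columns; i++) {
                children.add("");   // blank sub-control placeholder
            }
        }
        return children.size();
    }

    /** True when the given index falls on the last (edge) row of the grid. */
    public static boolean isEdgeRow(int nextIndex, int total, int columns) {
        int lastRow = (total - 1) / columns;   // zero-based row of the last child
        return nextIndex / columns == lastRow;
    }
}
```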
Optionally, the step of obtaining the position information of the sub-control corresponding to the current focus and determining the position information of the corresponding focus frame according to the position information of the sub-control corresponding to the current focus includes:
acquiring position information and width and height data of a sub-control corresponding to a current focus;
calculating width and height data of a corresponding focus frame according to the width and height data of the sub-control corresponding to the current focus;
and creating a corresponding focus frame according to the width and height data of the corresponding focus frame and the position information of the sub-control corresponding to the current focus, and determining the position information of the focus frame.
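As an illustration of these three sub-steps, the sketch below derives the focus frame's width/height from the sub-control's and centres the frame on it. The `FocusFrame` class and the 1.1 scale factor are assumptions for the example; the patent does not specify how the frame's dimensions are calculated from the sub-control's.

```java
// Hypothetical focus-frame geometry: frame slightly larger than the focused
// sub-control and centred on it. The 1.1 scale factor is an assumption.
public class FocusFrame {
    public final int left, top, width, height;

    public FocusFrame(int childLeft, int childTop, int childWidth, int childHeight) {
        // Frame size derived from the sub-control's width/height data (step S12).
        this.width  = (int) (childWidth  * 1.1);
        this.height = (int) (childHeight * 1.1);
        // Frame position centred on the sub-control (step S13).
        this.left = childLeft - (this.width  - childWidth)  / 2;
        this.top  = childTop  - (this.height - childHeight) / 2;
    }
}
```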
Optionally, before the step of receiving direction key information triggered by a user and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame, the method further includes:
acquiring the slidable direction of the current screen, wherein the slidable direction comprises up-down sliding and left-right sliding.
Optionally, the step of receiving direction key information triggered by a user, and calculating position information of a sub-control corresponding to a next focus according to the direction key information and the position information of the current focus frame includes:
when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is left/right, calculating the distance between the current focus and the left/right boundary of the current screen according to the position information of the current focus frame and the number of columns of sub-controls in the current screen;
when the distance between the current focus and the left/right boundary of the current screen is greater than zero, calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame.
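For intuition, the left/right case on an up-down-sliding screen can be reduced to a column-distance check: the move is allowed only while the distance to the screen's left/right boundary is greater than zero. The index-based helper below is hypothetical; the patent computes the distance from pixel position information.

```java
// Hypothetical column-distance helper for a vertically scrolling grid.
public class HorizontalMove {
    /** Distance, in columns, from the current focus to the left or right boundary. */
    public static int distanceToBoundary(int currentIndex, int columns, boolean rightKey) {
        int column = currentIndex % columns;          // zero-based column of the focus
        return rightKey ? (columns - 1) - column : column;
    }
}
```

A zero distance means the focus already sits on the boundary column, so a further left/right press does nothing.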
Optionally, the step of receiving direction key information triggered by a user, and calculating position information of a sub-control corresponding to a next focus according to the direction key information and the position information of the current focus frame further includes:
when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is up/down, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information of the current focus frame, the number of columns of sub-controls in the current screen, and the total number of sub-controls.
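The up/down calculation can likewise be sketched with indices: moving down adds one column count to the current position, and the total number of sub-controls caps the move. `GridNavigator` and its index arithmetic are illustrative assumptions, not the patent's pixel-based formula.

```java
// Hypothetical up/down navigation in a vertically scrolling grid.
public class GridNavigator {
    /** Returns the next focus index, or the current one if the move would leave the grid. */
    public static int nextIndex(int current, boolean downKey, int columns, int total) {
        int candidate = downKey ? current + columns : current - columns;
        if (candidate < 0 || candidate >= total) {
            return current;   // beyond the first/last row: focus stays put
        }
        return candidate;
    }
}
```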
Optionally, the step of receiving direction key information triggered by a user, and calculating position information of a sub-control corresponding to a next focus according to the direction key information and the position information of the current focus frame includes:
when the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is up/down, calculating the distance between the current focus and the upper/lower boundary of the current screen according to the position information of the current focus frame and the number of rows of sub-controls in the current screen;
when the distance between the current focus and the upper/lower boundary of the current screen is greater than zero, calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame.
Optionally, the step of receiving direction key information triggered by a user, and calculating position information of a sub-control corresponding to a next focus according to the direction key information and the position information of the current focus frame further includes:
when the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is left/right, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information of the current focus frame, the number of rows of sub-controls in the current screen, and the total number of sub-controls.
In addition, to achieve the above object, the present invention further provides a smart tv, including: a memory, a processor and a focus movement control program stored on the memory and executable on the processor, the focus movement control program when executed by the processor implementing the steps of the focus movement control method as described above.
Further, to achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon a focus movement control program which, when executed by a processor, realizes the steps of the focus movement control method as described above.
The method comprises the steps of obtaining position information of a sub-control corresponding to a current focus, and determining the position information of a corresponding focus frame according to the position information of the sub-control corresponding to the current focus; receiving direction key information triggered by a user, and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame; judging whether the position information of the sub-control corresponding to the next focus exceeds a preset focus moving boundary or not; if so, calling a system native interface to control the sub-control corresponding to the current focus to move to the corresponding direction according to the direction key information, and controlling the position of the focus frame to be unchanged; if not, controlling the focus frame to move towards the corresponding direction according to the direction key information. 
In the above manner, the position information of the corresponding focus frame is determined from the acquired position information of the sub-control corresponding to the current focus. When direction key information triggered by the user through the remote controller of the smart television or the direction keys of an external keyboard is received, the position information of the sub-control corresponding to the next focus is calculated from the direction key information and the position information of the current focus frame. Because of system rules, the focus can move only within a certain range of the screen; that is, a corresponding focus movement boundary exists, so it must be judged whether the position information of the sub-control corresponding to the next focus exceeds the preset focus movement boundary. If it exceeds the boundary, a system native interface is called to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information while the position of the focus frame is kept unchanged; that is, the sub-controls on the screen are moved by calling the screen-scrolling function of the native interface. If it does not exceed the boundary, the focus frame is controlled to move in the corresponding direction according to the direction key information; in this case only the focus frame moves, so the terminal needs to redraw only the focus frame and not the sub-controls, thereby saving system resources.
Drawings
Fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a focus movement control method according to a first embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a detailed flow of determining position information of a corresponding focus frame according to position information of a sub-control corresponding to a current focus in an embodiment of the present invention;
fig. 4 is a detailed flowchart of the refined step of, if so, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information and keeping the position of the focus frame unchanged, in an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a focus movement control method according to a second embodiment of the present invention;
FIG. 6 is a schematic diagram of a focus frame moving process when the slidable direction of the current screen is up and down according to an embodiment of the focus movement control method of the present invention;
fig. 7 is a schematic diagram of focus frame movement when the slidable direction of the current screen is left-right sliding, according to an embodiment of the focus movement control method of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the prior art, when developing on the Android platform, focus movement and interface content scrolling generally adopt a smooth-movement mode so that the user has a better experience. At present, there are two main methods for implementing the smooth-movement effect: 1) implementation in the application layer, where all the sub-controls and the focus frame in the display interface are usually redrawn frame by frame through a copy-drawing function; although the system itself is not modified, redrawing when neither the sub-controls nor the focus frame has changed wastes system resources; 2) implementation in the framework layer, which modifies the original controls of the Android system layer; this damages the structure of the original Android system to varying degrees, resulting in poor system stability.
In order to solve the technical problem, the invention provides a focus movement control method, a smart television and a computer readable storage medium, wherein the position information of a corresponding focus frame is determined by acquiring the position information of a sub-control corresponding to a current focus and according to the position information of the sub-control corresponding to the current focus; receiving direction key information triggered by a user, and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame; judging whether the position information of the sub-control corresponding to the next focus exceeds a preset focus moving boundary or not; if so, calling a system native interface to control the sub-control corresponding to the current focus to move to the corresponding direction according to the direction key information, and controlling the position of the focus frame to be unchanged; if not, controlling the focus frame to move towards the corresponding direction according to the direction key information. 
In the above manner, the position information of the corresponding focus frame is determined from the acquired position information of the sub-control corresponding to the current focus. When direction key information triggered by the user through the remote controller of the smart television or the direction keys of an external keyboard is received, the position information of the sub-control corresponding to the next focus is calculated from the direction key information and the position information of the current focus frame. Because of system rules, the focus can move only within a certain range of the screen; that is, a corresponding focus movement boundary exists, so it must be judged whether the position information of the sub-control corresponding to the next focus exceeds the preset focus movement boundary. If it exceeds the boundary, a system native interface is called to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information while the position of the focus frame is kept unchanged; that is, the sub-controls on the screen are moved by calling the screen-scrolling function of the native interface. If it does not exceed the boundary, the focus frame is controlled to move in the corresponding direction according to the direction key information; in this case only the focus frame moves, so the terminal needs to redraw only the focus frame and not the sub-controls, thereby saving system resources.
Referring to fig. 1, fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention.
The terminal is an intelligent television provided with an Android system.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to implement connection and communication among these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wi-Fi interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a focus movement control program.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with the backend server; the user interface 1003 is mainly used for connecting a client and performing data communication with the client; and the processor 1001 may be configured to call the focus movement control program stored in the memory 1005 and perform the following operations:
acquiring position information of a sub-control corresponding to a current focus, and determining the position information of a corresponding focus frame according to the position information of the sub-control corresponding to the current focus;
receiving direction key information triggered by a user, and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame;
judging whether the position information of the sub-control corresponding to the next focus exceeds a preset focus moving boundary or not;
if so, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and keeping the position of the focus frame unchanged;
if not, controlling the focus frame to move in the corresponding direction according to the direction key information.
Further, the processor 1001 may call the focus movement control program stored in the memory 1005, and also perform the following operations:
if so, determining whether the row/column of the sub-control corresponding to the next focus is an edge row/column according to the position information of the sub-control corresponding to the next focus;
when the row/column of the sub-control corresponding to the next focus is not an edge row/column, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and keeping the position of the focus frame unchanged;
when the row/column of the sub-control corresponding to the next focus is an edge row/column, supplementing a row/column of blank sub-controls beyond the edge row/column, then calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and keeping the position of the focus frame unchanged.
Further, the processor 1001 may call the focus movement control program stored in the memory 1005, and also perform the following operations:
acquiring position information and width and height data of a sub-control corresponding to a current focus;
calculating width and height data of a corresponding focus frame according to the width and height data of the sub-control corresponding to the current focus;
and creating a corresponding focus frame according to the width and height data of the corresponding focus frame and the position information of the sub-control corresponding to the current focus, and determining the position information of the focus frame.
Further, the processor 1001 may call the focus movement control program stored in the memory 1005, and also perform the following operations:
acquiring the slidable direction of the current screen, wherein the slidable direction comprises up-down sliding and left-right sliding.
Further, the processor 1001 may call the focus movement control program stored in the memory 1005, and also perform the following operations:
when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is left/right, calculating the distance between the current focus and the left/right boundary of the current screen according to the position information of the current focus frame and the number of columns of sub-controls in the current screen;
when the distance between the current focus and the left/right boundary of the current screen is greater than zero, calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame.
Further, the processor 1001 may call the focus movement control program stored in the memory 1005, and also perform the following operations:
when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is up/down, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information of the current focus frame, the number of columns of sub-controls in the current screen, and the total number of sub-controls.
Further, the processor 1001 may call the focus movement control program stored in the memory 1005, and also perform the following operations:
when the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is up/down, calculating the distance between the current focus and the upper/lower boundary of the current screen according to the position information of the current focus frame and the number of rows of sub-controls in the current screen;
when the distance between the current focus and the upper/lower boundary of the current screen is greater than zero, calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame.
Further, the processor 1001 may call the focus movement control program stored in the memory 1005, and also perform the following operations:
when the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is left/right, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information of the current focus frame, the number of rows of sub-controls in the current screen, and the total number of sub-controls.
Based on the above hardware structure, an embodiment of the focus movement control method of the present invention is provided.
The invention provides a focus movement control method.
Referring to fig. 2, fig. 2 is a flowchart illustrating a focus movement control method according to a first embodiment of the present invention.
In an embodiment of the present invention, the focus movement control method includes:
step S10, acquiring the position information of the sub-control corresponding to the current focus, and determining the position information of the corresponding focus frame according to the position information of the sub-control corresponding to the current focus;
in the embodiment of the invention, the terminal is a smart television running the Android system. The embodiment is applicable to a scenario in which the user controls the focus movement of the terminal through a remote controller or the direction keys of an external keyboard in order to browse the controls on the smart-television screen, where the focus exists in the form of a focus frame so that the user can easily see which sub-control is currently selected.
In the embodiment of the invention, the terminal firstly determines the position information of the corresponding focus frame according to the position information of the sub-control attached to the current focus. Specifically, referring to fig. 3, fig. 3 is a schematic diagram of a refining process of determining the position information of the corresponding focus frame according to the position information of the sub-control corresponding to the current focus in the embodiment of the present invention. Step S10 includes:
step S11, acquiring position information and width and height data of the child control corresponding to the current focus;
step S12, calculating the width and height data of the corresponding focus frame according to the width and height data of the sub-control corresponding to the current focus;
and step S13, creating a corresponding focus frame according to the width and height data of the corresponding focus frame and the position information of the sub-control corresponding to the current focus, and determining the position information of the focus frame.
The terminal first obtains the position information and the width and height data of the sub-control corresponding to the current focus, where the position of each sub-control may be marked using the numbering manner of fig. 6 or fig. 7, and the width and height of a sub-control are denoted CVw and CVh. The terminal then calculates the width and height data of the corresponding focus frame, denoted Fw and Fh, according to the width and height data of the sub-control to which the current focus is attached. The width and height of the focus frame may be calculated according to the frame size in a preset focus frame display rule: for example, when the preset rule specifies that the frame size is the same as that of the sub-control, the width and height of the focus frame equal those of the sub-control; when the preset rule specifies that the frame size is 1.2 times that of the sub-control, the width of the focus frame is Fw = CVw × 1.2 and the height is Fh = CVh × 1.2. A corresponding focus frame is then created according to the width and height data of the focus frame and the position information of the sub-control corresponding to the current focus, and the position information of the focus frame is determined. The position information of the focus frame may be represented in the same manner as that of the sub-control.
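As a minimal illustration of the sizing rules just described (the function name, parameter names, and sample values are hypothetical; the patent does not prescribe an implementation), the focus-frame dimensions Fw and Fh follow from the sub-control dimensions CVw and CVh and a scale factor taken from the preset display rule:

```python
def focus_frame_size(cv_w, cv_h, scale=1.2):
    """Return (Fw, Fh) for a sub-control of size (CVw, CVh).

    scale=1.0 models the rule where the frame matches the sub-control;
    scale=1.2 models the rule where the frame is 1.2 times its size.
    """
    return cv_w * scale, cv_h * scale

fw, fh = focus_frame_size(100, 60)          # 1.2x rule: roughly (120, 72)
fw2, fh2 = focus_frame_size(100, 60, 1.0)   # same-size rule
```

The frame's position would then be derived from the sub-control's position, e.g. so that the frame stays centered on the sub-control it highlights.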
It should be noted that the preset focus frame display rule may further include a display style of the focus frame, where the display style may include a rounded-corner effect (corner radius) and a border effect (border thickness, border color, and the like), so that the focus frame can be created according to the focus frame display style in the preset rule, providing a better visual effect for the user and improving the user experience. Of course, in a specific embodiment, the focus frame display rule may be user-defined, so as to meet the requirements of different users and further improve the user experience.
In addition, it should be noted that the terminal may divide the page into N × M sub-controls according to the layout of the controls in the page, where N is the number of rows and M the number of columns. To facilitate the calculation of the formulas in the following embodiments, the start and end coordinates of the control layout are denoted (SVx, SVy) and (EVx, EVy), the width and height of each sub-control are CVw and CVh, and the horizontal spacing and vertical spacing between sub-controls are Hs and Vs, respectively.
Step S20, receiving direction key information triggered by a user, and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame;
and when receiving direction key information triggered by a user, the terminal calculates the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame, wherein the user can trigger the direction key information by pressing a direction keyboard of a remote controller or an external keyboard of the smart television, and the direction key information comprises upward information, downward information, leftward information and rightward information.
Step S30, judging whether the position information of the sub-control corresponding to the next focus exceeds a preset focus moving boundary;
due to the system rules, the focus can only move within a certain range on the screen, that is, the system is preset with a corresponding focus moving boundary, so at this point it is necessary to determine whether the position information of the sub-control corresponding to the next focus exceeds the preset focus moving boundary. To facilitate the calculation of the formulas in the following embodiments, the upper boundary T of the preset focus movement boundary has coordinates (Tx, Ty); the lower boundary B, coordinates (Bx, By); the left boundary L, coordinates (Lx, Ly); and the right boundary R, coordinates (Rx, Ry). When the slidable direction of the current screen is up-down sliding, the number of rows the focus can move is Fm = (By − Ty)/Fh; when the slidable direction is left-right sliding, the number of columns the focus can move is Fm = (Rx − Lx)/Fw.
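Under boundary coordinates of this kind, the movable row/column count Fm reduces to simple arithmetic. The sketch below illustrates it; the function names and the use of integer division are assumptions, not the patent's code:

```python
def movable_rows(ty, by, fh):
    """Rows the focus can occupy: vertical boundary extent over frame height Fh."""
    return (by - ty) // fh

def movable_cols(lx, rx, fw):
    """Columns the focus can occupy: horizontal boundary extent over frame width Fw."""
    return (rx - lx) // fw

# A boundary 240 px tall with 120 px tall frames yields the 2 movable rows of fig. 6.
rows = movable_rows(0, 240, 120)
```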
Specifically, referring to fig. 6 and 7, fig. 6 is a schematic diagram of focus frame movement when the slidable direction of the current screen is up-down sliding according to an embodiment of the focus movement control method of the present invention, and fig. 7 is a schematic diagram of focus frame movement when the slidable direction of the current screen is left-right sliding. As shown in fig. 6, when the slidable direction of the current screen is up-down sliding, the sub-controls are numbered as in fig. 6, and according to the system rule, the number of rows in which the focus can move is 2, that is, the focus can only move within the range of the display positions of sub-controls 1-6 in the drawing. As shown in fig. 7, when the slidable direction of the current screen is left-right sliding, the sub-controls are numbered as in fig. 7, and according to the system rule, the number of columns in which the focus can move is 2, that is, the focus can only move within the range of the display positions of sub-controls 1-6 in the drawing.
Step S41, if yes, calling a system native interface to control the sub-control corresponding to the current focus to move to the corresponding direction according to the direction key information, and controlling the position of the focus frame to be unchanged;
if the position information of the sub-control corresponding to the next focus exceeds the preset focus moving boundary, the system native interface is called to control the sub-control corresponding to the current focus to move towards the corresponding direction according to the direction key information, and the position of the focus frame is controlled to be unchanged, so that the sub-control of the screen is controlled to move through the screen rolling function of the native interface.
Specifically, please refer to fig. 4, which is a schematic diagram of the refining process of step S41: if the position information of the sub-control corresponding to the next focus exceeds the preset focus moving boundary, a system native interface is called to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and the position of the focus frame is controlled to remain unchanged. Step S41 includes:
step S411, if yes, determining whether the row/column of the sub-control corresponding to the next focus is an edge row/column according to the position information of the sub-control corresponding to the next focus;
step S412, when the row/column of the sub-control corresponding to the next focus is not an edge row/column, calling a system native interface to control the sub-control corresponding to the current focus to move to the corresponding direction according to the direction key information, and controlling the position of the focus frame to be unchanged;
step S413, when the row/column of the sub-control corresponding to the next focus is an edge row/column, a blank sub-control is used to supplement the rows/columns outside the edge row/column, and a system native interface is called to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, so as to control the position of the focus frame to be unchanged.
Considering that, due to the setting of the system rules, the focus cannot move to some edge rows/columns, so that some sub-controls cannot be selected, the embodiment of the present invention first determines whether the row/column of the sub-control corresponding to the next focus is an edge row/column; when it is, blank sub-controls may be added beyond the edge row/column so that the focus can move into it.
In the embodiment of the present invention, when the position information of the sub-control corresponding to the next focus exceeds the preset focus moving boundary, it is first determined whether the row/column of the sub-control corresponding to the next focus is an edge row/column according to the position information of the sub-control corresponding to the next focus.
When the row/column of the sub-control corresponding to the next focus is not an edge row/column, a system native interface is called to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information while the position of the focus frame is kept unchanged. For example, as shown in fig. 6, the slidable direction of the current screen is up-down sliding; when the user wants to move the focus frame from position 4 to position 7, the row of the sub-control at position 7 is not an edge row, i.e., not the last row after all the sub-controls are arranged and laid out, so it is only necessary to call the screen scrolling function of the system native interface to control the corresponding sub-controls to move upward while keeping the position of the focus frame unchanged. The sub-control movement can achieve a smooth-scrolling effect based on the position information of the current and next focus, the offset between them, and a movement duration preset for the desired effect. For another example, as shown in fig. 7, the slidable direction of the current screen is left-right sliding; when the user wants to move the focus frame from position 4 to position 7, the column of the sub-control at position 7 is not an edge column, i.e., not the last column after all the sub-controls are arranged and laid out, so it is only necessary to call the screen scrolling function of the system native interface to control the corresponding sub-controls to move left while keeping the position of the focus frame unchanged.
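The smooth movement mentioned here amounts to spreading the offset between the current and next focus positions over the preset duration. A rough sketch, where the frame interval and all names are assumptions:

```python
def scroll_step(current_pos, next_pos, duration_ms, frame_ms=16):
    """Per-frame offset for smoothly scrolling the sub-controls from the
    current focus position to the next one over duration_ms milliseconds."""
    frames = max(1, duration_ms // frame_ms)   # number of animation frames
    return (next_pos - current_pos) / frames
```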
When the row/column of the sub-control corresponding to the next focus is an edge row/column, blank sub-controls are added beyond the edge row/column, and a system native interface is called to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information while the position of the focus frame is kept unchanged.
For example, as shown in fig. 6, the slidable direction of the current screen is up-down sliding; when the user wants to move the focus frame from position 7 to position 10, the row of the sub-controls at position 10 is an edge row, that is, the last row after all the sub-controls are arranged and laid out, so blank sub-controls need to be added below the last row. The number of rows of blank sub-controls to add below is BHn = (EVy − By − BottomPadding)/(CVh + Vs), and the number of blank sub-controls to add below is BHn × M. Meanwhile, the screen scrolling function of the system native interface is called to control the corresponding sub-controls to move upward, keeping the position of the focus frame unchanged. Here, BottomPadding denotes the bottom inner margin (bottom blank) of the control.
For another example, as shown in fig. 7, the slidable direction of the current screen is left-right sliding; when the user wants to move the focus frame from position 7 to position 10, the column of the sub-controls at position 10 is an edge column, that is, the last column after all the sub-controls are arranged and laid out, so blank sub-controls need to be added to the right of the last column. The number of columns of blank sub-controls to add on the right is RVn = (EVx − Rx − RightPadding)/(CVw + Hs), and the number of blank sub-controls to add on the right is REn = RVn × N. Meanwhile, the screen scrolling function of the system native interface is called to control the corresponding sub-controls to move left, keeping the position of the focus frame unchanged. Here, RightPadding denotes the right inner margin (right blank) of the control.
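The two padding counts can be sketched as follows; the coordinate and spacing names follow the notation used in the text, while the function names and the use of integer division are assumptions:

```python
def blank_rows_below(ev_y, b_y, bottom_padding, cv_h, v_s, m):
    """BHn rows of blank sub-controls to add below the last row,
    and the resulting number of blanks, BHn * M."""
    bhn = (ev_y - b_y - bottom_padding) // (cv_h + v_s)
    return bhn, bhn * m

def blank_cols_right(ev_x, r_x, right_padding, cv_w, h_s, n):
    """RVn columns of blank sub-controls to add on the right,
    and the resulting number of blanks, REn = RVn * N."""
    rvn = (ev_x - r_x - right_padding) // (cv_w + h_s)
    return rvn, rvn * n
```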
And step S42, if not, controlling the focus frame to move towards the corresponding direction according to the direction key information.
If the position information of the sub-control corresponding to the next focus does not exceed the preset focus moving boundary, the focus frame is controlled to move in the corresponding direction according to the direction key information. In this case only the focus frame moves, so the terminal only needs to draw the focus frame and does not need to redraw the sub-controls, thereby saving system resources.
For example, as shown in fig. 6, the slidable direction of the current screen is up and down sliding, when the user wants to move the focus frame from position 1 to position 4, since the position of the sub-control at position 4 does not exceed the preset focus moving boundary, at this time, the focus frame only needs to be moved down. For another example, as shown in fig. 7, the slidable direction of the current screen is left-right sliding, and when the user wants to move the focus frame from position 1 to position 4, since the position of the sub-control at position 4 does not exceed the preset focus movement boundary, at this time, the focus frame only needs to be moved to the right.
The invention provides a focus movement control method, which comprises the steps of obtaining the position information of a sub-control corresponding to a current focus, and determining the position information of a corresponding focus frame according to the position information of the sub-control corresponding to the current focus; receiving direction key information triggered by a user, and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame; judging whether the position information of the sub-control corresponding to the next focus exceeds a preset focus moving boundary or not; if so, calling a system native interface to control the sub-control corresponding to the current focus to move to the corresponding direction according to the direction key information, and controlling the position of the focus frame to be unchanged; if not, controlling the focus frame to move towards the corresponding direction according to the direction key information. 
In the above manner, the position information of the corresponding focus frame is determined according to the acquired position information of the sub-control corresponding to the current focus. When direction key information triggered by the user through the remote controller of the smart television or the direction keys of an external keyboard is received, the position information of the sub-control corresponding to the next focus is calculated according to the direction key information and the position information of the current focus frame. Because of the system rules, the focus can only move within a certain range on the screen, that is, a corresponding focus moving boundary exists, so it is necessary to judge whether the position information of the sub-control corresponding to the next focus exceeds the preset focus moving boundary. If it does, a system native interface is called to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information while the position of the focus frame remains unchanged, so that the sub-controls on the screen are moved through the screen scrolling function of the native interface. If it does not, the focus frame is controlled to move in the corresponding direction according to the direction key information; in that case only the focus frame moves, so the terminal only needs to draw the focus frame and not the sub-controls, thereby saving system resources.
Referring to fig. 5, fig. 5 is a flowchart illustrating a focus movement control method according to a second embodiment of the present invention.
Based on the first embodiment shown in fig. 2, before step S20, the focus movement control method further includes:
in step S50, the slidable direction of the current screen is obtained, wherein the slidable direction includes up-down sliding and left-right sliding.
In the embodiment of the present invention, because the slidable directions of the current screen are different, and the calculation manners of the position information of the sub-control corresponding to the next focus are different, before calculating the position information of the sub-control corresponding to the next focus according to the received direction key information and the position information of the current focus frame, the slidable directions of the current screen also need to be obtained, where the slidable directions include up-down sliding and left-right sliding.
At this time, step S20 may include:
step S21, when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is left/right, calculating the distance between the current focus and the left/right boundary of the current screen according to the position information of the current focus frame and the number of columns of the sub-controls in the current screen;
and step S22, when the distance between the current focus and the left/right boundary of the current screen is larger than zero, calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame.
When the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is left/right, the distance between the current focus and the left/right boundary of the current screen is calculated according to the position information of the current focus frame and the number of columns M of sub-controls in the current screen. Specifically: the distance between the current focus and the left boundary of the current screen is FML = Cposition % M, and the distance from the right boundary is FMR = (Cposition + 1) % M.
When the distance between the current focus and the left/right boundary of the current screen is greater than zero, the position information of the sub-control corresponding to the next focus is calculated according to the direction key information and the position information of the current focus frame. Taking the focus numbering shown in fig. 6 as an example: when FML > 0 and the direction key information is left, the position of the sub-control corresponding to the next focus is Cposition = Cposition − 1; when FMR > 0 and the direction key information is right, it is Cposition = Cposition + 1.
It should be noted that when the distance between the current focus and the left/right boundary of the current screen is equal to zero, the current focus (current focus frame) is located in the leftmost/rightmost column of sub-controls; in this case the user operation is ignored, and neither the sub-controls nor the focus frame is moved.
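Steps S21-S22 and the boundary note can be condensed into one helper. This is an illustrative sketch using 0-based positions; the function name and use of Python are assumptions:

```python
def next_position_horizontal(cposition, m, key):
    """Next focus position for a 'left'/'right' key on an up/down-sliding
    screen with M columns; returns cposition unchanged on a boundary column."""
    fml = cposition % m          # distance to the left boundary
    fmr = (cposition + 1) % m    # zero exactly in the rightmost column
    if key == 'left' and fml > 0:
        return cposition - 1
    if key == 'right' and fmr > 0:
        return cposition + 1
    return cposition             # boundary column: ignore the key press

# With M = 3 (fig. 6 numbering, 0-based): position 3 ignores 'left',
# position 5 ignores 'right'.
```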
Step S20 may further include:
and step S23, when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is up/down, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information of the current focus frame, the number of columns of the sub-controls in the current screen and the total number of the sub-controls.
And when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is up/down, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information Cposition of the current focus frame, the column number M of the sub-controls in the current screen and the total number TotalSize of the sub-controls. The focus reference numerals shown in fig. 6 will be described as an example.
The specific calculation process is as follows:
When the direction key information is up: if (Cposition + 1) > M, then Cposition = Cposition − M; if (Cposition + 1) ≤ M, Cposition does not change, i.e., the position of the focus frame does not change.
When the direction key information is down,
when TotalSize ≥ (Cposition + 1 + M), Cposition = Cposition + M;
when TotalSize = (Cposition + 1), Cposition does not change, i.e., the position of the focus frame does not change;
when TotalSize % M = 0 and TotalSize < (Cposition + 1 + M), Cposition does not change, i.e., the position of the focus frame does not change;
when TotalSize % M > 0 and TotalSize < (Cposition + 1 + M) and (Cposition + 1) ≤ (TotalSize − TotalSize % M), Cposition = TotalSize − 1;
when TotalSize % M > 0 and TotalSize < (Cposition + 1 + M) and (Cposition + 1) > (TotalSize − TotalSize % M), Cposition does not change, i.e., the position of the focus frame does not change.
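The case analysis of step S23 collapses into the following sketch (0-based Cposition; the helper name and the use of Python are assumptions, and "TotalSize ≥ Cposition + 1 + M" is taken as the test for a sub-control existing directly below):

```python
def next_position_vertical(cposition, m, total_size, key):
    """Next focus position for an 'up'/'down' key on an up/down-sliding
    screen with M columns and TotalSize sub-controls (0-based Cposition)."""
    if key == 'up':
        # a row exists above only when (Cposition + 1) > M
        return cposition - m if cposition + 1 > m else cposition
    # key == 'down'
    if total_size >= cposition + 1 + m:
        return cposition + m                 # a sub-control sits directly below
    if total_size % m > 0 and cposition + 1 <= total_size - total_size % m:
        return total_size - 1                # jump to the last sub-control
    return cposition                         # already in the last row

# With M = 3 and 8 sub-controls, 'down' from position 5 lands on 7 (the last
# sub-control), because the final row is only partly filled.
```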
Step S20 may further include:
step S24, when the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is up/down, calculating the distance between the current focus and the upper/lower boundary of the current screen according to the position information of the current focus frame and the number of rows of sub-controls in the current screen;
and step S25, when the distance between the current focus and the upper/lower boundary of the current screen is greater than zero, calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame.
When the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is up/down, the distance between the current focus and the upper/lower boundary of the current screen is calculated according to the position information of the current focus frame and the number of rows N of sub-controls in the current screen. Specifically: the distance between the current focus and the upper boundary of the current screen is FMT = Cposition % N, and the distance from the lower boundary is FMB = (Cposition + 1) % N.
When the distance between the current focus and the upper/lower boundary of the current screen is greater than zero, the position information of the sub-control corresponding to the next focus is calculated according to the direction key information and the position information of the current focus frame. Taking the focus numbering shown in fig. 7 as an example: when FMT > 0 and the direction key information is up, the position of the sub-control corresponding to the next focus is Cposition = Cposition − 1; when FMB > 0 and the direction key information is down, it is Cposition = Cposition + 1.
It should be noted that when the distance between the current focus and the upper/lower boundary of the current screen is equal to zero, the current focus (current focus frame) is located in the top/bottom row of sub-controls; in this case the user operation is ignored, and neither the sub-controls nor the focus frame is moved.
Step S20 may further include:
and step S26, when the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is left/right, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information of the current focus frame, the number of rows of sub-controls in the current screen and the total number of the sub-controls.
When the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is left/right, the position information of the sub-control corresponding to the next focus is calculated according to the direction key information, the position information Cposition of the current focus frame, the number of rows N of sub-controls in the current screen, and the total number TotalSize of sub-controls. The focus numbering shown in fig. 7 is taken as an example.
The specific calculation process is as follows:
When the direction key information is left: if (Cposition + 1) > N, then Cposition = Cposition − N; if (Cposition + 1) ≤ N, Cposition does not change, i.e., the position of the focus frame does not change.
When the direction key information is to the right,
when TotalSize ≥ (Cposition + 1 + N), Cposition = Cposition + N;
when TotalSize = (Cposition + 1), Cposition does not change, i.e., the position of the focus frame does not change;
when TotalSize % N = 0 and TotalSize < (Cposition + 1 + N), Cposition does not change, i.e., the position of the focus frame does not change;
when TotalSize % N > 0 and TotalSize > (Cposition + 1) and TotalSize < (Cposition + 1 + N) and (Cposition + 1) ≤ (TotalSize − TotalSize % N), Cposition = TotalSize − 1;
when (Cposition + 1) > (TotalSize − TotalSize % N), Cposition does not change, i.e., the position of the focus frame does not change.
The present invention also proposes a computer-readable storage medium having stored thereon a focus movement control program which, when executed by a processor, implements the steps of the focus movement control method according to any one of the above embodiments.
The specific embodiment of the computer-readable storage medium of the present invention is substantially the same as the embodiments of the focus movement control method described above, and will not be described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (9)

1. A focus movement control method, characterized by comprising:
acquiring position information of a sub-control corresponding to a current focus, and determining the position information of a corresponding focus frame according to the position information of the sub-control corresponding to the current focus;
receiving direction key information triggered by a user, and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame;
judging whether the position information of the sub-control corresponding to the next focus exceeds a preset focus moving boundary or not;
if the position information of the sub-control corresponding to the next focus exceeds a preset focus moving boundary, determining whether the row/column of the sub-control corresponding to the next focus is an edge row/column according to the position information of the sub-control corresponding to the next focus;
when the row/column of the sub-control corresponding to the next focus is not an edge row/column, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and controlling the position of the focus frame to remain unchanged;
when the row/column of the sub-control corresponding to the next focus is an edge row/column, supplementing the row/column outside the edge row/column with blank sub-controls, calling a system native interface to control the sub-control corresponding to the current focus to move in the corresponding direction according to the direction key information, and controlling the position of the focus frame to remain unchanged;
and if the position information of the sub-control corresponding to the next focus does not exceed a preset focus moving boundary, controlling the focus frame to move in the corresponding direction according to the direction key information.
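The branching logic of claim 1 can be summarized as: scroll the content (keeping the focus frame fixed) when the next focus would cross the boundary, pad with blank sub-controls first when the next row/column is an edge one, and otherwise simply move the focus frame. A minimal illustrative sketch follows; the function and parameter names (`next_pos`, `boundary`, `is_edge_row_or_col`) are hypothetical and not taken from the patent's implementation.

```python
# Hypothetical sketch of the claim-1 decision flow. Positions are reduced
# to a single scalar coordinate for clarity; the patent works with full
# position information of the sub-control corresponding to the next focus.

def handle_direction_key(next_pos, boundary, is_edge_row_or_col):
    """Return which action claim 1 prescribes for a direction-key press.

    next_pos           -- position of the sub-control the focus would move to
    boundary           -- preset focus-moving boundary (maximum coordinate)
    is_edge_row_or_col -- whether that sub-control sits in an edge row/column
    """
    if next_pos > boundary:                  # next focus would leave the boundary
        if not is_edge_row_or_col:
            # scroll the grid under a stationary focus frame
            return "scroll content, keep focus frame still"
        # pad beyond the edge with blank sub-controls, then scroll
        return "pad with blanks, scroll content, keep focus frame still"
    # still inside the boundary: just move the frame itself
    return "move focus frame"
```

Keeping the frame stationary while the content scrolls is what gives the user the impression of a fixed "cursor" over a moving grid, which is the interaction style the claims describe.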
2. The focus movement control method according to claim 1, wherein the step of obtaining the position information of the sub-control corresponding to the current focus and determining the position information of the corresponding focus frame according to the position information of the sub-control corresponding to the current focus comprises:
acquiring position information and width and height data of a sub-control corresponding to a current focus;
calculating width and height data of a corresponding focus frame according to the width and height data of the sub-control corresponding to the current focus;
and creating a corresponding focus frame according to the width and height data of the corresponding focus frame and the position information of the sub-control corresponding to the current focus, and determining the position information of the focus frame.
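Claim 2 derives the focus frame's geometry from the focused sub-control's position and size. A sketch of one plausible mapping is below; the fixed `border` padding is an assumption, since the patent only states that the frame's width and height are calculated from the sub-control's width and height.

```python
# Illustrative sketch of claim 2: build a focus frame around a sub-control.
# The 4-pixel border is a hypothetical choice, not specified by the patent.

def make_focus_frame(x, y, width, height, border=4):
    """Return (x, y, w, h) of a focus frame drawn around a sub-control.

    (x, y)          -- top-left position of the focused sub-control
    width, height   -- the sub-control's dimensions
    border          -- assumed margin by which the frame exceeds the control
    """
    return (x - border, y - border, width + 2 * border, height + 2 * border)
```

For example, a 200×80 sub-control at (100, 50) would get a 208×88 frame at (96, 46), so the frame visibly surrounds the control on all sides.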
3. The method for controlling focus movement according to claim 1, wherein before the step of receiving direction key information triggered by a user and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame, the method comprises:
and acquiring the slidable direction of the current screen, wherein the slidable direction comprises up-down sliding and left-right sliding.
4. The method for controlling focus movement according to claim 3, wherein the step of receiving direction key information triggered by a user and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame comprises:
when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by a user is left/right, calculating the distance between the current focus and the left/right boundary of the current screen according to the position information of the current focus frame and the number of columns of sub-controls in the current screen;
and when the distance between the current focus and the left/right boundary of the current screen is larger than zero, calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame.
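On a grid laid out in fixed columns, the "distance" of claim 4 reduces to counting how many columns lie between the focus and the screen edge in the key's direction. The sketch below assumes sub-controls are indexed row-major in a grid; these names and the indexing scheme are illustrative, not from the patent.

```python
# Sketch of claim 4's boundary-distance check, assuming row-major indexing
# of sub-controls in a grid with `num_columns` columns.

def distance_to_side(focus_index, num_columns, key):
    """Columns between the focused sub-control and the screen edge
    in the direction of the pressed key ("left" or "right")."""
    col = focus_index % num_columns          # column of the current focus
    return col if key == "left" else (num_columns - 1 - col)
```

A distance greater than zero means a neighbouring sub-control exists in that direction, so the next focus position is computed; a distance of zero means the press hits the screen edge and no horizontal move is possible.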
5. The method for controlling focus movement according to claim 3, wherein the step of receiving direction key information triggered by a user and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame further comprises:
and when the slidable direction of the current screen is up-down sliding and the received direction key information triggered by the user is up/down, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information of the current focus frame, the number of columns of sub-controls in the current screen, and the total number of sub-controls.
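For an up/down press on a vertically sliding grid, claim 5 combines the column count and the total number of sub-controls: the next focus is one full row away, and the total bounds the valid index range. The sketch below is an assumption about how out-of-range moves are handled; the patent does not specify the exact formula.

```python
# Sketch of claim 5: compute the next focus index for an up/down press,
# assuming row-major indexing; clamping to [0, total) is a hypothetical
# way of rejecting moves past the first or last row.

def next_focus_index(current, num_columns, total, key):
    """Index of the sub-control one row above/below, or None if none exists."""
    candidate = current - num_columns if key == "up" else current + num_columns
    return candidate if 0 <= candidate < total else None
```

Claims 6 and 7 mirror this logic for left-right sliding screens, swapping the roles of rows and columns.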
6. The method for controlling focus movement according to claim 3, wherein the step of receiving direction key information triggered by a user and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame comprises:
when the slidable direction of the current screen is left-right sliding and the received direction key information triggered by a user is up/down, calculating the distance between the current focus and the upper/lower boundary of the current screen according to the position information of the current focus frame and the number of rows of sub-controls in the current screen;
and when the distance between the current focus and the upper/lower boundary of the current screen is greater than zero, calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame.
7. The method for controlling focus movement according to claim 3, wherein the step of receiving direction key information triggered by a user and calculating the position information of the sub-control corresponding to the next focus according to the direction key information and the position information of the current focus frame further comprises:
and when the slidable direction of the current screen is left-right sliding and the received direction key information triggered by the user is left/right, calculating the position information of the sub-control corresponding to the next focus according to the direction key information, the position information of the current focus frame, the number of rows of sub-controls in the current screen, and the total number of sub-controls.
8. A smart television, characterized in that the smart television comprises: a memory, a processor, and a focus movement control program stored on the memory and executable on the processor, the focus movement control program, when executed by the processor, implementing the steps of the focus movement control method according to any one of claims 1 to 7.
9. A computer-readable storage medium, characterized in that a focus movement control program is stored thereon, which, when executed by a processor, implements the steps of the focus movement control method according to any one of claims 1 to 7.
CN201711066952.3A 2017-11-02 2017-11-02 Focus movement control method, smart television and computer-readable storage medium Active CN107835461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711066952.3A CN107835461B (en) 2017-11-02 2017-11-02 Focus movement control method, smart television and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711066952.3A CN107835461B (en) 2017-11-02 2017-11-02 Focus movement control method, smart television and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN107835461A CN107835461A (en) 2018-03-23
CN107835461B true CN107835461B (en) 2021-09-28

Family

ID=61651613

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711066952.3A Active CN107835461B (en) 2017-11-02 2017-11-02 Focus movement control method, smart television and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN107835461B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108984239B (en) * 2018-05-29 2021-07-20 北京五八信息技术有限公司 Processing method, device and equipment for selecting control and storage medium
CN109871741A (en) * 2018-12-28 2019-06-11 青岛海信电器股份有限公司 The mask method and device of object are identified in a kind of image for smart television
CN109618206B (en) * 2019-01-24 2021-11-05 海信视像科技股份有限公司 Method and display device for presenting user interface
CN110505509B (en) * 2019-09-02 2021-03-16 四川长虹电器股份有限公司 Method for realizing global wall-hitting sound effect in smart television
CN111107408B (en) * 2019-12-30 2022-09-02 深圳Tcl数字技术有限公司 Focus movement control method, television and storage medium
CN111629245B (en) * 2020-05-29 2022-12-13 深圳Tcl数字技术有限公司 Focus control method, television and storage medium
CN111757154A (en) * 2020-06-01 2020-10-09 海信电子科技(深圳)有限公司 Method for controlling webpage cursor by remote controller and display equipment
WO2022120079A1 (en) * 2020-12-03 2022-06-09 VIDAA USA, Inc. Display apparatus
CN113703626A (en) * 2021-08-13 2021-11-26 北京小米移动软件有限公司 Focus control method, device, electronic device and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7155675B2 (en) * 2001-08-29 2006-12-26 Digeo, Inc. System and method for focused navigation within a user interface
CN102281415A (en) * 2010-06-08 2011-12-14 深圳Tcl新技术有限公司 Method for selecting options from page displayed in television
CN102984569A (en) * 2012-11-29 2013-03-20 中兴通讯股份有限公司 Method, device and system for controlling television interface focus
CN103533417A (en) * 2013-05-02 2014-01-22 乐视网信息技术(北京)股份有限公司 Human-computer interaction method and system based on list type rolling wheel group
CN103546818B (en) * 2013-10-31 2017-01-04 乐视致新电子科技(天津)有限公司 The focus control method of the list display interface of intelligent television and device
CN103970538B (en) * 2014-05-07 2017-12-12 Tcl集团股份有限公司 A kind of Android focuses transform method and system
CN106303652B (en) * 2015-05-27 2019-09-06 阿里巴巴集团控股有限公司 A kind of method for drafting and device of interface element
US10455270B2 (en) * 2016-03-15 2019-10-22 Sony Corporation Content surfing, preview and selection by sequentially connecting tiled content channels
CN105847930A (en) * 2016-03-22 2016-08-10 乐视网信息技术(北京)股份有限公司 Focus frame control method and device

Also Published As

Publication number Publication date
CN107835461A (en) 2018-03-23

Similar Documents

Publication Publication Date Title
CN107835461B (en) Focus movement control method, smart television and computer-readable storage medium
US20180232135A1 (en) Method for window displaying on a mobile terminal and mobile terminal
US11500513B2 (en) Method for icon display, terminal, and storage medium
US8907990B2 (en) Display system, display method, program, and recording medium
US10318605B2 (en) Method and device for relocating input box to target position in mobile terminal browser, storage medium
EP3220249A1 (en) Method, device and terminal for implementing regional screen capture
WO2015139408A1 (en) Method for managing application program icons, and terminal
CN104503655A (en) Application program interface display control method and device
US20140215308A1 (en) Web Page Reflowed Text
CN108156510B (en) Page focus processing method and device and computer readable storage medium
JP6255495B2 (en) Method and apparatus for displaying browser resource, and computer readable storage medium
CN110568974B (en) Sliding view display method and device and mobile terminal
WO2020000971A1 (en) Method and apparatus for switching global special effects, terminal device, and storage medium
US20160203628A1 (en) Information processing device editing map acquired from server
WO2014108024A1 (en) Method for interface object movement and apparatus for supporting interface object movement
CN107977342B (en) Document comparison method and device
CN107861711B (en) Page adaptation method and device
CN107566603B (en) Desktop layout method, intelligent terminal and computer storage medium
CN107391148B (en) View element saving method and device, electronic equipment and computer storage medium
US20160364031A1 (en) Storage medium, display control device, display control system, and display method
CN106126057B (en) Screen capture method and device and terminal equipment
US10140258B2 (en) Portable device and image displaying method thereof
CN105930090A (en) Method and system for transmitting coordinate data of touch screens on basis of mobile terminals
US10632379B2 (en) Method and apparatus for performing interaction in chessboard interface
CN108363525B (en) Method and device for responding to user gesture operation in webpage and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200214

Address after: Room 201, Building A, No. 1 Front Bay Road, Qianhai Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong 518000 (Qianhai Business Secretary)

Applicant after: Shenzhen leynew Network Media Co Ltd

Address before: Room 201, Building A, No. 1 Front Bay Road, Qianhai Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong 518000 (hosted by Shenzhen Qianhai Business Secretary Co., Ltd.)

Applicant before: Shenzhen leynew Mdt InfoTech Ltd

GR01 Patent grant