CN109343923B - Method and equipment for zooming user interface focus frame of intelligent television - Google Patents

Method and equipment for zooming user interface focus frame of intelligent television

Info

Publication number
CN109343923B
CN109343923B (application CN201811098713.0A)
Authority
CN
China
Prior art keywords
focus frame
area
image
key instruction
volume
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811098713.0A
Other languages
Chinese (zh)
Other versions
CN109343923A (en)
Inventor
高斯太
宋虎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Jukanyun Technology Co ltd
Original Assignee
Juhaokan Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Juhaokan Technology Co Ltd filed Critical Juhaokan Technology Co Ltd
Priority to CN201811098713.0A
Publication of CN109343923A
Priority to PCT/CN2019/106003 (published as WO2020057466A1)
Application granted
Publication of CN109343923B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312: Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47: End-user applications
    • H04N21/472: End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present disclosure provides a method and a device for zooming a user interface focus frame of a smart television. The method comprises: displaying a user interface in which a first focus frame is displayed; and, in response to a remote controller key instruction acting on the first focus frame, deleting the first focus frame and displaying a second focus frame, where if the key instruction indicates a focus frame zoom-in operation, the area of the first focus frame lies within the area of the second focus frame, and if the key instruction indicates a focus frame zoom-out operation, the area of the second focus frame lies within the area of the first focus frame. The remote controller key instruction includes a volume-up key instruction, a volume-down key instruction, and a confirm key instruction. With the method and device, the size of the user interface focus frame can be changed through remote controller keys, improving user experience.

Description

Method and device for zooming user interface focus frame of intelligent television
Technical Field
The present disclosure relates to the field of communications technologies, and in particular, to a method and an apparatus for zooming a user interface focus frame of a smart television.
Background
The smart television is a television product designed to meet users' diversified and personalized needs. Based on Internet application technology, it is equipped with an open operating system and chip, provides an open application platform, supports bidirectional human-computer interaction, and integrates functions such as audio and video, entertainment, and data services. It aims to bring a more convenient experience to users and has become the current development trend of televisions.
The smart television provides rich human-computer interaction functions. For example, a user may select a partial area of the user interface with the remote controller to form a focus frame, so that operations such as image processing or screenshot capture can be performed on the image within the focus frame.
However, after the focus frame is formed, how to zoom the focus frame through the remote controller is a problem that needs to be solved.
Disclosure of Invention
Embodiments of the present disclosure provide a method and a device for zooming a user interface focus frame of a smart television, so that the size of the focus frame can be changed through the remote controller, improving user experience.
In a first aspect, a method for zooming a user interface focus frame of a smart television is provided, including: displaying a user interface that contains a first focus frame; and, in response to a remote controller key instruction acting on the first focus frame, deleting the first focus frame and displaying a second focus frame. If the key instruction indicates a focus frame zoom-in operation, the area of the first focus frame lies within the area of the second focus frame; if the key instruction indicates a focus frame zoom-out operation, the area of the second focus frame lies within the area of the first focus frame.
In the foregoing embodiment, the currently displayed user interface contains a first focus frame. After a remote controller key instruction acting on the first focus frame is received, the first focus frame is deleted in response and a second focus frame is displayed, where the area of the second focus frame lies within the area of the first focus frame, or the area of the first focus frame lies within the area of the second focus frame, so that the focus frame can be zoomed with the remote controller.
Optionally, the remote controller key instruction is a volume-up key instruction, i.e., an instruction generated by operating the volume-up key on the remote controller; the volume-up key instruction indicates a focus frame zoom-in operation.
Optionally, the magnification of the second focus frame relative to the first focus frame is equal to a first set multiple, and the value of the first set multiple is greater than 1.
Optionally, if the size of the first focus frame is equal to the maximum focus frame size, or an edge of the first focus frame coincides with the maximum focus frame boundary, the first focus frame is retained.
Optionally, the remote controller key instruction is a volume-down key instruction, i.e., an instruction generated by operating the volume-down key on the remote controller; the volume-down key instruction indicates a focus frame zoom-out operation.
Optionally, the reduction multiple of the second focus frame relative to the first focus frame is equal to a second set multiple, and the value of the second set multiple is greater than 1.
Optionally, if the size of the first focus frame is equal to the minimum focus frame size, the first focus frame is retained.
Optionally, the remote controller key instruction is a confirm key instruction, i.e., an instruction generated by operating the confirm key on the remote controller. When responding to the confirm key instruction, the method further includes: acquiring zoom indication information, where the zoom indication information indicates either a focus frame zoom-in operation or a focus frame zoom-out operation; if the zoom indication information indicates a zoom-in operation, the area of the first focus frame lies within the area of the second focus frame; otherwise, the area of the second focus frame lies within the area of the first focus frame.
Optionally, the magnification of the second focus frame relative to the first focus frame is equal to a third set multiple whose value is greater than 1; or the reduction multiple of the second focus frame relative to the first focus frame is equal to a fourth set multiple whose value is greater than 1.
Optionally, if the size of the second focus frame is equal to the maximum focus frame size, or an edge of the second focus frame coincides with the maximum focus frame boundary, the zoom indication information is set to indicate a focus frame zoom-out operation; and if the size of the second focus frame is equal to the minimum focus frame size, the zoom indication information is set to indicate a focus frame zoom-in operation.
Optionally, if the size of the first focus frame is equal to the maximum focus frame size, or an edge of the first focus frame coincides with the maximum focus frame boundary, the second focus frame displayed in response to the confirm key instruction has the minimum focus frame size, the default focus frame size, or the size of the initial focus frame; or, if the size of the first focus frame is equal to the minimum focus frame size, the second focus frame displayed in response to the confirm key instruction has the maximum focus frame size, the default focus frame size, or the size of the initial focus frame. The initial focus frame is the focus frame selected by the user through the remote controller.
Optionally, when the second focus frame is larger than the first focus frame, the area of the first focus frame lies within the area of the second focus frame; when the second focus frame is smaller than the first focus frame, the area of the second focus frame lies within the area of the first focus frame.
Optionally, the shape of the second focus frame is the same as, or matches, the shape of the first focus frame.
In a second aspect, a device for zooming a user interface focus frame of a smart television is provided, including: a display unit, configured to display a user interface that contains a first focus frame; and a response unit, configured to, in response to a remote controller key instruction acting on the first focus frame, delete the first focus frame and control the display unit to display a second focus frame; the display unit is further configured to display the second focus frame under the control of the response unit. If the key instruction indicates a focus frame zoom-in operation, the area of the first focus frame lies within the area of the second focus frame; if the key instruction indicates a focus frame zoom-out operation, the area of the second focus frame lies within the area of the first focus frame.
In a third aspect, a smart television is provided, including a processor, a memory, and a display. The memory is configured to store computer instructions, and the processor executes the computer instructions to: control the display to display a user interface that contains a first focus frame; and, in response to a remote controller key instruction acting on the first focus frame, control the display to delete the first focus frame and display a second focus frame. If the key instruction indicates a focus frame zoom-in operation, the area of the first focus frame lies within the area of the second focus frame; if the key instruction indicates a focus frame zoom-out operation, the area of the second focus frame lies within the area of the first focus frame.
Drawings
FIG. 1 is a diagram of a system architecture to which the present disclosure is applicable;
fig. 2 is a flowchart of a method for zooming a user interface focus frame of an intelligent television according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a first focus box displayed in a user interface in an image processing application of an intelligent television according to an embodiment of the present disclosure;
FIG. 4a is a schematic diagram illustrating an enlarged focus frame based on a center point in the embodiment of the present disclosure;
FIG. 4b is a schematic diagram illustrating an enlargement of a focus frame based on a vertex at a lower left corner in the embodiment of the present disclosure;
FIG. 5a is a schematic diagram of a first focus frame and a second focus frame having the same shape according to an embodiment of the disclosure;
FIG. 5b is a schematic diagram of the shape of the first focus frame matching the shape of the second focus frame in the embodiment of the disclosure;
FIG. 6 is a flowchart of a method for zooming in a user interface focus frame through the volume-up key on a remote controller according to an embodiment of the present disclosure;
FIG. 7 is a diagram illustrating a second focus frame drawn in response to the volume-up key on a remote controller according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram of an operating system architecture of an intelligent television set to which the embodiment of the present disclosure is applicable;
FIG. 9 is a flowchart of a method for zooming out a user interface focus frame through the volume-down key on a remote controller according to an embodiment of the disclosure;
FIG. 10 is a diagram illustrating a second focus frame drawn in response to the volume-down key on a remote controller according to an embodiment of the present disclosure;
FIG. 11 is a flowchart of a method for zooming a user interface focus frame in or out by operating the confirm key on a remote controller according to an embodiment of the present disclosure;
fig. 12 is a schematic diagram of a user interface focus box zoom display device for a smart tv set provided in an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of an intelligent television set provided in an embodiment of the present disclosure.
Detailed Description
The embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
It is noted that the terms "first," "second," "third," and the like in this disclosure are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Fig. 1 illustrates a system architecture suitable for use with embodiments of the present disclosure.
As shown in fig. 1, the system architecture 100 includes an intelligent tv 101 and a remote controller 102. The remote controller 102 may send a control instruction to the smart tv 101 in a wireless manner.
The smart television 101 is provided with an operating system that allows a user to install and uninstall various application programs. The remote controller 102 of the smart television is a wireless transmitter, and the user interacts with the smart television 101 by operating the remote controller 102. In a possible application scenario, the remote controller 102 encodes the user's key press using digital encoding technology and emits it as a light wave through an infrared diode; the infrared receiver of the smart television 101 converts the received infrared signal into an electrical signal, and the processor decodes the signal, demodulates the corresponding instruction, and executes it, thereby controlling the smart television 101 to complete the required operation.
Optionally, the system architecture 100 may also include a set-top box (not shown) connected to the smart television. The set-top box, also called a digital video conversion box, converts compressed digital signals into television content and displays it on the smart television. The set-top box may be built into the smart television or externally connected to it. In this case, the control instruction sent by the remote controller is received and decoded by the set-top box, which then sends the decoded instruction to the smart television.
The remote controller may also be another mobile user device. The mobile user device can establish a wireless communication connection with the smart television and send control instructions to the smart television through that connection. The mobile user device may be a smartphone, a tablet computer, a Personal Digital Assistant (PDA), a smart wearable device, or the like.
The operating system installed on the smart television 101 may be an Android system, a Windows system, or a system built by the device vendor. The present disclosure does not limit the type of operating system installed on the smart television 101.
Based on the system architecture 100, in the embodiment of the present disclosure, a user may perform a human-computer interaction with the smart tv 101 through the remote controller 102 to perform a zoom operation on a focus frame in an interface of the smart tv 101.
The zoom operation on the focus frame of the smart television 101 interface provided by the present disclosure may be implemented by a software module in the application framework of the smart television operating system.
Fig. 2 is a flowchart of a method for zooming a user interface focus frame of a smart television according to an embodiment of the present disclosure. The process may be implemented by the smart television, or by a user interface focus frame zooming device in the smart television.
As shown, the method includes:
S201: Display a user interface in which a first focus frame is displayed.
Herein, for convenience of description, the focus frame currently displayed in the user interface is referred to as the first focus frame.
The first focus frame may be user-selectable, for example, a user may select an area on the user interface via a remote control to form the first focus frame. The first focus frame may also be automatically selected by an application to which the user interface belongs.
For example, in an image processing application of the smart television, the user enters the application installed on the smart television by operating remote controller keys and selects a photo to be processed, which is displayed in the user interface. The application may determine the area where a person's portrait is located through image recognition and set a focus frame based on that area (i.e., the area is set as the focus frame area). The user may also select an area of the photo by operating the remote controller and set a focus frame on it, so that image recognition can be performed on the area within the focus frame. Fig. 3 shows the first focus frame displayed in the user interface of the image processing application of the smart television.
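The patent does not specify how the portrait area is recognized. Purely as an illustration of the idea, the following Java sketch uses the long-standing android.media.FaceDetector API to derive an initial focus frame from a detected face; the class name, the margin around the face, and the null fallback are assumptions for this example, not part of the disclosed method.

    import android.graphics.Bitmap;
    import android.graphics.PointF;
    import android.graphics.Rect;
    import android.media.FaceDetector;

    /** Sketch: derive an initial focus frame from the largest detected face in a photo. */
    final class InitialFocusFrame {
        static Rect fromPortrait(Bitmap photo) {
            // FaceDetector requires an RGB_565 bitmap; the width should be even.
            FaceDetector detector = new FaceDetector(photo.getWidth(), photo.getHeight(), 1);
            FaceDetector.Face[] faces = new FaceDetector.Face[1];
            if (detector.findFaces(photo, faces) == 0 || faces[0] == null) {
                return null; // no face found: fall back to a user-selected focus frame
            }
            PointF mid = new PointF();
            faces[0].getMidPoint(mid);
            float half = faces[0].eyesDistance() * 2f; // rough margin around the face
            return new Rect(Math.round(mid.x - half), Math.round(mid.y - half),
                            Math.round(mid.x + half), Math.round(mid.y + half));
        }
    }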
In the present disclosure, the shape of the first focus frame may be a square, a circle, a rectangle, an irregular shape, or the like, which is not limited in the present disclosure.
S202: In response to a remote controller key instruction acting on the first focus frame, delete the first focus frame and display a second focus frame.
Optionally, when the second focus frame is larger than the first focus frame, the area of the first focus frame lies within the area of the second focus frame. That is, when the focus frame is zoomed in, the area of the enlarged second focus frame covers the area of the first focus frame before enlargement. In some embodiments, the center point of the first focus frame is used as the reference point for enlargement, so that the center points of the second focus frame and the first focus frame coincide; in other embodiments, taking a rectangular focus frame as an example, a corner vertex of the first focus frame (e.g., the lower-left vertex) may be used as the reference point, so that the corresponding vertex of the second focus frame coincides with that of the first focus frame.
Optionally, when the second focus frame is smaller than the first focus frame, the area of the second focus frame lies within the area of the first focus frame. That is, when the focus frame is zoomed out, the area of the reduced second focus frame lies within the area of the first focus frame before reduction. In some embodiments, the center point of the first focus frame is used as the reference point for reduction, so that the center points of the second focus frame and the first focus frame coincide; in other embodiments, taking a rectangular focus frame as an example, a corner vertex of the first focus frame (e.g., the lower-left vertex) may be used as the reference point, so that the corresponding vertex of the second focus frame coincides with that of the first focus frame. Fig. 4a shows, for a rectangular focus frame, enlargement with the center point as the reference point: the center point of the second focus frame 402 coincides with that of the first focus frame 401. Fig. 4b shows enlargement with the lower-left vertex as the reference point: the lower-left vertex of the second focus frame 404 coincides with that of the first focus frame 403. In fig. 4a and 4b the first focus frame is drawn with a dashed line only to compare the positions of the focus frames before and after enlargement; in an actual implementation the first focus frame is deleted before the second focus frame is displayed.
It should be noted that, the embodiment of the present disclosure does not limit the manner of zooming in or out the focus frame (for example, a reference point on which zooming in or zooming out is based).
Alternatively, a magnification may be set in advance, so that the size of the first focus frame multiplied by the magnification gives the size of the second focus frame. For example, for a square focus frame with a magnification of 1.5, the side length of the second focus frame equals the side length of the first focus frame multiplied by 1.5; specifically, if the side length of the first focus frame is l, the side length of the second focus frame is 1.5l.
Alternatively, a reduction factor may be preset, so that the size of the first focus frame divided by the reduction factor gives the size of the second focus frame. For example, for a square focus frame with a reduction factor of 1.2, the side length of the second focus frame equals the side length of the first focus frame divided by 1.2.
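To make the geometry above concrete, here is a minimal Java sketch of deriving the second focus frame from the first one using android.graphics.Rect; the class and method names are invented for illustration and are not taken from the patent.

    import android.graphics.Rect;

    /** Sketch: derive the second focus frame from the first one and a scale factor. */
    final class FocusFrameScaler {

        /** Scale about the center point, so both frames share a center (cf. Fig. 4a). */
        static Rect scaleAboutCenter(Rect first, float factor) {
            int newWidth = Math.round(first.width() * factor);
            int newHeight = Math.round(first.height() * factor);
            int cx = first.centerX();
            int cy = first.centerY();
            return new Rect(cx - newWidth / 2, cy - newHeight / 2,
                            cx + newWidth / 2, cy + newHeight / 2);
        }

        /** Scale about the lower-left vertex, so that vertex stays fixed (cf. Fig. 4b). */
        static Rect scaleAboutLowerLeft(Rect first, float factor) {
            int newWidth = Math.round(first.width() * factor);
            int newHeight = Math.round(first.height() * factor);
            // In Android screen coordinates the lower-left vertex is (left, bottom).
            return new Rect(first.left, first.bottom - newHeight,
                            first.left + newWidth, first.bottom);
        }
    }

In this sketch, a zoom-in would pass the first set multiple (e.g., 1.5) as the factor, while a zoom-out would pass the reciprocal of the reduction factor (e.g., 1/1.2).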
Optionally, the shape of the second focus frame is the same as, or matches, the shape of the first focus frame. Fig. 5a shows the case where the two shapes are the same: the first focus frame 501 is rectangular and is enlarged with the center point as the reference point, and the enlarged second focus frame 502 is also rectangular. Fig. 5b shows the case where the shapes match: the first focus frame 503 is circular and is enlarged with the center point as the reference point. Because the first focus frame 503 lies at the boundary of the user interface, the second focus frame 504 is no longer circular after enlargement, but its shape matches that of the first focus frame 503. In fig. 5a and 5b the first focus frame is drawn with a dashed line only to compare the positions before and after enlargement; in an actual implementation the first focus frame is deleted before the second focus frame is displayed.
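The boundary case of fig. 5b has a simple rectangular analogue: intersect the scaled frame with the user interface bounds, so that the result matches, rather than preserves, the original shape. A small sketch under the same assumptions as the FocusFrameScaler example above:

    import android.graphics.Rect;

    /** Sketch: constrain an enlarged focus frame to the user interface, as in Fig. 5b. */
    final class FocusFrameClamp {
        static Rect clampToInterface(Rect scaled, Rect interfaceBounds) {
            Rect clamped = new Rect(scaled);
            // Rect.intersect() shrinks 'clamped' in place to the overlapping region, so the
            // result matches the first frame's shape near the boundary rather than keeping it.
            if (!clamped.intersect(interfaceBounds)) {
                clamped.set(interfaceBounds); // no overlap at all: degenerate fallback
            }
            return clamped;
        }
    }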
In the above embodiment, the currently displayed user interface contains the first focus frame. After a remote controller key instruction acting on the first focus frame is received, the first focus frame is deleted in response and the second focus frame is displayed, where the area of the second focus frame lies within the area of the first focus frame, or the area of the first focus frame lies within the area of the second focus frame. Zooming of the focus frame and flexible zoom operations on the user interface focus frame through the remote controller are thereby achieved, improving user experience.
Based on the above process of the present disclosure, the zoom operation of the focus frame may be implemented by one or more of the volume-up key, the volume-down key, the confirm key, or other keys on the remote controller.
The focus frame zooming method provided by the present disclosure is described below, taking the volume-up key, the volume-down key, and the confirm key on the remote controller as examples.
Referring to fig. 6, a flowchart of a method for zooming in the user interface focus frame through the volume-up key on the remote controller is provided by the present disclosure.
The method comprises the following steps:
S601: Receive a volume-up key instruction that is generated by the volume-up key on the remote controller and acts on the first focus frame.
S602: Delete the first focus frame according to the volume-up key instruction, and display a second focus frame whose size is larger than that of the first focus frame.
Based on the above process, a magnification can be preset that is equal to the first set multiple and greater than 1. In S602, the size of the second focus frame may be determined according to this magnification, so that the magnification of the second focus frame relative to the first focus frame equals the first set multiple.
Taking the first focus frame shown in fig. 3 as an example, its side length is l, the magnification is 1.5, and enlargement is performed about the center. When the user presses the volume-up key on the remote controller, in response to the corresponding volume-up key instruction the first focus frame is deleted from the display interface, the side length l is multiplied by 1.5, and a second focus frame with side length 1.5l is redrawn, as shown in fig. 7. If the user presses the volume-up key again, the interface displays a square focus frame with side length 2.25l. Note that fig. 7 shows the first focus frame with a dashed line only to compare the positions before and after enlargement; in an actual implementation the first focus frame is deleted before the second focus frame is displayed.
In a possible implementation, an additional focus frame magnification may be set, equal to a third set multiple, and used when the user long-presses the volume-up key to enlarge the focus frame. The third set multiple differs from the first set multiple; for example, it may be smaller, so that from the user's point of view a long press on the volume-up key enlarges the focus frame slowly, which improves the flexibility of size control and the user experience. Specifically, if the user long-presses the volume-up key, the focus frame is drawn using the additionally set magnification. Further, the currently drawn focus frame may be displayed only when the user releases the volume-up key. Because a long press causes volume-up key instructions to be sent to the smart television at fixed intervals, and the smart television responds to each received instruction, the longer the key is held, the larger the displayed focus frame becomes; the user can therefore obtain the desired enlargement by controlling how long the volume-up key is held.
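A possible way to realize the short-press and long-press behaviour described above in an Android Activity is sketched below; the multiples, field names, and the reuse of the FocusFrameScaler helper from the earlier sketch are assumptions for illustration, not the patent's implementation (which registers keys with the listening control service described later).

    import android.app.Activity;
    import android.graphics.Rect;
    import android.view.KeyEvent;

    public class FocusFrameActivity extends Activity {
        private static final float FIRST_SET_MULTIPLE = 1.5f;  // single short press
        private static final float THIRD_SET_MULTIPLE = 1.1f;  // each auto-repeat of a long press
        private Rect focusFrame = new Rect(100, 100, 300, 300); // hypothetical first focus frame

        @Override
        public boolean dispatchKeyEvent(KeyEvent event) {
            if (event.getKeyCode() == KeyEvent.KEYCODE_VOLUME_UP) {
                if (event.getAction() == KeyEvent.ACTION_DOWN) {
                    // Repeat count 0 is the initial press; higher counts come from holding the key.
                    float factor = event.getRepeatCount() == 0 ? FIRST_SET_MULTIPLE : THIRD_SET_MULTIPLE;
                    focusFrame = FocusFrameScaler.scaleAboutCenter(focusFrame, factor);
                    if (event.getRepeatCount() == 0) {
                        redrawFocusFrame(focusFrame); // single press: redraw immediately
                    }
                } else if (event.getAction() == KeyEvent.ACTION_UP) {
                    redrawFocusFrame(focusFrame);     // long press: display once the key is released
                }
                return true; // consume the event: do not perform the default volume change
            }
            return super.dispatchKeyEvent(event);
        }

        private void redrawFocusFrame(Rect frame) {
            // Delete the previously drawn frame and draw 'frame' on the user interface (omitted).
        }
    }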
When the size of the focus frame is already equal to the maximum focus frame size, or an edge of the focus frame coincides with the maximum focus frame boundary, and the user operates the volume-up key on the remote controller, the user interface keeps the focus frame unchanged and displays a prompt indicating that the focus frame has reached its maximum size and cannot be enlarged further.
It should be noted that the size of the maximum focus frame in the above embodiments is determined by the position and shape of the focus frame and the size of the user interface, which is not limited by the present disclosure.
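A small sketch of the maximum-size guard described above; the maxFocusFrame rectangle, the prompt text, and the use of a Toast for the prompt are assumptions for illustration only.

    import android.content.Context;
    import android.graphics.Rect;
    import android.widget.Toast;

    final class FocusFrameLimits {
        /**
         * Sketch: if 'frame' already has the maximum allowed size, or touches the maximum
         * boundary, keep it and show a prompt instead of enlarging it.
         * Returns true when enlargement must be refused.
         */
        static boolean refuseEnlarge(Context context, Rect frame, Rect maxFocusFrame) {
            boolean atMaxSize = frame.width() >= maxFocusFrame.width()
                    && frame.height() >= maxFocusFrame.height();
            boolean onMaxBoundary = frame.left <= maxFocusFrame.left || frame.top <= maxFocusFrame.top
                    || frame.right >= maxFocusFrame.right || frame.bottom >= maxFocusFrame.bottom;
            if (atMaxSize || onMaxBoundary) {
                Toast.makeText(context, "The focus frame has reached its maximum size",
                        Toast.LENGTH_SHORT).show();
                return true; // the first focus frame is retained
            }
            return false;
        }
    }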
The following describes the flow shown in fig. 6 in detail, taking a square focus frame as an example and combining the user interface shown in fig. 3 with the system architecture shown in fig. 8.
Fig. 8 shows an example system architecture of a smart television based on the Android system. The architecture includes a system kernel 801, an application framework layer 802, and an application layer 803. The system kernel 801 includes modules such as the input device driver. The application framework layer 802 includes system services such as the input management service, the listening control service, and the window management service. The application layer 803 includes applications such as the image processing application.
Taking the image processing application as an example, when the application is installed it registers the key values it wants to listen for with the listening control service in the application framework layer 802, and the registered key values and related listening information are stored in the database of the listening control service.
Table 1: monitoring information corresponding to volume-up key registered by image processing application program
Application name Key value Action Whether or not to distribute
Image processing application program Key_vol_inc Response to False
The remote controller keys registered by the image processing application include the volume-up key, whose key value is denoted Key_vol_inc. The Action field indicates that a response is required when an event with this key value is received. False indicates that the event is not distributed further after it is received; that is, the application only executes its own response to the key value and the default response is not executed. For example, on receiving this key value event the image processing application only performs the focus frame zoom-in operation, without performing the default volume-up operation.
The wireless signal (volume-up key instruction) generated when the user operates the volume-up key on the remote controller is read by the input device driver in the smart television system architecture. The driver processes the signal into a volume-up event (containing the key value Key_vol_inc) and sends it to the listening control service in the application framework layer 802. The listening control service checks the listening information registered by the application (shown in table 1) to determine whether the current application (the image processing application) has registered this key value and registered to respond to it; if so, it sends the volume-up event to the image processing application in the application layer 803. The image processing application performs the corresponding processing according to the received volume-up event, in this example the focus frame zoom-in operation. When the zoom-in operation is completed, the listening control service of the application framework layer determines, from the key value and related information registered by the application, that the volume-up event is not distributed further, which completes the focus frame zoom-in processing.
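The registration and dispatch mechanism described above belongs to the patent's smart-television framework rather than to a public Android API. The following Java sketch only illustrates the idea; every class, field, and method name is invented for the example.

    import java.util.HashMap;
    import java.util.Map;

    /** Sketch of the listening control service's key registration and dispatch. */
    final class ListeningControlService {

        /** One row of the listening information table (cf. tables 1-3). */
        static final class Registration {
            final String applicationName;
            final boolean respond;     // "Action": whether the app handles the key itself
            final boolean distribute;  // whether the default handling still runs afterwards

            Registration(String applicationName, boolean respond, boolean distribute) {
                this.applicationName = applicationName;
                this.respond = respond;
                this.distribute = distribute;
            }
        }

        private final Map<String, Registration> registrations = new HashMap<>();

        void register(String keyValue, Registration registration) {
            registrations.put(keyValue, registration);
        }

        /** Called by the input device driver with the key value of the decoded event. */
        void dispatch(String keyValue) {
            Registration reg = registrations.get(keyValue);
            if (reg != null && reg.respond) {
                sendToApplication(reg.applicationName, keyValue); // e.g. zoom the focus frame
                if (!reg.distribute) {
                    return; // Key_vol_inc / Key_vol_dec: suppress the default volume change
                }
            }
            performDefaultAction(keyValue); // Key_vol_ok: the default confirm action still runs
        }

        private void sendToApplication(String applicationName, String keyValue) { /* omitted */ }
        private void performDefaultAction(String keyValue) { /* omitted */ }
    }

Under this sketch, installing the image processing application would correspond to calls such as register("Key_vol_inc", new Registration("Image processing application", true, false)) and register("Key_vol_ok", new Registration("Image processing application", true, true)), matching tables 1 and 3.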
The above description mainly introduces the scheme provided by the embodiments of the present disclosure from the perspective of human-computer interaction. It is understood that the terminal includes corresponding hardware structures and/or software modules for performing the respective functions in order to implement the above-described functions. The units (devices, means) and algorithm steps of each example described in connection with the embodiments disclosed in the disclosure may be implemented in hardware or a combination of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
Referring to fig. 9, a flowchart of a method for zooming out the user interface focus frame through the volume-down key on the remote controller is provided.
The method comprises the following steps:
S901: Receive a volume-down key instruction that is generated by the volume-down key on the remote controller and acts on the first focus frame.
S902: Delete the first focus frame according to the volume-down key instruction, and display a second focus frame whose size is smaller than that of the first focus frame.
Based on the above process, a reduction factor that is equal to the second set multiple and greater than 1 may be preset. In S902, the size of the second focus frame may be determined according to this reduction factor, so that the reduction of the second focus frame relative to the first focus frame equals the second set multiple.
Taking the first focus frame in fig. 3 as an example, its side length is l, the reduction factor is 2, and the reduction is performed about the center. When the user presses the volume-down key on the remote controller, in response to the corresponding volume-down key instruction the first focus frame is deleted from the display interface, the side length l is divided by 2, and a second focus frame with side length 0.5l is redrawn, as shown in fig. 10. If the user presses the volume-down key again, the interface displays a square focus frame with side length 0.25l. Note that fig. 10 shows the first focus frame with a dashed line only to compare the positions before and after reduction; in an actual implementation the first focus frame is deleted before the second focus frame is displayed.
In a possible implementation, an additional focus frame reduction factor may be set, equal to a fourth set multiple, and used when the user long-presses the volume-down key to shrink the focus frame. The fourth set multiple differs from the second set multiple; for example, it may be smaller, so that from the user's point of view a long press on the volume-down key shrinks the focus frame slowly, which improves the flexibility of size control and the user experience. Specifically, if the user long-presses the volume-down key, the focus frame is drawn using the additionally set reduction factor. Further, the currently drawn focus frame may be displayed only when the user releases the volume-down key. Because a long press causes volume-down key instructions to be sent to the smart television at fixed intervals, and the smart television responds to each received instruction, the longer the key is held, the smaller the displayed focus frame becomes; the user can therefore obtain the desired reduction by controlling how long the volume-down key is held.
When the size of the focus frame is already equal to the minimum focus frame size and the user operates the volume-down key on the remote controller, the user interface keeps the focus frame unchanged and displays a prompt indicating that the focus frame has reached its minimum size and cannot be reduced further.
It should be noted that the size of the minimum focus frame in the above embodiments is determined by the position and shape of the focus frame and the size of the user interface, which is not limited by the present disclosure.
The following describes the flow shown in fig. 9 in detail, taking a square focus frame as an example and combining the user interface shown in fig. 3 with the system architecture shown in fig. 8.
Taking the image processing application as an example, when the application is installed it registers the key values it wants to listen for with the listening control service in the application framework layer 802, and the registered key values and related listening information are stored in the database of the listening control service.
Table 2: monitoring information corresponding to volume-down key registered by image processing application program
Application name Key value Action Whether or not to distribute
Image processing application program Key_vol_dec Response to False
The remote controller keys registered by the image processing application include the volume-down key, whose key value is denoted Key_vol_dec. The Action field indicates that a response is required when an event with this key value is received. False indicates that the event is not distributed further after it is received; that is, the application only executes its own response to the key value and the default response is not executed. For example, on receiving this key value event the image processing application only performs the focus frame zoom-out operation, without performing the default volume-down operation.
The wireless signal (volume-down key instruction) generated when the user operates the volume-down key on the remote controller is read by the input device driver in the smart television system architecture. The driver processes the signal into a volume-down event (containing the key value Key_vol_dec) and sends it to the listening control service in the application framework layer 802. The listening control service checks the listening information registered by the application (shown in table 2) to determine whether the current application (the image processing application) has registered this key value and registered to respond to it; if so, it sends the volume-down event to the image processing application in the application layer 803. The image processing application performs the corresponding processing according to the received volume-down event, in this example the focus frame zoom-out operation. When the zoom-out operation is completed, the listening control service of the application framework layer determines, from the key value and related information registered by the application, that the volume-down event is not distributed further, which completes the focus frame zoom-out processing.
Referring to fig. 11, a flowchart of a method for zooming the user interface focus frame in or out through the confirm key on the remote controller is provided by the present disclosure.
The method comprises the following steps:
S1101: Receive a confirm key instruction that is generated by the confirm key on the remote controller and acts on the first focus frame.
S1102: Acquire zoom indication information according to the received confirm key instruction, and determine whether it indicates a focus frame zoom-in operation or a focus frame zoom-out operation; if it indicates a zoom-in operation, execute S1103, otherwise execute S1104.
S1103: Since the zoom indication information indicates a zoom-in operation, delete the first focus frame and display a second focus frame whose size is larger than that of the first focus frame.
In S1103, the size of the second focus frame may be determined according to the first set multiple, so that the magnification of the second focus frame relative to the first focus frame equals the first set multiple.
In a possible implementation, if the user long-presses the confirm key, the focus frame may be enlarged according to the third set multiple, and the focus frame drawn at the moment the user releases the confirm key is displayed, so that the displayed focus frame is larger than the first focus frame.
Optionally, if part of the edge of the first focus frame coincides with part of the maximum focus frame boundary, and the zoom indication information indicates a focus frame zoom-in operation, the corresponding edge of the second focus frame displayed in response to the confirm key instruction remains coincident with that part of the maximum focus frame boundary.
S1104: Since the zoom indication information indicates a zoom-out operation, delete the first focus frame and display a second focus frame whose size is smaller than that of the first focus frame.
In S1104, the size of the second focus frame may be determined according to the second set multiple, so that the reduction of the second focus frame relative to the first focus frame equals the second set multiple.
In a possible implementation, if the user long-presses the confirm key, the focus frame may be reduced according to the fourth set multiple, and the focus frame drawn at the moment the user releases the confirm key is displayed, so that the displayed focus frame is smaller than the first focus frame.
The zoom indication information in the above embodiment may be set by the system; for example, an initial value of 0 indicates that the default zoom operation is a focus frame zoom-in, and an initial value of 1 indicates that the default zoom operation is a focus frame zoom-out.
In some embodiments, the initial value of the zoom indication information is set by the system, and during operation its value may be changed according to the zoom-in and zoom-out state of the focus frame, so that the size of the focus frame can change continuously.
For example, if the current zoom indication information indicates a zoom-in operation, after the smart television receives the confirm key instruction it performs the focus frame zoom-in, and then checks whether the size of the enlarged second focus frame equals the maximum focus frame size or whether its edge coincides with the maximum focus frame boundary. If so, the focus frame has been enlarged as far as possible, and the zoom indication information is set to indicate a zoom-out operation. In this way, the next confirm key instruction acting on the focus frame triggers a focus frame zoom-out operation.
For another example, if the current zoom indication information indicates a zoom-out operation, after the smart television receives the confirm key instruction it performs the focus frame zoom-out, and then checks whether the size of the reduced second focus frame equals the minimum focus frame size. If so, the focus frame has been reduced as far as possible, and the zoom indication information is set to indicate a zoom-in operation. In this way, the next confirm key instruction acting on the focus frame triggers a focus frame zoom-in operation.
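A compact sketch of that toggling behaviour, again with invented names, a width-only size check, and reuse of the hypothetical FocusFrameScaler helper from the earlier sketch:

    import android.graphics.Rect;

    /** Sketch: confirm-key zooming that flips direction at the size limits. */
    final class ConfirmKeyZoomer {
        private boolean zoomIn = true; // zoom indication information: true = zoom-in, false = zoom-out
        private final Rect maxFocusFrame;
        private final Rect minFocusFrame;

        ConfirmKeyZoomer(Rect maxFocusFrame, Rect minFocusFrame) {
            this.maxFocusFrame = maxFocusFrame;
            this.minFocusFrame = minFocusFrame;
        }

        Rect onConfirmKey(Rect first, float zoomInMultiple, float zoomOutMultiple) {
            Rect second = zoomIn
                    ? FocusFrameScaler.scaleAboutCenter(first, zoomInMultiple)
                    : FocusFrameScaler.scaleAboutCenter(first, 1f / zoomOutMultiple);
            if (zoomIn && second.width() >= maxFocusFrame.width()) {
                zoomIn = false; // enlarged to the limit: the next confirm key zooms out
            } else if (!zoomIn && second.width() <= minFocusFrame.width()) {
                zoomIn = true;  // reduced to the limit: the next confirm key zooms in
            }
            return second;      // displayed in place of the deleted first focus frame
        }
    }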
In some embodiments, the value of the zoom indication information remains unchanged.
Taking the case where the initial value of the zoom indication information indicates a focus frame zoom-in operation as an example: after a confirm key instruction acting on the first focus frame is received, it is determined whether the size of the first focus frame equals the maximum focus frame size or whether an edge of the first focus frame coincides with the maximum focus frame boundary. If so, the focus frame cannot be enlarged further, so the first focus frame is deleted and the second focus frame is displayed with the minimum focus frame size, the default focus frame size, or the size of the initial focus frame. The size of the initial focus frame is the size the focus frame had when it was initially generated.
Taking the case where the initial value of the zoom indication information indicates a focus frame zoom-out operation as an example: after a confirm key instruction acting on the first focus frame is received, it is determined whether the size of the first focus frame equals the minimum focus frame size or whether an edge of the first focus frame coincides with the minimum focus frame boundary. If so, the focus frame cannot be reduced further, so the first focus frame is deleted and the second focus frame is displayed with the maximum focus frame size, the default focus frame size, or the size of the initial focus frame.
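For the variant in which the zoom indication information never changes, the wrap-around behaviour could be sketched as a method added to the hypothetical ConfirmKeyZoomer above, together with an assumed initialFocusFrame field; the 1.5 multiple and the choice of falling back to the initial size are illustrative only.

    // Sketch: the zoom direction stays fixed at "zoom-in"; when the frame cannot grow
    // any further, restart from the initial (or minimum / default) focus frame size.
    Rect onConfirmKeyFixedDirection(Rect first) {
        if (first.width() >= maxFocusFrame.width()) {
            return new Rect(initialFocusFrame); // assumed field: the user's originally selected frame
        }
        return FocusFrameScaler.scaleAboutCenter(first, 1.5f); // 1.5 = assumed first set multiple
    }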
The following describes the flow shown in fig. 11 in detail, taking a square focus frame as an example and combining the user interface shown in fig. 3 with the system architecture shown in fig. 8.
Taking the image processing application as an example, when the application is installed it registers the key values it wants to listen for with the listening control service in the application framework layer 802, and the registered key values and related listening information are stored in the database of the listening control service.
Table 3: monitoring information corresponding to confirmation key registered by image processing application program
Application name Key value Action Whether or not to distribute
Image processing application program Key_vol_ok Response to True
The remote controller keys registered by the image processing application include the confirm key, whose key value is denoted Key_vol_ok. The Action field indicates that a response is required when an event with this key value is received. True indicates that the event continues to be distributed after it is received; that is, the application executes both its own response to the key value and the default response. For example, after receiving this key value event the image processing application performs the focus frame zoom-in or zoom-out operation, and the default confirm operation is then executed.
The wireless signal (confirm key instruction) generated when the user operates the confirm key on the remote controller is read by the input device driver in the smart television system architecture. The driver processes the signal into a confirm event (containing the key value Key_vol_ok) and sends it to the listening control service in the application framework layer 802. The listening control service checks the listening information registered by the application (shown in table 3) to determine whether the current application (the image processing application) has registered this key value and registered to respond to it; if so, it sends the confirm event to the image processing application in the application layer 803. On receiving the confirm event, the image processing application determines whether to perform a zoom-in or a zoom-out operation and carries out the corresponding processing; in this example a zoom-in operation is performed, which may require calling related system services in the application framework layer. After the zoom-in operation is completed, the listening control service of the application framework layer continues to distribute the confirm event according to the key value and related information registered by the application, which completes the focus frame zoom operation corresponding to the confirm key.
Based on the same technical concept, an embodiment of the present disclosure further provides a device for zooming the user interface focus frame of a smart television, and the device can execute the flow of the method described above.
Fig. 12 is a schematic diagram of a user interface focus frame zooming apparatus of an intelligent television set according to the present disclosure. The apparatus may implement the focus frame scaling procedure provided by the foregoing embodiment.
The apparatus comprises a display unit 1201 and a response unit 1202.
The display unit 1201 is configured to display a user interface, where the user interface includes a first focus frame.
A response unit 1202, configured to delete the first focus frame in response to a remote controller key instruction acting on the first focus frame and control the display unit to display a second focus frame. If the key instruction indicates a focus frame zoom-in operation, the area of the first focus frame lies within the area of the second focus frame; if the key instruction indicates a focus frame zoom-out operation, the area of the second focus frame lies within the area of the first focus frame.
The display unit 1201 is further configured to display the second focus frame according to the control of the response unit.
Optionally, the remote controller key instruction is a volume increasing key instruction, and the volume increasing key instruction is an instruction generated by operating a volume increasing key on the remote controller; and the volume increasing key instruction is used for indicating the zooming-in operation of the focus frame.
Optionally, a magnification of the second focus frame compared with the first focus frame is equal to a first setting multiple, and a value of the first setting multiple is greater than 1.
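As a minimal sketch of the geometry implied above (an illustration under assumptions, not the disclosed implementation), the second focus frame can be derived from the first one by scaling its width and height by the set multiple around the frame's center point; the Rect type and scaleFocusFrame helper below are hypothetical.

    // Illustrative sketch; Rect and scaleFocusFrame are hypothetical helpers.
    final class Rect {
        double left, top, right, bottom;
        Rect(double l, double t, double r, double b) { left = l; top = t; right = r; bottom = b; }
        double width()  { return right - left; }
        double height() { return bottom - top; }
    }

    final class FocusFrameScaler {
        // factor is the setting multiple (> 1); enlarge is true for a zoom-in instruction,
        // false for a zoom-out instruction (which uses the reciprocal of the multiple).
        static Rect scaleFocusFrame(Rect first, double factor, boolean enlarge) {
            double scale = enlarge ? factor : 1.0 / factor;
            double cx = (first.left + first.right) / 2.0;   // center point as the reference point
            double cy = (first.top + first.bottom) / 2.0;
            double halfW = first.width()  * scale / 2.0;
            double halfH = first.height() * scale / 2.0;
            return new Rect(cx - halfW, cy - halfH, cx + halfW, cy + halfH);
        }
    }

With the multiple greater than 1, the first frame's area range lies within the second frame when enlarging, and the second frame's area range lies within the first frame when reducing, as required above.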
Optionally, the response unit 1202 is further configured to reserve the first focus frame if the size of the first focus frame is equal to the size of the maximum focus frame or an edge of the first focus frame coincides with the maximum boundary of the focus frame.
Optionally, the response unit 1202 is further configured so that, if part of the edge of the first focus frame coincides with part of the maximum boundary of the focus frame, the corresponding edge of the second focus frame displayed in response to the volume increasing key instruction remains coincident with that part of the maximum boundary of the focus frame.
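One possible way to honour this edge-keeping behaviour, sketched under the same assumptions (the hypothetical Rect type from the previous sketch; keepCoincidentEdges is likewise hypothetical), is to shift the scaled frame so that any edge of the first frame that already lay on the maximum boundary stays on it:

    // Illustrative sketch only: pin the scaled frame back onto any boundary edge
    // that the first frame was already touching.
    final class FocusFrameBounds {
        static Rect keepCoincidentEdges(Rect first, Rect second, Rect maxBoundary) {
            double dx = 0.0, dy = 0.0;
            if (first.left   == maxBoundary.left)   dx = maxBoundary.left   - second.left;
            if (first.right  == maxBoundary.right)  dx = maxBoundary.right  - second.right;
            if (first.top    == maxBoundary.top)    dy = maxBoundary.top    - second.top;
            if (first.bottom == maxBoundary.bottom) dy = maxBoundary.bottom - second.bottom;
            return new Rect(second.left + dx, second.top + dy, second.right + dx, second.bottom + dy);
        }
    }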
Optionally, the remote controller key instruction is a volume reduction key instruction, and the volume reduction key instruction is an instruction generated by operating a volume reduction key on the remote controller; and the volume reduction key instruction is used for indicating the zoom-out operation of the focus frame.
Optionally, a reduction multiple of the second focus frame compared with the first focus frame is equal to a second setting multiple, and a value of the second setting multiple is greater than 1.
Optionally, the response unit 1202 is further configured to reserve the first focus frame if the size of the first focus frame is equal to the size of the minimum focus frame.
Optionally, the remote controller key instruction is a confirm key instruction, and the confirm key instruction is an instruction generated by operating a confirm key on the remote controller; when responding to the confirm key instruction, the response unit 1202 is further configured to acquire zooming indication information, where the zooming indication information is used for indicating a focus frame zoom-in operation or a focus frame zoom-out operation. If the zooming indication information indicates the zoom-in operation of the focus frame, the area range of the first focus frame is within the area range of the second focus frame; otherwise, the area range of the second focus frame is within the area range of the first focus frame.
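For illustration, a confirm-key response of this kind could be sketched as follows, with a boolean field standing in for the zooming indication information and the hypothetical scaleFocusFrame helper reused from the earlier sketch; the names and the value of the multiple are assumptions.

    // Illustrative sketch only; relies on the hypothetical Rect and FocusFrameScaler above.
    final class ConfirmKeyHandler {
        private boolean zoomInNext = true;               // stands in for the zooming indication information
        private static final double SET_MULTIPLE = 1.2;  // assumed value for the third/fourth setting multiple

        Rect onConfirmKey(Rect firstFocusFrame) {
            // Enlarge when the indication says zoom in, otherwise reduce.
            return FocusFrameScaler.scaleFocusFrame(firstFocusFrame, SET_MULTIPLE, zoomInNext);
        }
    }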
Optionally, the magnification of the second focus frame compared with the first focus frame is equal to a third setting multiple, and the value of the third setting multiple is greater than 1; or the reduction multiple of the second focus frame compared with the first focus frame is equal to a fourth setting multiple, and the value of the fourth setting multiple is greater than 1.
Optionally, the response unit 1202 is further configured so that, if part of the edge of the first focus frame coincides with part of the maximum boundary of the focus frame and the zooming indication information indicates the zoom-in operation of the focus frame, the corresponding edge of the second focus frame displayed in response to the confirmation key instruction remains coincident with that part of the maximum boundary of the focus frame.
Optionally, the response unit 1202 is further configured to: set the zooming indication information to indicate a focus frame zoom-out operation if the size of the second focus frame is equal to the size of the maximum focus frame or the edge of the second focus frame coincides with the maximum boundary of the focus frame; and set the zooming indication information to indicate a focus frame zoom-in operation if the size of the second focus frame is equal to the size of the minimum focus frame.
Optionally, the response unit 1202 is further configured so that, if the size of the first focus frame is equal to the size of the maximum focus frame or the edge of the first focus frame coincides with the maximum boundary of the focus frame, the size of the second focus frame displayed in response to the confirmation key instruction is the size of the minimum focus frame, the size of a default focus frame, or the size of an initial focus frame, the initial focus frame being the focus frame selected by the user through the remote controller; or, if the size of the first focus frame is equal to the size of the minimum focus frame, the size of the second focus frame displayed in response to the confirmation key instruction is the size of the maximum focus frame, the size of a default focus frame, or the size of an initial focus frame, the initial focus frame being the focus frame selected by the user through the remote controller.
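The two boundary-handling options described above (flipping the zooming indication when a size limit is reached, and jumping directly to the minimum, maximum, default, or initial size when the current frame is already at a limit) might be sketched as follows; all names are assumptions, the hypothetical Rect type is carried over from the earlier sketch, and the choice between the options is left open by the text.

    // Illustrative sketches of the boundary-handling options.
    final class ConfirmKeyBounds {
        // Option 1: flip the zooming indication when the new frame reaches a size limit.
        static boolean updateIndication(boolean zoomInNext, Rect second, Rect maxFrame, Rect minFrame) {
            if (sameSize(second, maxFrame)) return false;  // next confirm key press will zoom out
            if (sameSize(second, minFrame)) return true;   // next confirm key press will zoom in
            return zoomInNext;
        }

        // Option 2: when the first frame is already at a limit, jump to the opposite limit
        // (or, alternatively, to the default or initial frame) instead of scaling.
        static Rect frameAtLimit(Rect first, Rect maxFrame, Rect minFrame) {
            if (sameSize(first, maxFrame)) return minFrame;
            if (sameSize(first, minFrame)) return maxFrame;
            return null;  // not at a limit: fall back to normal scaling
        }

        private static boolean sameSize(Rect a, Rect b) {
            return a.width() == b.width() && a.height() == b.height();
        }
    }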
Optionally, when the second focus frame is larger than the first focus frame, the area range of the first focus frame is within the area range of the second focus frame; when the second focus frame is smaller than the first focus frame, the area range of the second focus frame is within the area range of the first focus frame.
Optionally, the shape of the second focus frame is the same as or matches the shape of the first focus frame.
Based on the same technical concept, an embodiment of the present disclosure further provides a smart television 1300, which can implement the procedure executed by the apparatus 1200 for zooming the user interface focus frame of the smart television in the foregoing embodiment.
Fig. 13 shows a schematic structural diagram of a smart television 1300 provided in an embodiment of the present disclosure. The smart television includes a processor 1301, a memory 1302, and a display 1303.
The processor 1301 is configured to execute the computer instructions to implement:
controlling the display 1303 to display a user interface, where the user interface includes a first focus frame; and
responding to a remote controller key instruction acting on the first focus frame by controlling the display to delete the first focus frame and display a second focus frame, where, if the remote controller key instruction indicates a zoom-in operation of the focus frame, the area range of the first focus frame is within the area range of the second focus frame, and if the remote controller key instruction indicates a zoom-out operation of the focus frame, the area range of the second focus frame is within the area range of the first focus frame.
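On a standard Android-based television this interception could also be approximated at the application level, without the framework-layer listening control service described above. The sketch below is an assumption for illustration: it uses the real Android KeyEvent codes, while zoomFocusFrame and isZoomInNext are hypothetical helpers.

    // Illustrative Android-style sketch; zoomFocusFrame and isZoomInNext are assumed helpers.
    import android.app.Activity;
    import android.view.KeyEvent;

    public class ImageProcessingActivity extends Activity {
        @Override
        public boolean dispatchKeyEvent(KeyEvent event) {
            if (event.getAction() == KeyEvent.ACTION_DOWN) {
                switch (event.getKeyCode()) {
                    case KeyEvent.KEYCODE_VOLUME_UP:           // volume increasing key: zoom in
                        zoomFocusFrame(true);
                        return true;                           // consume so the volume is not changed
                    case KeyEvent.KEYCODE_VOLUME_DOWN:         // volume reduction key: zoom out
                        zoomFocusFrame(false);
                        return true;
                    case KeyEvent.KEYCODE_DPAD_CENTER:         // confirmation key
                        zoomFocusFrame(isZoomInNext());
                        return super.dispatchKeyEvent(event);  // also deliver the default response
                }
            }
            return super.dispatchKeyEvent(event);
        }

        private void zoomFocusFrame(boolean enlarge) { /* assumed: recompute and redraw the focus frame */ }
        private boolean isZoomInNext() { return true; /* assumed: read the zooming indication information */ }
    }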
Optionally, the remote controller key instruction is a volume increasing key instruction, and the volume increasing key instruction is an instruction generated by operating a volume increasing key on the remote controller; and the volume increasing key instruction is used for indicating the zooming-in operation of the focus frame.
Optionally, the processor 1301 is further configured to reserve the first focus frame if the size of the first focus frame is equal to the size of the maximum focus frame or an edge of the first focus frame coincides with the maximum boundary of the focus frame.
Optionally, a magnification of the second focus frame compared to the first focus frame is equal to a first setting multiple, and a value of the first setting multiple is greater than 1.
Optionally, the processor 1301 is further configured so that, if part of the edge of the first focus frame coincides with part of the maximum boundary of the focus frame, the corresponding edge of the second focus frame displayed in response to the volume increasing key instruction remains coincident with that part of the maximum boundary of the focus frame.
Optionally, the remote controller key instruction is a volume reduction key instruction, and the volume reduction key instruction is an instruction generated by operating a volume reduction key on the remote controller; and the volume reduction key instruction is used for indicating the zoom-out operation of the focus frame.
Optionally, a reduction multiple of the second focus frame compared with the first focus frame is equal to a second setting multiple, and a value of the second setting multiple is greater than 1.
Optionally, the processor 1301 is further configured to reserve the first focus frame if the size of the first focus frame is equal to the size of the minimum focus frame.
Optionally, the remote controller key instruction is a confirm key instruction, and the confirm key instruction is an instruction generated by operating a confirm key on the remote controller; when responding to the confirm key instruction, the processor 1301 is further configured to acquire zooming indication information, where the zooming indication information is used for indicating a focus frame zoom-in operation or a focus frame zoom-out operation. If the zooming indication information indicates the zoom-in operation of the focus frame, the area range of the first focus frame is within the area range of the second focus frame; otherwise, the area range of the second focus frame is within the area range of the first focus frame.
Optionally, the magnification of the second focus frame compared with the first focus frame is equal to a third setting multiple, and the value of the third setting multiple is greater than 1; or the reduction multiple of the second focus frame compared with the first focus frame is equal to a fourth setting multiple, and the value of the fourth setting multiple is greater than 1.
Optionally, the processor 1301 is further configured to: if the partial edge of the first focus frame coincides with the partial boundary of the maximum boundary of the focus frame and the zooming indication information indicates the zoom-in operation of the focus frame, the corresponding edge of the second focus frame displayed in response to the confirmation key instruction still coincides with the partial boundary of the maximum boundary of the focus frame.
Optionally, the processor 1301 is further configured to: set the zooming indication information to indicate a focus frame zoom-out operation if the size of the second focus frame is equal to the size of the maximum focus frame or the edge of the second focus frame coincides with the maximum boundary of the focus frame; and set the zooming indication information to indicate a focus frame zoom-in operation if the size of the second focus frame is equal to the size of the minimum focus frame.
Optionally, the processor 1301 is further configured so that, if the size of the first focus frame is equal to the size of the maximum focus frame or the edge of the first focus frame coincides with the maximum boundary of the focus frame, the size of the second focus frame displayed in response to the confirmation key instruction is the size of the minimum focus frame, the size of a default focus frame, or the size of an initial focus frame, the initial focus frame being the focus frame selected by the user through the remote controller; or, if the size of the first focus frame is equal to the size of the minimum focus frame, the size of the second focus frame displayed in response to the confirmation key instruction is the size of the maximum focus frame, the size of a default focus frame, or the size of an initial focus frame, the initial focus frame being the focus frame selected by the user through the remote controller.
For the concepts, explanations, detailed descriptions, and other steps relating to the apparatus 1200 for zooming the user interface focus frame of the smart television and to the smart television 1300 in the technical solutions provided by the embodiments of the present disclosure, refer to the description of the foregoing method or the related descriptions in other embodiments; details are not repeated here.
It should be noted that the processor referred to above in the embodiments of the present disclosure may be a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an application-specific integrated circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor may also implement a combination of computing functions, for example, a combination comprising one or more microprocessors, or a combination of a DSP and a microprocessor. The memory may be integrated in the processor or provided separately from the processor.
An embodiment of the present disclosure also provides a computer-readable storage medium for storing instructions which, when executed, can perform any one of the methods executed by the foregoing terminal.
In the foregoing embodiments of the present disclosure, the currently displayed user interface includes the first focus frame; after a remote controller key instruction acting on the first focus frame is received, the first focus frame is deleted in response and the second focus frame is displayed, where the area range of the second focus frame is within the area range of the first focus frame or the area range of the first focus frame is within the area range of the second focus frame, so that the focus frame can be zoomed through the remote controller.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present disclosure have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the disclosure.
It will be apparent to those skilled in the art that various changes and modifications can be made in the present disclosure without departing from the spirit and scope of the disclosure. Thus, if such modifications and variations of the present disclosure fall within the scope of the claims of the present disclosure and their equivalents, the present disclosure is intended to include such modifications and variations as well.

Claims (10)

1. A method for zooming a user interface focus frame of an intelligent television, wherein the intelligent television comprises an image processing application program, and is characterized by comprising the following steps:
displaying a user interface, the user interface including a first image;
performing image recognition through the image processing application program to determine a first area where the head portrait of the person in the first image is located, and setting a first focus frame based on the first area, wherein the first focus frame is used for representing that the first area of the first image is selected to be used as an object of image processing;
receiving a volume key instruction, and judging whether the image processing application program registers the volume key instruction;
when the image processing application program is determined to be registered with the volume key instruction, deleting the first focus frame in response to the volume key instruction, and displaying a second focus frame above the first image by taking a center point of the first focus frame as a reference point, but not zooming the image in the first area in the first image, wherein the second focus frame is used for representing that a second area of the first image is selected to replace the first area as an object of image processing;
when the image processing application program is determined not to be registered with the volume key instruction, responding to the volume key instruction, and adjusting the volume of the intelligent television;
if the volume key instruction is a volume increasing key instruction, the area range of the first focus frame is within the area range of the second focus frame, and the first area of the first image is the area in the second area; if the volume key instruction is a volume reduction key instruction, the area range of the second focus frame is within the area range of the first focus frame, and the second area of the first image is the area in the first area.
2. The method of claim 1, wherein the up volume key command is a command generated by operating an up volume key on a remote controller;
the volume increasing key instruction is used for indicating the zooming-in operation of the focus frame; or,
the volume reduction key instruction is an instruction generated by operating a volume reduction key on the remote controller;
and the volume reduction key instruction is used for indicating the zoom-out operation of the focus frame.
3. The method of claim 2, wherein the magnification of the second focus frame compared to the first focus frame is equal to a first set multiple, the first set multiple having a value greater than 1.
4. The method of claim 2, further comprising:
and if the size of the first focus frame is equal to the size of the maximum focus frame or the edge of the first focus frame is coincided with the maximum boundary of the focus frame, reserving the first focus frame.
5. The method of claim 2, wherein the second focus frame is reduced from the first focus frame by a factor equal to a second set factor, the second set factor being greater than 1.
6. The method of claim 2, further comprising:
and if the size of the first focus frame is equal to the size of the minimum focus frame, reserving the first focus frame.
7. The method of any of claims 1 to 6, wherein the shape of the second focus frame is the same as or matches the shape of the first focus frame.
8. The method of claim 7, wherein the volume key instruction is a volume up key instruction when the shape of the second focus frame matches the shape of the first focus frame, the first focus frame being located at a boundary of the user interface.
9. A user interface focus frame zooming device of an intelligent television, wherein the intelligent television comprises an image processing application program, and the device is characterized by comprising:
a display unit: for displaying a user interface, the user interface including a first image;
a response unit: the image processing application program is used for performing image recognition to determine a first area where the head portrait of the person in the first image is located, and setting a first focus frame based on the first area, wherein the first focus frame is used for representing that the first area of the first image is selected to be used as an object of image processing; receiving a volume key instruction, and judging whether the image processing application program registers the volume key instruction; when the volume key instruction is determined to be registered in the image processing application program, deleting the first focus frame in response to the volume key instruction, and controlling the display unit to display a second focus frame above the first image by taking a center point of the first focus frame as a reference point, but not zooming the image in the first area in the first image, wherein the second focus frame is used for representing that a second area of the first image is selected to replace the first area as an object of image processing; when the image processing application program is determined not to be registered with the volume key instruction, responding to the volume key instruction, and adjusting the volume of the intelligent television;
if the volume key instruction is a volume increasing key instruction, the area range of the first focus frame is within the area range of the second focus frame, and the first area of the first image is the area in the second area; and if the volume key instruction is a volume reduction key instruction, the area range of the second focus frame is within the area range of the first focus frame, and the second area of the first image is the area in the first area.
10. An intelligent television, characterized in that the intelligent television comprises an image processing application program, comprising: a processor, a memory, and a display;
the memory to store computer instructions;
the processor to execute the computer instructions to implement:
controlling the display to display a user interface, the user interface including a first image;
performing image recognition through the image processing application program to determine a first area where the head portrait of the person in the first image is located, and setting a first focus frame based on the first area, wherein the first focus frame is used for representing that the first area of the first image is selected to serve as an object of image processing;
receiving a volume key instruction, and judging whether the image processing application program registers the volume key instruction; when the image processing application program is determined to be registered with the volume key instruction, controlling a display to delete the first focus frame in response to the volume key instruction, and displaying a second focus frame above the first image by taking a center point of the first focus frame as a reference point, but not zooming the image in the first area in the first image, wherein the second focus frame is used for representing that a second area of the first image is selected to replace the first area as an object of image processing; when the image processing application program is determined not to be registered with the volume key instruction, responding to the volume key instruction, and adjusting the volume of the intelligent television;
if the volume key instruction is a volume increasing key instruction, the area range of the first focus frame is within the area range of the second focus frame, and the first area of the first image is the area in the second area and is used as an object of image processing, but the image in the first area is not zoomed;
if the volume key instruction is a volume reduction key instruction, the area range of the second focus frame is within the area range of the first focus frame, and the second area of the first image is the area in the first area.
CN201811098713.0A 2018-09-20 2018-09-20 Method and equipment for zooming user interface focus frame of intelligent television Active CN109343923B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811098713.0A CN109343923B (en) 2018-09-20 2018-09-20 Method and equipment for zooming user interface focus frame of intelligent television
PCT/CN2019/106003 WO2020057466A1 (en) 2018-09-20 2019-09-16 Method and device for zooming focus frame of user interface of smart television

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811098713.0A CN109343923B (en) 2018-09-20 2018-09-20 Method and equipment for zooming user interface focus frame of intelligent television

Publications (2)

Publication Number Publication Date
CN109343923A CN109343923A (en) 2019-02-15
CN109343923B true CN109343923B (en) 2023-04-07

Family

ID=65306078

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811098713.0A Active CN109343923B (en) 2018-09-20 2018-09-20 Method and equipment for zooming user interface focus frame of intelligent television

Country Status (2)

Country Link
CN (1) CN109343923B (en)
WO (1) WO2020057466A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109343923B (en) * 2018-09-20 2023-04-07 聚好看科技股份有限公司 Method and equipment for zooming user interface focus frame of intelligent television

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103517115A (en) * 2012-12-06 2014-01-15 Tcl集团股份有限公司 Method and system for adding focus frame in widget views and smart television
CN103686289A (en) * 2013-12-26 2014-03-26 深圳Tcl新技术有限公司 Intelligent television keyboard input control method and intelligent television
CN104301518A (en) * 2014-09-09 2015-01-21 青岛蓝图文化传播有限公司市南分公司 Voice amplifying circuit based on digital control
CN104602092A (en) * 2015-02-12 2015-05-06 江苏银河电子股份有限公司 Method for adaption of intelligent set top box to mobile phone applications under Android system
CN104853250A (en) * 2014-02-19 2015-08-19 三星电子株式会社 Remote controller and method for controlling screen thereof
CN105812944A (en) * 2014-12-29 2016-07-27 深圳Tcl新技术有限公司 Interface focus movement control method and device
CN105847930A (en) * 2016-03-22 2016-08-10 乐视网信息技术(北京)股份有限公司 Focus frame control method and device
CN106648341A (en) * 2016-11-18 2017-05-10 惠州Tcl移动通信有限公司 Display interface make-ready method and system
CN106648312A (en) * 2016-11-18 2017-05-10 惠州Tcl移动通信有限公司 Adjustment method and system of display interface content
CN106708389A (en) * 2016-12-30 2017-05-24 维沃移动通信有限公司 Method for controlling the zoom of display interface and mobile terminal
CN106873928A (en) * 2016-10-31 2017-06-20 深圳市金立通信设备有限公司 Long-range control method and terminal
CN106998489A (en) * 2017-03-30 2017-08-01 武汉斗鱼网络科技有限公司 A kind of focus is crossed the border searching method and device
CN107077290A (en) * 2014-08-08 2017-08-18 三星电子株式会社 For the apparatus and method by using row interactive controlling content
CN107566913A (en) * 2016-06-30 2018-01-09 中兴通讯股份有限公司 A kind of interface display method of IPTV, device and set top box
CN107801105A (en) * 2016-09-07 2018-03-13 环球智达科技(北京)有限公司 Intelligent television sliding focus moving method and control system thereof
CN108028956A (en) * 2015-09-17 2018-05-11 Lg电子株式会社 Mobile terminal and its control method
CN108111898A (en) * 2017-12-20 2018-06-01 聚好看科技股份有限公司 The display methods and smart television of the graphic user interface of television image sectional drawing
CN108419114A (en) * 2018-01-18 2018-08-17 海尔优家智能科技(北京)有限公司 A kind of UI interface display methods, device and storage medium for smart television

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7681149B2 (en) * 2003-05-21 2010-03-16 Nokia Corporation User interface display for set-top box device
CN101710936A (en) * 2009-12-16 2010-05-19 中兴通讯股份有限公司 IPTV printscreen realization method and device based on set-top box
CN102262496B (en) * 2010-05-27 2013-11-06 鸿富锦精密工业(深圳)有限公司 Electronic device and graphical user interface display method thereof
KR101832463B1 (en) * 2010-12-01 2018-02-27 엘지전자 주식회사 Method for controlling a screen display and display apparatus thereof
US8504941B2 (en) * 2011-10-31 2013-08-06 Utc Fire & Security Corporation Digital image magnification user interface
KR101383840B1 (en) * 2011-11-17 2014-04-14 도시바삼성스토리지테크놀러지코리아 주식회사 Remote controller, system and method for controlling by using the remote controller
CN103905752B (en) * 2012-12-30 2016-12-28 腾讯科技(深圳)有限公司 Remote terminal remote control thereof, device and display terminal display packing, device
CN102984595B (en) * 2012-12-31 2016-10-05 北京京东世纪贸易有限公司 A kind of image processing system and method
CN103309618A (en) * 2013-07-02 2013-09-18 姜洪明 Mobile operating system
US9271048B2 (en) * 2013-12-13 2016-02-23 The Directv Group, Inc. Systems and methods for immersive viewing experience
CN104461256A (en) * 2014-12-30 2015-03-25 广州视源电子科技股份有限公司 interface element display method and system
US10709422B2 (en) * 2016-10-27 2020-07-14 Clarius Mobile Health Corp. Systems and methods for controlling visualization of ultrasound image data
US10684758B2 (en) * 2017-02-20 2020-06-16 Microsoft Technology Licensing, Llc Unified system for bimanual interactions
CN109343923B (en) * 2018-09-20 2023-04-07 聚好看科技股份有限公司 Method and equipment for zooming user interface focus frame of intelligent television

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103517115A (en) * 2012-12-06 2014-01-15 Tcl集团股份有限公司 Method and system for adding focus frame in widget views and smart television
CN103686289A (en) * 2013-12-26 2014-03-26 深圳Tcl新技术有限公司 Intelligent television keyboard input control method and intelligent television
CN104853250A (en) * 2014-02-19 2015-08-19 三星电子株式会社 Remote controller and method for controlling screen thereof
CN107077290A (en) * 2014-08-08 2017-08-18 三星电子株式会社 For the apparatus and method by using row interactive controlling content
CN104301518A (en) * 2014-09-09 2015-01-21 青岛蓝图文化传播有限公司市南分公司 Voice amplifying circuit based on digital control
CN105812944A (en) * 2014-12-29 2016-07-27 深圳Tcl新技术有限公司 Interface focus movement control method and device
CN104602092A (en) * 2015-02-12 2015-05-06 江苏银河电子股份有限公司 Method for adaption of intelligent set top box to mobile phone applications under Android system
CN108028956A (en) * 2015-09-17 2018-05-11 Lg电子株式会社 Mobile terminal and its control method
CN105847930A (en) * 2016-03-22 2016-08-10 乐视网信息技术(北京)股份有限公司 Focus frame control method and device
CN107566913A (en) * 2016-06-30 2018-01-09 中兴通讯股份有限公司 A kind of interface display method of IPTV, device and set top box
CN107801105A (en) * 2016-09-07 2018-03-13 环球智达科技(北京)有限公司 Intelligent television sliding focus moving method and control system thereof
CN106873928A (en) * 2016-10-31 2017-06-20 深圳市金立通信设备有限公司 Long-range control method and terminal
CN106648312A (en) * 2016-11-18 2017-05-10 惠州Tcl移动通信有限公司 Adjustment method and system of display interface content
CN106648341A (en) * 2016-11-18 2017-05-10 惠州Tcl移动通信有限公司 Display interface make-ready method and system
CN106708389A (en) * 2016-12-30 2017-05-24 维沃移动通信有限公司 Method for controlling the zoom of display interface and mobile terminal
CN106998489A (en) * 2017-03-30 2017-08-01 武汉斗鱼网络科技有限公司 A kind of focus is crossed the border searching method and device
CN108111898A (en) * 2017-12-20 2018-06-01 聚好看科技股份有限公司 The display methods and smart television of the graphic user interface of television image sectional drawing
CN108419114A (en) * 2018-01-18 2018-08-17 海尔优家智能科技(北京)有限公司 A kind of UI interface display methods, device and storage medium for smart television

Also Published As

Publication number Publication date
WO2020057466A1 (en) 2020-03-26
CN109343923A (en) 2019-02-15

Similar Documents

Publication Publication Date Title
CN109479162B (en) Image display apparatus and method of displaying image
CN105872683B (en) Image display and method
EP3416053B1 (en) Image display apparatus and method of operating the same
US10536742B2 (en) Display apparatus and display method
CN103795948B (en) Show equipment and its control method
KR102354328B1 (en) Image display apparatus and operating method for the same
EP3242472B1 (en) Image displaying apparatus and method of operating the same
CN105763921B (en) Image display apparatus and method
KR102337216B1 (en) Image display apparatus and method for displaying image
KR101890626B1 (en) Mobile terminal, image display device and user interface providing method using the same
CN105025237A (en) User terminal device, method for controlling user terminal device and multimedia system thereof
EP2894544A1 (en) Display apparatus and method for controlling display apparatus thereof
CN104866269A (en) Screen splitting area regulation method and terminal
US10742880B2 (en) Image display apparatus and method of displaying image
US20150015730A1 (en) Apparatus and method for obtaining image
CN104768055B (en) Display device, mobile device, system, and setting control method of connection thereof
CN109343923B (en) Method and equipment for zooming user interface focus frame of intelligent television
US20140215347A1 (en) Portable device and control method thereof
CN103500065B (en) For the coordinate transformation method of mobile terminal and device and mobile terminal
CN108804187A (en) The setting method and device of interfaces windows
KR20150086609A (en) display apparatus for content list and method therefor
US20220182703A1 (en) Control apparatus and operating method thereof
CN115586857A (en) Instruction execution method, device, equipment and storage medium
KR20160097392A (en) Display apparatus and Method for providing user interface thereof
CN115993944A (en) Image display effect optimization method and device, storage medium and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20240725

Address after: 266000 No.399, Songling Road, Laoshan District, Qingdao, Shandong Province

Patentee after: QINGDAO JUKANYUN TECHNOLOGY CO.,LTD.

Country or region after: China

Address before: 266061 Songling Road, Laoshan District, Qingdao, Shandong Province, No. 399

Patentee before: JUHAOKAN TECHNOLOGY Co.,Ltd.

Country or region before: China