CN112650418B - Display device - Google Patents


Info

Publication number: CN112650418B
Application number: CN202110064704.5A
Authority: CN (China)
Prior art keywords: display, angle, rotating, picture, rotation
Legal status: Active (the legal status is an assumption and not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN112650418A
Inventor: 马乐
Current Assignee: Hisense Visual Technology Co Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Hisense Visual Technology Co Ltd
Application filed by Hisense Visual Technology Co Ltd; application granted; active legal status
Related applications: PCT/CN2021/102319 (WO2022151662A1); US 18/348,740 (US20230350567A1)


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — GUI interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 — GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487 — GUI interaction techniques using specific features provided by the input device
    • G06F 3/0488 — GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 — Touch-screen or digitiser techniques for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The display device of the embodiments comprises a display and a controller. The controller is configured to generate a first rotation angle based on a rotation gesture formed by at least two fingers of a user contacting the display with at least one of the fingers moving, the first rotation angle being the rotation angle of the gesture; and to draw a rotated picture according to the first rotation angle so that a second rotation angle, namely the rotation angle of the rotated picture, is associated with the first rotation angle, at least two opposite vertices of the rotated picture always remain in contact with the frame of the display, and the rotated picture does not exceed the frame of the display. Because the controller draws the rotated picture according to the first rotation angle, every rotated picture obtained can be displayed completely within the display, improving the user experience.

Description

Display device
Technical Field
The present application relates to the technical field of picture display, and in particular to a display device.
Background
Display devices can present audio, video, pictures, and other content to users, and have therefore attracted wide attention. With the development of big data and artificial intelligence, users' functional demands on display devices grow daily. For example, a user may wish to interact with the display device directly rather than through a remote control.
Touch-screen display devices have emerged in response. The display of such a device is a touch screen, which lets a user operate the host simply by touching icons or text on the display with a finger, freeing the user from keyboard, mouse, and remote control and making human-computer interaction more direct.
Rotating a picture shown on the display by rotating the fingers touching it is a basic function of a touch-screen display device. However, if the size of the picture does not change while it rotates, the picture may no longer be displayed completely within the display, degrading the user experience.
Disclosure of Invention
In order to solve technical problems in the prior art, embodiments of the present application illustrate a display device.
A first aspect of embodiments of the present application shows a display device, including:
a display;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
generating a first rotation angle based on a rotation gesture formed by at least two fingers of a user contacting the display with at least one of the fingers moving, wherein the first rotation angle is the rotation angle of the rotation gesture;
drawing a rotated picture according to the first rotation angle so that a second rotation angle is associated with the first rotation angle, wherein at least two opposite vertices of the rotated picture always remain in contact with the frame of the display, the rotated picture does not exceed the frame of the display, and the second rotation angle is the rotation angle of the rotated picture;
controlling the display to present the rotated picture.
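The first rotation angle above is the angle swept by the gesture itself. One plausible way to compute it from two touch points sampled at the start and end of a gesture frame is to compare the orientation of the line joining the two fingers. The sketch below is illustrative only; the function and parameter names are not taken from the patent:

```python
import math

def rotation_angle(p0_start, p1_start, p0_end, p1_end):
    """Angle in degrees swept by the line joining two touch points.

    Each argument is an (x, y) tuple: p0/p1 are the two fingers,
    *_start and *_end their positions at the start and end of the
    sampled gesture frame.
    """
    # Orientation of the finger-to-finger line before and after.
    a0 = math.atan2(p1_start[1] - p0_start[1], p1_start[0] - p0_start[0])
    a1 = math.atan2(p1_end[1] - p0_end[1], p1_end[0] - p0_end[0])
    delta = math.degrees(a1 - a0)
    # Normalize to (-180, 180] so a small back-and-forth motion does
    # not register as a near-full turn.
    return (delta + 180.0) % 360.0 - 180.0
```

For example, if one finger stays at the origin while the other moves from (1, 0) to (0, 1), the gesture has rotated 90 degrees counterclockwise.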
The display device of this embodiment comprises a display and a controller. The controller is configured to generate a first rotation angle based on a rotation gesture formed by at least two fingers of a user contacting the display with at least one of the fingers moving, the first rotation angle being the rotation angle of the gesture; and to draw a rotated picture according to the first rotation angle so that a second rotation angle, namely the rotation angle of the rotated picture, is associated with the first rotation angle, at least two opposite vertices of the rotated picture always remain in contact with the frame of the display, and the rotated picture does not exceed the frame of the display. Because the controller draws the rotated picture according to the first rotation angle, every rotated picture obtained can be displayed completely within the display, improving the user experience.
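The constraint that at least two opposite vertices of the rotated picture touch the display frame while the picture never exceeds it can be satisfied by scaling the picture so that the bounding box of the rotated rectangle exactly fits the display along its tight axis. The sketch below is one plausible reading of that constraint, not the patent's actual implementation:

```python
import math

def fit_scale(pic_w, pic_h, disp_w, disp_h, angle_deg):
    """Scale factor for a pic_w x pic_h picture rotated by angle_deg
    so that its axis-aligned bounding box just fits a disp_w x disp_h
    display; on the limiting axis, opposite corners touch the frame.
    """
    rad = math.radians(angle_deg)
    c, s = abs(math.cos(rad)), abs(math.sin(rad))
    bbox_w = pic_w * c + pic_h * s   # width of the rotated bounding box
    bbox_h = pic_w * s + pic_h * c   # height of the rotated bounding box
    # The smaller ratio keeps the picture entirely inside the frame.
    return min(disp_w / bbox_w, disp_h / bbox_h)
```

For a 1920x1080 picture on a 1920x1080 display, the scale is 1.0 at 0 degrees and shrinks as the picture turns, reaching 1080/1920 = 0.5625 at 90 degrees, which matches the intuition that a fully sideways 16:9 picture must shrink to fit the screen height.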
A second aspect of embodiments of the present application shows a display device, including:
a display;
the rotating assembly is used for connecting the display and driving the display to rotate;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
generating a first rotation angle based on a rotation gesture input by a user, wherein the first rotation angle is a rotation angle of the rotation gesture;
if it is determined that the rotating assembly is not available to drive the display to rotate, drawing a rotated picture according to the first rotation angle so that a second rotation angle is associated with the first rotation angle, wherein at least two opposite vertices of the rotated picture always remain in contact with the frame of the display, the rotated picture does not exceed the frame of the display, and the second rotation angle is the rotation angle of the rotated picture;
if it is determined that the rotating assembly is available to drive the display to rotate, controlling the rotating assembly to drive the display to rotate based on the first rotation angle, so that a third rotation angle is associated with the first rotation angle, at least two opposite vertices of the rotated picture always remain in contact with the frame of the display, the rotated picture does not exceed the frame of the display, and the third rotation angle is the rotation angle of the display.
The display device of this embodiment includes a display, a rotating assembly, and a controller. The controller may decide whether to rotate the picture or to rotate the display itself according to whether the rotating assembly is available to rotate the display. Because the controller draws the rotated picture according to the first rotation angle, every rotated picture obtained can be displayed completely within the display, improving the user experience.
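The branching in the second and third aspects, physically rotating the display when the rotating assembly can drive it and otherwise redrawing the picture on screen, can be sketched as a small dispatcher. The callback names here are placeholders, not patent terminology:

```python
def handle_rotation(first_angle, assembly_available,
                    rotate_display, draw_rotated_picture):
    """Route the gesture's first rotation angle to one of two paths.

    assembly_available: whether the rotating assembly can drive the
    display to rotate. rotate_display / draw_rotated_picture are
    callables taking the angle; they stand in for the controller's
    two behaviors.
    """
    if assembly_available:
        # Third angle: physical orientation of the display itself.
        return rotate_display(first_angle)
    # Second angle: on-screen rotation of the drawn picture.
    return draw_rotated_picture(first_angle)
```

A caller might pass, for example, a motor-control routine and a redraw routine; only one of them runs per gesture.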
A third aspect of embodiments of the present application shows a display device including:
a display;
the external interface is used for connecting a rotating assembly so that the rotating assembly can drive the display to rotate;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
generating a first rotation angle based on a rotation gesture input by a user, wherein the first rotation angle is a rotation angle of the rotation gesture;
if it is determined that the rotating assembly is not available to drive the display to rotate, drawing a rotated picture according to the first rotation angle so that a second rotation angle is associated with the first rotation angle, wherein at least two opposite vertices of the rotated picture always remain in contact with the frame of the display, the rotated picture does not exceed the frame of the display, and the second rotation angle is the rotation angle of the rotated picture;
if it is determined that the rotating assembly is available to drive the display to rotate, controlling the rotating assembly to drive the display to rotate based on the first rotation angle, so that a third rotation angle is associated with the first rotation angle, at least two opposite vertices of the rotated picture always remain in contact with the frame of the display, the rotated picture does not exceed the frame of the display, and the third rotation angle is the rotation angle of the display.
The display device of this embodiment includes a display, an external interface, and a controller. The controller may decide whether to rotate the picture or to rotate the display itself according to whether the connected rotating assembly is available to drive the display to rotate. Because the controller draws the rotated picture according to the first rotation angle, every rotated picture obtained can be displayed completely within the display, improving the user experience.
Drawings
To illustrate the embodiments of the present application or the implementations in the related art more clearly, the drawings used in the description of the embodiments or the related art are briefly introduced below. It is obvious that the drawings described below relate to some embodiments of the present application, and that those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
FIG. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
FIG. 3A illustrates a hardware configuration block diagram of the display device 200 according to some embodiments;
FIG. 3B illustrates a hardware configuration block diagram of the display device 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram in the display device 200 according to some embodiments;
FIG. 5 illustrates an icon control interface display of an application in display device 200, in accordance with some embodiments;
FIG. 6 is a flow diagram illustrating a display device interacting with a user in accordance with one possible embodiment;
FIG. 7 is a flow diagram illustrating a display presentation interface during a picture rotation process in accordance with one possible embodiment;
FIG. 8 is a schematic diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 9 is a flow chart illustrating a manner of calculating a first angle of rotation according to one possible embodiment;
FIG. 10 is a flow chart illustrating a process for controlling a picture rotation by a controller according to one possible embodiment;
FIG. 11A is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 11B is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 12 is a flowchart illustrating the control of picture rotation in an application scenario where a user touches the display with less than two fingers according to one possible embodiment;
FIG. 13 is a schematic diagram illustrating a display presentation interface during a picture rotation process, in accordance with one possible embodiment;
FIG. 14 is a flow chart illustrating a target angle generation method according to one possible embodiment;
FIG. 15A is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 15B is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 16 is a diagram illustrating the display effect of the display during the rotation of the picture;
FIG. 17 is a flow chart illustrating a zoom factor calculation method according to one possible embodiment;
FIG. 18 is a flow chart illustrating a current diagonal computation method in accordance with a possible embodiment;
FIG. 19A is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 19B is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 20A is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 20B is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 21A is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
FIG. 21B is a diagram illustrating a display presentation interface during a picture rotation process, according to one possible embodiment;
fig. 22 is a flowchart illustrating a picture anti-shake method according to a possible embodiment.
Detailed Description
To make the purpose and embodiments of the present application clearer, the following will clearly and completely describe the exemplary embodiments of the present application with reference to the attached drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes at least one of an infrared protocol communication or a bluetooth protocol communication, and other short-distance communication methods, and controls the display device 200 in a wireless or wired manner. The user may control the display apparatus 200 by inputting a user instruction through at least one of a key on a remote controller, a voice input, a control panel input, and the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in a manner other than the control apparatus 100 and the smart device 300, for example, the voice instruction control of the user may be directly received by a module configured inside the display device 200 to obtain a voice instruction, or may be received by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with the server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various content and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3A illustrates a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display component for displaying pictures, and a driving component for driving picture display, and is used for receiving picture signals from the controller output, displaying video content, picture content, and menu manipulation interface components, and user manipulation UI interface.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals via wired or wireless reception and demodulates audio/video signals and data signals, such as EPG data, from one or more of the wired or wireless broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes a picture collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. Or may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the modem 210 may be located in different separate devices, that is, the modem 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other operable control. The operations related to the selected object are: displaying an operation connected to a hyperlink page, document, picture, or the like, or performing an operation of a program corresponding to the icon.
In some embodiments the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a RAM Random Access Memory (RAM), a ROM (Read-Only Memory), a first to nth interface for input/output, a communication Bus (Bus), and the like.
The CPU processor is used for executing operating system and application program instructions stored in the memory, and for executing various application programs, data, and content according to interactive instructions received from external input, so as to finally display and play various audio and video content. The CPU processor may include a plurality of processors, e.g., a main processor and one or more sub-processors.
In some embodiments, a graphics processor for generating various graphics objects, such as: at least one of an icon, an operation menu, and a user input instruction display figure. The graphic processor comprises an arithmetic unit, which performs operation by receiving various interactive instructions input by a user and displays various objects according to display attributes; the system also comprises a renderer for rendering various objects obtained based on the arithmetic unit, wherein the rendered objects are used for being displayed on a display.
In some embodiments, the video processor is configured to receive an external video signal and perform at least one of decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and picture synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, a picture synthesis module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module demultiplexes the input audio/video data stream. The video decoding module processes the demultiplexed video signal, including decoding and scaling. The picture synthesis module, such as a picture synthesizer, superimposes and mixes the GUI signal input or generated by the user with the scaled video picture to generate a picture signal for display. The frame rate conversion module converts the frame rate of the input video. The display formatting module converts the frame-rate-converted video output signal into a signal conforming to the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, the user may input a user command on a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, window, control, etc. displayed in a display of the electronic device, where the control may include at least one of an icon, button, menu, tab, text box, dialog box, status bar, navigation bar, widget, etc. visual interface element.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical keys on the body of the display device, or the like).
In some embodiments, the system of the display device may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are operated and maintained. After the kernel is started, the shell and the user applications are loaded. An application is compiled into machine code after being started, forming a process.
In some possible embodiments the display device may further comprise a rotating assembly. Specifically, referring to fig. 3B, fig. 3B shows a hardware configuration block diagram of the display device 200 according to an exemplary embodiment. The display is connected to a bracket or a wall through the rotating assembly 290, and the placement angle of the display can be adjusted through the rotating assembly, thereby achieving the purpose of rotation. Different placement angles can accommodate display pages of different aspect ratios. For example, in most cases the display is placed in landscape orientation to show video frames such as movies and television shows with an aspect ratio of 16:9. When the video is a short video, a cartoon, or a similar picture with an aspect ratio of 9:16, the display can be rotated into portrait orientation by the rotating assembly to accommodate the 9:16 aspect ratio.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer from top to bottom.
In some embodiments, at least one application program runs in the application program layer, and the application programs may be windows (windows) programs carried by an operating system, system setting programs, clock programs or the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can access the resources in the system and obtain the services of the system during execution.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications as well as general navigational fallback functions, such as controlling exit, opening, fallback, etc. of the applications. The window manager is used for managing all window programs, such as obtaining the size of the display, judging whether a status bar exists, locking a screen, intercepting the screen, controlling the change of the display window (for example, reducing the display window, displaying a shake, displaying a distortion deformation, and the like), and the like.
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions that the framework layer needs to implement.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor driver (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), power driver, and the like.
In some embodiments, the display device may directly enter the interface of the preset video-on-demand program after being started. As shown in fig. 5, the interface of the video-on-demand program may include at least a navigation bar 510 and a content display area located below the navigation bar 510, where the content displayed in the content display area may change according to the selected control in the navigation bar. The programs in the application layer can be integrated into the video-on-demand program and displayed through one control of the navigation bar, or displayed after the application control in the navigation bar is selected.
In some embodiments, the display device may directly enter the display interface of the signal source selected last time after being started, or a signal source selection interface, where the signal source may be a preset video-on-demand program, or may be at least one of an HDMI interface, a live TV interface, and the like. After the user selects a signal source, the display may show the contents obtained from the corresponding source.
Based on the display device 200, the display device 200 may support a touch interaction function by adding the touch component 276. In general, the touch-sensitive component 276 may constitute a touch screen with the display 260. A user can input different control instructions on the touch screen through touch operation. For example, the user may input a click, slide, long press, double click, etc. touch command, and different touch commands may represent different control functions.
To implement the different touch actions, the touch control component 276 may generate different electrical signals when the user inputs different touch actions, and transmit the generated electrical signals to the controller 250. The controller 250 may perform feature extraction on the received electrical signal to determine a control function to be performed by the user based on the extracted features.
For example, when a user enters a click touch action at any program icon location in the application interface, the touch component 276 will sense the touch action and generate an electrical signal. After receiving the electrical signal, the controller 250 may first determine a duration of a level corresponding to a touch action in the electrical signal, and when the duration is less than a preset time threshold, recognize that a click touch instruction is input by the user. The controller 250 then extracts the positional features generated by the electrical signals to determine the touch position. And when the touch position is within the display range of the application icon, determining that the user inputs a click touch instruction at the position of the application icon. Accordingly, the click touch command is used to execute a function of running a corresponding application program in the current scene, so that the controller 250 may start running the corresponding application program.
For another example, when the user inputs a sliding action in the media asset presentation page, the touch component 276 also sends the sensed electrical signal to the controller 250. The controller 250 first determines the duration of the signal corresponding to the touch action. When the determined duration is longer than the preset time threshold, the controller examines the position change of the signal; for a sliding touch action, the position where the signal is generated changes, so the controller determines that the user has input a sliding touch instruction. The controller 250 then determines the sliding direction of the sliding touch instruction according to the change of the signal position, and controls the display frame in the media asset presentation page to turn pages so as to display more media asset options. Further, the controller 250 may extract features such as the sliding speed and sliding distance of the sliding touch instruction, and perform page-turning control according to the extracted features to achieve a follow-the-hand effect.
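The click/slide discrimination described above can be sketched as follows. This is a hypothetical illustration: the real controller works on raw electrical signals, while here the duration and start/end positions are assumed to be already extracted, and the threshold values are assumptions, not values from the embodiments.

```python
# Hypothetical sketch of classifying a single-finger touch from its
# duration and total displacement, as described in the embodiments above.
CLICK_MAX_DURATION_MS = 300   # assumed preset time threshold
SWIPE_MIN_DISTANCE_PX = 20    # assumed minimum movement for a slide

def classify_touch(duration_ms, start, end):
    """Classify a touch as a click, swipe, or long press."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if duration_ms < CLICK_MAX_DURATION_MS and distance < SWIPE_MIN_DISTANCE_PX:
        return "click"            # short duration, little movement
    if distance >= SWIPE_MIN_DISTANCE_PX:
        return "swipe"            # position of the signal changes
    return "long_press"           # long duration, little movement
```

A double click would be recognized one level up, from two "click" results arriving within a short interval.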
Similarly, for touch instructions such as double click and long press, the controller 250 may extract different features, determine the type of the touch instruction through feature judgment, and execute the corresponding control function according to preset interaction rules. In some embodiments, the touch component 276 also supports multi-touch, so that the user can input touch actions on the touch screen with multiple fingers, e.g., multi-finger clicks, multi-finger long presses, multi-finger swipes, and the like.
The touch control action can be matched with a specific application program to realize a specific function. For example, when the user opens the "demonstration whiteboard" application, the display 260 may present a drawing area, the user may draw a specific touch trajectory in the drawing area through the sliding touch command, and the controller 250 determines a touch pattern through the touch detected by the touch component 276 and controls the display 260 to display in real time to satisfy the demonstration effect.
The display of the touch screen display device is a touch screen (Touch Screen). A touch screen allows the user to operate the host simply by touching the display with a finger, doing away with the keyboard, mouse, and remote control and making human-machine interaction more direct.
For example, rotating fingers that touch the display to rotate the displayed picture is a basic function of a touch screen display device. In the current interaction mode, after multiple fingers rotate on the screen, the picture immediately snaps to a horizontal or vertical angle according to the finger rotation direction, with no intermediate interaction process, resulting in a poor user experience.
To solve the above technical problem, embodiments of the present application provide a display device; the hardware of each part of the display device and its functions are at least as described in the embodiments above. FIG. 6 is a flow diagram illustrating interaction between the display device and a user according to one possible embodiment;
the display is configured to perform step S101 to display a picture;
the technical solution shown in this embodiment does not limit the types of pictures. For example, in some feasible embodiments the picture may be a photograph stored within the display device, a frame picture of a video. In some feasible embodiments, the picture user opens the top page of the APP. In the practical application process, all pictures that can be displayed on the display can be called pictures, and the applicant does not make much limitation herein.
The user performs step S102: touching the display with at least two fingers;
when the user needs to rotate the picture displayed on the display, the finger of the user touches the display.
The controller is configured to execute step S103 to generate a first rotation angle based on a rotation gesture formed by at least two fingers of a user contacting the display and at least one finger moving.
There are various implementations of generating the first rotation angle.
For example, fig. 7 is a flowchart illustrating a calculation manner of the first rotation angle according to a feasible embodiment, in response to the at least two fingers of the user touching the display, the controller is configured to execute step S11 to calculate an initial angle, where the initial angle is an included angle between a connection line between the two fingers before the fingers of the user rotate and a preset reference line;
in the technical scheme shown in the embodiment of the application, two fingers of a user touch the display at the same time as a trigger condition for picture rotation. When each finger of a user touches the display, the display sends contact information to the controller, and the contact information is at least the position where the user touches the display. When a user needs to control the picture to rotate, the user usually touches the display with two fingers within a preset time, and if the time interval between the two fingers of the user touching the display is long, the user may touch the display by misoperation. In order to avoid the above situation, in the solution shown in the embodiment of the present application, the controller starts to calculate the initial angle only when the controller receives the two contact information within the preset time. The preset time can be set according to requirements, and the applicant does not make excessive restrictions.
For example, in a feasible embodiment, the preset time may be 5s, the controller starts the timer when receiving the first contact information, and the controller receives the second contact information when the time recorded by the timer is 3 s. In this case, the controller calculates the initial angle based on the first contact information and the second contact information.
In a possible embodiment the preset time may be 5s; the controller starts a timer when it receives the first contact information, and receives the second contact information when the timer reads 30s. In this case, the controller does not calculate the initial angle.
In a possible embodiment, the controller does not continue to receive contact information when it receives two contact information. For example, when the controller receives the first contact information, the controller starts the timer, and when the time recorded by the timer is 3s, the controller receives the second contact information, and at this time, the controller closes the timer. After the timer is closed, the controller ignores the subsequently received contact information.
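The trigger logic of the preceding paragraphs can be sketched as follows. The class and method names are hypothetical; the embodiments describe the behavior (second contact within a preset time starts the rotation, later contacts are ignored), not an API.

```python
import time

PRESET_WINDOW_S = 5.0  # assumed value of the "preset time" for the second contact

class RotationTrigger:
    """Sketch: picture rotation is triggered only when a second contact
    arrives within the preset window after the first one; once two
    contacts are held, further contact information is ignored."""

    def __init__(self):
        self.contacts = []  # list of (timestamp, position) tuples

    def on_contact(self, position, now=None):
        """Returns True when the second valid contact arrives, i.e. when
        the controller should start computing the initial angle."""
        now = time.monotonic() if now is None else now
        if len(self.contacts) >= 2:
            return False  # extra fingers after the timer is closed are ignored
        if not self.contacts:
            self.contacts.append((now, position))
            return False
        first_time, _ = self.contacts[0]
        if now - first_time > PRESET_WINDOW_S:
            # too late: treat this touch as a new first contact
            self.contacts = [(now, position)]
            return False
        self.contacts.append((now, position))
        return True
```

With the 5s window of the examples above, a second contact at 3s triggers the angle calculation, while one at 30s does not.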
In this embodiment, the initial angle is an included angle between a connection line between two fingers before the user rotates the fingers and a preset reference line; the calculation process of the initial angle is described below with reference to specific examples.
For convenience of description, the two fingers are distinguished in this embodiment: one of them is called the axis finger, which serves as the axis during the finger rotation; the other is called the rotating finger, which rotates around the axis finger during the finger rotation. The user's thumb usually serves as the axis finger.
In some feasible embodiments, the preset reference line may be a line segment parallel to the display width, and in some feasible embodiments, the preset reference line may be a line segment parallel to the display height. In the present embodiment, a line segment parallel to the display width is explained as a preset reference line.
Fig. 8 is a schematic diagram of a display presentation interface during a picture rotation process according to a possible embodiment. In the initial state, the display displays a landscape picture. The user needs to rotate the picture and touch the display with the thumb and forefinger simultaneously. Specifically, refer to fig. 8 as a schematic diagram 11, where the position touched by the thumb (which may be referred to as an axis finger) is P1, and the position touched by the index finger (which may be referred to as a rotation finger) is P2. The display transmits the contact information (X1, Y1) of P1 and the contact information (X2, Y2) of P2 to the controller. The controller calculates the initial angle based on the contact information (X1, Y1) of P1 and the contact information (X2, Y2) of P2. In this embodiment, the preset reference line is a line segment parallel to the width of the display, one vertex of the preset reference line coincides with P1 (X1, Y1), and the other vertex of the preset reference line has the same ordinate as P2 and the same abscissa as P1, that is, (X1, Y2).
Finally, the initial angle satisfies sin α = (Y2 - Y1) / [(Y2 - Y1)² + (X2 - X1)²]^(1/2).
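The relation above can be checked numerically. A minimal sketch, with a hypothetical helper name and assuming P1 and P2 are given in display coordinates:

```python
import math

def initial_angle_deg(p1, p2):
    """Angle between the line P1-P2 and a reference line parallel to the
    display width, from sin(alpha) = (Y2 - Y1) / |P1P2| as above.
    Note: the arcsine form only yields angles in [-90, 90] degrees."""
    (x1, y1), (x2, y2) = p1, p2
    length = math.hypot(x2 - x1, y2 - y1)
    return math.degrees(math.asin((y2 - y1) / length))
```

For example, with P1 = (0, 0) and P2 = (1, 1) the initial angle is 45 degrees.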
It should be noted that this embodiment is only an exemplary way to describe the calculation method of the initial angle, and the technical method of the initial angle in the practical application process may be, but is not limited to, the above method.
In response to the rotation of the fingers, the controller is configured to execute step S12 to calculate a current angle, where the current angle is an included angle between a connection line between the two fingers and a preset reference line when the fingers of the user rotate;
there are various ways of detecting whether the finger is rotated, for example, in a possible embodiment the display may send the displacement of the rotating finger moving in unit time to the controller. The controller determines whether the finger is rotated by rotating the displacement of the finger moving in unit time. In practice, the controller may determine whether the user's finger has moved in other ways, without limitation by the applicant herein.
The calculation of the current angle is described below with reference to the schematic diagram 12 of fig. 8. The user rotates the finger around the axis finger with the axis finger as the central axis (the corresponding contact point is P1), and the display interface of the display can refer to the schematic diagram 12 in fig. 8. The display sends the contact information (X1, Y1) of P1 and the contact information (X3, Y3) of P3 to the controller, in the embodiment, the preset reference line is a line segment parallel to the width of the display, one vertex of the preset reference line is coincident with P1 (X1, Y1), and the other vertex of the preset reference line has the same vertical coordinate with P3 and the same horizontal coordinate with P1, namely (X1, Y3).
Finally, the current angle satisfies sin β = (Y3 - Y1) / [(Y3 - Y1)² + (X3 - X1)²]^(1/2).
The controller is configured to perform step S13 of calculating a first rotation angle from the initial angle and the current angle;
the first rotation angle is equal to the difference between the current angle and the initial angle;
specifically, the first rotation angle = β - α.
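Step S13 can be sketched as follows. The function names are hypothetical; `atan2` is used instead of the arcsine form so that all four quadrants are covered, which is an implementation choice not mandated by the embodiments.

```python
import math

def angle_deg(p1, p2):
    # atan2 covers all four quadrants, unlike the arcsine form
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def first_rotation_angle(p1, p2_start, p2_now):
    """First rotation angle = current angle - initial angle (beta - alpha),
    with P1 the axis-finger contact point, P2 the rotating finger's start
    position, and P3 (here p2_now) its current position."""
    return angle_deg(p1, p2_now) - angle_deg(p1, p2_start)
```

For example, a rotating finger moving from (1, 0) to (0, 1) around an axis finger at the origin yields a first rotation angle of 90 degrees.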
The controller is configured to execute step S14 of drawing the rotated picture according to the first rotation angle, so that the rotation angle of the picture is correlated with or consistent with the angle through which the user's finger rotates;
there are various implementations of drawing the rotated picture according to the first rotation angle.
For example, in one possible embodiment, the controller is provided with an OSD (on-screen display) layer; the OSD layer is configured to control the picture to rotate according to the first rotation angle, so as to obtain a rotated picture, and directly output the rotated picture to the display, so that the display displays the rotated picture.
In the picture rotation process, see the schematic diagram 13 and the schematic diagram 14 in fig. 8, specifically, the rotated picture obtained by the OSD layer controlling the picture rotation according to the first rotation angle can refer to the schematic diagram 13 in fig. 8, and the effect picture finally displayed on the display can refer to the schematic diagram 14 in fig. 8.
For example, in one possible embodiment, the controller is provided with a video layer; the video layer is configured to render the rotated picture based on the picture data and the first rotation angle. The video layer cannot directly rotate the picture shown on the display; therefore, each time the user controls the picture to rotate, the video layer needs to re-render the rotated picture based on the picture data and the first rotation angle.
For example, fig. 9 is a flowchart illustrating a manner of calculating the first rotation angle according to a possible embodiment.
When the display displays the picture, in response to the at least two fingers of the user touching the display, the controller is configured to execute step S21 to obtain an initial reference line, where the initial reference line is a connection line between the two fingers before the fingers of the user rotate;
in response to the rotation of the fingers, the controller is configured to execute step S22 to obtain a current reference line, where the current reference line is a connection line between the two fingers when the fingers of the user rotate;
the controller is configured to perform step S23 to calculate a first rotation angle from the initial reference line and the current reference line.
The manner of calculating the first rotation angle may refer to the above embodiments, and is not described herein again.
The controller is configured to execute step S104 to control the display to show the rotated picture such that a second rotation angle is associated with the first rotation angle, the second rotation angle being a rotation angle of the rotated picture;
the effect graph finally showing the rotated picture can be seen in the schematic diagram 14 in fig. 8.
In the process of controlling the picture rotation, the user's finger may rotate in a clockwise direction or a counterclockwise direction. In order to match the rotation direction of the picture with the rotation direction of the finger of the user, in the technical scheme shown in the embodiment of the application, the controller is further configured to generate the rotation identifier, so that the controller can determine the rotation direction of the picture according to the rotation identifier, the rotation direction of the picture is matched with the rotation direction of the finger of the user, and the experience of the user is further improved.
Fig. 10 is a flowchart illustrating a process of controlling a picture rotation by a controller according to a possible embodiment. It can be seen from the figure that the controller is further configured to perform steps S31 to S341/S342 in the scheme shown in the present embodiment.
Step S31, calculating a first rotation angle according to the initial angle and the current angle;
the implementation of calculating the first rotation angle according to the initial angle and the current angle may refer to the above embodiments, which are not described herein again.
Step S32, generating a rotation identifier, where the rotation identifier is the ratio of the first rotation angle to the absolute value of the rotation angle, and the absolute value of the rotation angle is the magnitude of the first rotation angle;
there are various implementations of generating the rotational identifier. For example, in one possible embodiment, an absolute value of the rotation angle may be calculated, and the rotation identifier may be generated based on a ratio of the rotation angle and the absolute value of the rotation angle.
For example, in a feasible embodiment, the initial angle α =60 degrees, the current angle β =20 degrees, the first rotation angle β - α =20-60= -40 is calculated, the absolute value of the first rotation angle is equal to 40 degrees, and the rotation identifier = -40/40= -1;
in a possible embodiment, the initial angle α =60 degrees, the current angle β =90 degrees, a first rotation angle β - α =90-60=30 is calculated, the absolute value of the first rotation angle is equal to 30 degrees, and the rotation identifier =30/30=1;
for another example, in a possible embodiment, an absolute value of the rotation angle may be calculated, and the rotation identifier may be generated according to a difference between the first rotation angle and the absolute value of the rotation angle.
For example, in a feasible embodiment, the initial angle α =60 degrees, the current angle β =20 degrees, the first rotation angle β - α =20-60= -40 is calculated, the absolute value of the first rotation angle is equal to 40 degrees, and the rotation identifier = -40-40= -80;
in a possible embodiment, the initial angle α =60 degrees, the current angle β =90 degrees, a first rotation angle β - α =90-60=30 is calculated, the absolute value of the first rotation angle is equal to 30 degrees, and the rotation identifier =30-30=0;
it should be noted that the present embodiment is only an exemplary method for generating two rotation identifiers, and in the process of practical application, the method for generating the rotation identifier may be, but is not limited to, the two manners described above.
Step S33, judging whether the rotation identifier is greater than or equal to 0;
If the rotation identifier is greater than or equal to 0, step S341 is executed to control the picture to rotate by the rotation angle in the clockwise direction;
If the rotation identifier is less than 0, step S342 is executed to control the picture to rotate by the rotation angle in the counterclockwise direction.
The process of rotating the picture is further described below with reference to specific examples.
Fig. 11A is a schematic diagram of a display interface during a picture rotation process according to a possible embodiment. In this embodiment, the initial angle α = 30 degrees and the current angle β = -30 degrees (see the schematic diagram 21 in fig. 11A). The controller calculates a rotation angle of -60 degrees, whose absolute value is 60 degrees; the rotation identifier = -60/60 = -1 (less than 0), so the controller controls the picture to rotate counterclockwise by 60 degrees. The display interface at this point can refer to the schematic diagram 22 in fig. 11A.
FIG. 11B is a diagram illustrating a display interface during a picture rotation process according to one possible embodiment. In this embodiment, the initial angle α =30 degrees, and the current angle β =60 (see the schematic diagram 31 of fig. 11B in particular). The controller calculates a rotation angle 30, an absolute value of the rotation angle is equal to 30 degrees, the rotation flag =30/30=1 (greater than 0), and the controller controls the picture to rotate clockwise by 30 degrees, where the display interface of the display can refer to the schematic diagram 32 in fig. 11B.
In this embodiment, the controller may determine the rotation identifier according to the first rotation angle, and then determine whether to control the picture to rotate clockwise or counterclockwise based on the rotation identifier, thereby implementing that the rotation direction of the picture matches with the rotation direction of the user's finger, and the user experience is better.
When the user stops rotating the picture, the user can separate the fingers touching the display from the display, and at the moment, the number of the fingers contacting the display is less than two. According to the scheme, the display state of the picture is further limited in the application scene that the user touches the display with less than two fingers, so that the experience of the user is further improved.
FIG. 12 is a flowchart illustrating controlling picture rotation in an application scenario where a user touches the display with less than two fingers according to one possible embodiment. It can be seen from the figure that the controller is further configured to perform steps S41 to S421/S422 in the scheme shown in the present embodiment.
In response to the user touching the display with less than two fingers, step S41 is performed to determine whether the first rotation angle is less than or equal to the rotation threshold.
The rotation threshold in this embodiment may be set according to requirements, for example, the rotation threshold may be 20 degrees in some feasible embodiments, and may be 45 degrees in some feasible embodiments.
If the first rotation angle is less than or equal to the rotation threshold, step S421 controls the picture to rotate back to (or keep) the initial display state, where the initial display state is the state of the picture before the user' S finger rotates;
the process of rotating the picture is further described with reference to specific examples. Fig. 13 is a schematic diagram of a display presentation interface during a picture rotation process according to a possible embodiment. In this embodiment, the initial angle α =30 degrees, and the current angle β =60 (specifically, refer to the schematic diagram 41 in fig. 13). The controller calculates the first rotation angle 30, and the controller controls the picture to rotate clockwise by 30 degrees, at this time, the display page of the display can refer to the schematic diagram 42 in fig. 13. In this embodiment, the rotation threshold is 45 degrees (the first rotation angle is smaller than or equal to the rotation threshold), and in response to the user touching the display with less than two fingers, the picture is controlled to rotate back to (or maintain) the initial state, and at this time, the display page of the display can refer to the schematic diagram 43 in fig. 13.
If the first rotation angle is greater than the rotation threshold, step S422 controls the picture to rotate by a target angle, where the target angle is related to the first rotation angle.
There are various methods for generating the target angle, and fig. 14 is a flowchart illustrating the target angle generating method according to one possible embodiment, and the controller is further configured to perform steps S51 to S53.
Step S51, calculating an exceeding angle according to the first rotation angle and a rotation threshold value;
The exceeding angle may be calculated as follows: rotation angle delta = β - α; exceeding angle over = Math.max(|delta|, T) - T, where T is the rotation threshold.
For example, in a feasible embodiment, the rotation threshold is 20 degrees, the initial angle α = 30 degrees, and the current angle β = 60 degrees; delta = β - α = 60 - 30 = 30 degrees (the first rotation angle is greater than the rotation threshold), and the exceeding angle over = Math.max(30, 20) - 20 = 10 degrees.
In a feasible embodiment, the rotation threshold is 45 degrees, the initial angle α = 30 degrees, and the current angle β = 60 degrees; delta = β - α = 60 - 30 = 30 degrees (the first rotation angle is less than the rotation threshold), and the exceeding angle over = Math.max(30, 45) - 45 = 0 degrees.
Step S52, calculating an excess multiple times = Math.ceil(over / 90), where over is the exceeding angle;
For example, in a possible embodiment, over = 10 degrees, and the excess multiple = Math.ceil(10/90) = 1;
in a feasible embodiment, over = 0 degrees, and the excess multiple = Math.ceil(0/90) = 0.
Step S53 calculates a target angle from the excess factor.
The implementation of calculating the target angle according to the excess multiple may be: the final rotated target angle target = direction times 90 is calculated.
For example, in a feasible embodiment, over = 10 degrees, so the excess multiple = Math.ceil(10/90) = 1 and target = 1 × 90 = 90 degrees.
In another feasible embodiment, over = 0 degrees, so the excess multiple = Math.ceil(0/90) = 0 and target = 0 × 90 = 0 degrees.
The implementation of calculating the target angle according to the excess multiple may also be: calculating the final target angle target = direction × multiple × 90, where if the rotation identifier is greater than or equal to 0, direction = 1; if the rotation identifier is less than 0, direction = -1.
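Combining steps S51 to S53 with the direction rule above, the snap-to-90-degrees behavior can be sketched as follows (a minimal illustration; the function name and the use of delta's sign as the rotation identifier are our assumptions):

```python
import math

def target_angle(initial, current, threshold):
    """Steps S51-S53: snap the final rotation to a signed multiple of 90 degrees."""
    delta = current - initial                      # first rotation angle
    over = max(abs(delta), threshold) - threshold  # step S51: excess angle
    multiple = math.ceil(over / 90)                # step S52: excess multiple
    direction = 1 if delta >= 0 else -1            # sign of the rotation identifier
    return direction * multiple * 90               # step S53: target angle

# alpha=30, beta=60,  T=20 -> over=10,  target=90
# alpha=30, beta=150, T=20 -> over=100, target=180
# alpha=30, beta=60,  T=45 -> over=0,   target=0 (picture snaps back to the initial state)
```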
The process of rotating the picture is further described below with reference to specific examples.
Fig. 15A is a schematic diagram of a display interface during a picture rotation process according to a possible embodiment. In this embodiment, the initial angle α = 30 degrees and the current angle β = 60 degrees (see schematic diagram 51 in fig. 15A). The controller calculates the first rotation angle as 30 degrees and controls the picture to rotate clockwise by 30 degrees; at this time, the display interface of the display can refer to schematic diagram 52 in fig. 15A. The user then lifts the fingers off the display. The rotation threshold in this embodiment is 20 degrees, so in response to the user touching the display with fewer than two fingers, the first rotation angle is greater than the rotation threshold; the controller calculates the excess multiple = Math.ceil((30 - 20)/90) = 1 and then the target angle = 1 × 90 = 90 degrees. The controller therefore controls the picture to rotate to 90 degrees, and the display interface of the display can refer to schematic diagram 53 in fig. 15A.
Fig. 15B is a schematic diagram of a display interface during a picture rotation process according to a possible embodiment. In this embodiment, the initial angle α = 30 degrees and the current angle β = 150 degrees (refer to schematic diagram 61 in fig. 15B). The controller calculates the first rotation angle as 120 degrees and controls the picture to rotate 120 degrees clockwise; at this time, the display interface of the display can refer to schematic diagram 62 in fig. 15B. The user then lifts the fingers off the display. The rotation threshold in this embodiment is 20 degrees, so in response to the user touching the display with fewer than two fingers, the first rotation angle is greater than the rotation threshold; the controller calculates the excess multiple = Math.ceil((120 - 20)/90) = 2 and then the target angle = 2 × 90 = 180 degrees, and controls the picture to rotate to 180 degrees.
The picture may also be presented in other ways when the user's fingers leave the display. For example, in a possible embodiment, the controller does not adjust the rotated picture in response to the user touching the display with fewer than two fingers.
During the picture rotation, the size of the picture is not changed, so the display can only show part of the picture when the rotation angle is not 0. Of course, in other embodiments, the display scale of the picture may be correspondingly scaled according to the rotation angle, so that the picture can be completely displayed on the display. The above two display modes are merely exemplary; other foreseeable display modes may also be used.
Specifically, the embodiment of the present application shows a display device whose controller may determine a zoom factor of the picture according to the first rotation angle, so that the rotated picture obtained each time can be fully displayed in the display. For a specific display effect, refer to fig. 16, which is a display effect diagram of the display during a picture rotation process.
FIG. 17 is a process flow diagram of a controller according to one possible embodiment. The controller is further configured to perform steps S61 to S63.
S61, generating a first rotation angle based on a rotation gesture formed by at least two fingers of a user contacting a display and at least one finger moving, wherein the first rotation angle is a rotation angle of the rotation gesture;
s62, drawing a rotating picture according to the first rotating angle, so that a second rotating angle is associated with the first rotating angle, at least two opposite vertexes of the rotating picture are always in contact with a frame of the display, the rotating picture does not exceed the frame of the display, and the second rotating angle is the rotating angle of the rotating picture;
s63, controlling the display to display the rotating picture;
the at least two opposite vertexes of the rotated picture are always in contact with the frame of the display, and the implementation manner that the rotated picture does not exceed the frame of the display may be: the controller may determine the scaling factor of the picture according to the first rotation angle in various ways. For example, in one possible embodiment, the controller may rotate the ratio of the diagonals of the front and rear pictures to determine the zoom factor N of the picture.
A flowchart of a zoom factor calculation method is shown according to one possible embodiment, in which the controller is further configured to perform steps (1) to (3).
In response to at least two fingers of the user touching the display, step (1) is executed to calculate an initial diagonal value, where the initial diagonal value is the length of the diagonal of the picture displayed by the display before the user's fingers rotate;
For example, suppose the width and height of the displayed picture before the user's fingers rotate are w and h, respectively. Then the initial diagonal value d = (w² + h²)^(1/2).
In response to the rotation of the fingers, step (2) is executed to calculate the current diagonal value according to the first rotation angle;
There are various ways to calculate the current diagonal value according to the first rotation angle. For example, fig. 18 is a flowchart of the current diagonal calculation method according to one possible embodiment, in which the controller is further configured to perform steps S71 to S73.
S71, judging whether the aspect ratio of the display is consistent with that of the picture;
In some feasible embodiments, the aspect ratio of the display and the aspect ratio of the picture may be stored in advance; in other feasible embodiments, the aspect ratio of the display may be calculated from the width and height of the display, and the aspect ratio of the picture may be calculated from the width and height of the picture.
If the aspect ratio of the display is consistent with the aspect ratio of the picture, performing step S721 to calculate a current diagonal value according to the first rotation angle and the height of the display;
For example, fig. 19A is a schematic diagram of a display interface during a picture rotation process according to a possible embodiment. In this embodiment, the aspect ratio of the display is 16/9 and the aspect ratio of the picture is 16/9; the effect of the picture on the display can be seen in the left diagram of fig. 19A. The user rotates the fingers so that the controller rotates the picture by α; the display effect of the rotated picture on the display can refer to the right diagram in fig. 19A. The included angle between the width of the rotated picture and the width of the display is α, and the included angle between the diagonal of the rotated picture and the width of the picture is γ. In this embodiment, when the first rotation angle is less than 90°, the length of the current diagonal is obtained from the sine function as d(α) = dh/sin(α + γ), where γ = arctan(h/w), dh is the height of the display, h is the height of the picture, and w is the width of the picture.
For example, fig. 19B is a schematic diagram of a display interface during a picture rotation process according to a possible embodiment. In this embodiment, the aspect ratio of the display is 16/9 and the aspect ratio of the picture is 16/9; the effect of the picture on the display can be seen in the left diagram of fig. 19B. The user rotates the fingers so that the controller rotates the picture by α; the display effect of the rotated picture on the display can refer to the right diagram in fig. 19B. The included angle between the width of the rotated picture and the width of the display is α, and the included angle between the diagonal of the rotated picture and the width of the picture is γ. In this embodiment, when the first rotation angle is greater than 90°, the length of the current diagonal is obtained from the sine function as d(α) = dh/sin(α - γ), where γ = arctan(h/w), dh is the height of the display, h is the height of the picture, and w is the width of the picture.
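The two cases of step S721 can be sketched together in Python (our function; angles in degrees, with dh, h, and w as defined above):

```python
import math

def current_diagonal_same_ratio(alpha, dh, w, h):
    """Step S721: current diagonal value when the display and the picture
    share an aspect ratio; alpha is the first rotation angle in degrees."""
    gamma = math.atan(h / w)            # angle between picture diagonal and width
    a = math.radians(alpha)
    if alpha < 90:
        return dh / math.sin(a + gamma)  # Fig. 19A case
    return dh / math.sin(a - gamma)      # Fig. 19B case

# For a 16:9 display (dh = 1080) and a 1920x1080 picture, the result is the
# same for alpha and 180 - alpha, since sin((180 - alpha) - gamma) = sin(alpha + gamma).
```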
If the aspect ratio of the display is not consistent with the aspect ratio of the picture, step S722 is executed to calculate a first diagonal value according to the first rotation angle and the height of the display, and to calculate a second diagonal value according to the first rotation angle and the width of the display;
Step S73, selecting the smaller of the first diagonal value and the second diagonal value as the current diagonal value;
For example, fig. 20A is a schematic diagram of a display presentation interface during a picture rotation process according to one possible embodiment. In this embodiment, the aspect ratio of the display is 16/9 and the aspect ratio of the picture is 7/2; the effect of the picture on the display can be seen in the left diagram of fig. 20A. The user rotates the fingers so that the controller rotates the picture by α (α < 90 degrees); the display effect of the rotated picture can refer to the right diagram in fig. 20A. The included angle between the width of the rotated picture and the width of the display is α, and the included angle between the diagonal of the rotated picture and the width of the picture is γ. In this embodiment, when the first rotation angle is less than 90 degrees, the first diagonal value d(α)1 = dh/sin(α + γ) and the second diagonal value d(α)2 = dw/sin(α + γ); in this embodiment d(α)2 is smaller than d(α)1, so d(α)2 is selected as the current diagonal value.
Fig. 20B is a schematic diagram of a display interface during a picture rotation process according to one possible embodiment. In this embodiment, the aspect ratio of the display is 16/9 and the aspect ratio of the picture is 7/2; the effect of the picture on the display can be seen in the left diagram of fig. 20B. The user rotates the fingers so that the controller rotates the picture by α (α > 90 degrees); the display effect of the rotated picture can refer to the right diagram in fig. 20B. The included angle between the width of the rotated picture and the width of the display is α, and the included angle between the diagonal of the rotated picture and the width of the picture is γ. In this embodiment, when the first rotation angle is greater than 90°, the first diagonal value d(α)1 = dh/sin(α - γ) and the second diagonal value d(α)2 = dw/sin(α - γ); in this embodiment d(α)2 is smaller than d(α)1, so d(α)2 is selected as the current diagonal value.
Fig. 21A is a schematic diagram of a display interface during a picture rotation process according to a possible embodiment. In this embodiment, the aspect ratio of the display is 16/9 and the aspect ratio of the picture is 2/1; the effect of the picture on the display can be seen in the left diagram of fig. 21A. The user rotates the fingers so that the controller rotates the picture by α (α < 90 degrees); the display effect of the rotated picture can refer to the right diagram in fig. 21A. The included angle between the width of the rotated picture and the width of the display is α, and the included angle between the diagonal of the rotated picture and the width of the picture is γ. In this embodiment, when the first rotation angle is less than 90 degrees, the first diagonal value d(α)1 = dh/sin(α + γ) and the second diagonal value d(α)2 = dw/sin(α + γ); in this embodiment d(α)1 is smaller than d(α)2, so d(α)1 is selected as the current diagonal value.
Fig. 21B is a schematic diagram of a display interface during a picture rotation process according to a possible embodiment. In this embodiment, the aspect ratio of the display is 16/9 and the aspect ratio of the picture is 2/1; the effect of the picture on the display can be seen in the left diagram of fig. 21B. The user rotates the fingers so that the controller rotates the picture by α (α > 90 degrees); the display effect of the rotated picture can refer to the right diagram in fig. 21B. The included angle between the width of the rotated picture and the width of the display is α, and the included angle between the diagonal of the rotated picture and the width of the picture is γ. In this embodiment, when the first rotation angle is greater than 90°, the first diagonal value d(α)1 = dh/sin(α - γ) and the second diagonal value d(α)2 = dw/sin(α - γ); in this embodiment d(α)1 is smaller than d(α)2, so d(α)1 is selected as the current diagonal value.
Step (3), calculating the zoom factor of the picture according to the initial diagonal value and the current diagonal value.
Zoom factor N = d(α)/d.
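For the matching-aspect-ratio case, steps (1) to (3) combine into a single zoom-factor function (our sketch, assuming the picture initially fills the display; note that at a 90-degree rotation of a 16:9 picture on a 16:9 display the factor reduces to dh/dw = 9/16):

```python
import math

def zoom_factor(alpha, dw, dh):
    """Steps (1)-(3) for a picture that initially fills a display of width dw
    and height dh with the same aspect ratio: N = d(alpha)/d, so the rotated
    picture always fits with two opposite vertexes on the display boundary."""
    w, h = dw, dh                      # picture initially fills the display
    d = (w ** 2 + h ** 2) ** 0.5       # step (1): initial diagonal value
    gamma = math.atan(h / w)
    a = math.radians(alpha)
    s = math.sin(a + gamma) if alpha < 90 else math.sin(a - gamma)
    d_alpha = dh / s                   # step (2): current diagonal value
    return d_alpha / d                 # step (3): zoom factor N

# zoom_factor(90, 1920, 1080) ≈ 0.5625 (= dh/dw, the picture fully rotated)
```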
With the display device shown in this embodiment, the controller can determine the scaling multiple of the picture according to the first rotation angle, so that the rotated picture obtained each time can be fully displayed in the display; that is, at least two opposite vertexes of the rotated picture are always in contact with the frame of the display and the rotated picture does not exceed the frame of the display, giving a better user experience.
This completes the description of the present embodiment.
In some application scenarios, shaking of the user's fingers or other causes may make the rotated picture displayed on the display jitter during the rotation process, which affects the user experience. To further improve the user experience, this embodiment shows a picture anti-jitter method. Referring specifically to fig. 22, which is a flowchart of the picture anti-jitter method according to a feasible embodiment, the controller is further configured to execute steps S81 to S852.
In response to the rotation of the fingers, step S81 is executed to count the rotation time;
s82, calculating a prediction angle according to the rotation time and the prediction rotation rate;
for example, in a possible embodiment, the rotation time is T, the first rotation angle is α, and the predicted angle = (α/T) × current time + initial angle.
Of course, in some feasible embodiments the predicted rotation rate may be stored in advance.
S83, calculating a difference value between the predicted angle and the first rotation angle; the method for calculating the first rotation angle can refer to the above embodiments and is not described herein.
S84, judging whether the difference value is smaller than or equal to an angle difference threshold value;
if the difference between the predicted angle and the first rotation angle is less than or equal to the angle difference threshold, step S851 is performed to control the picture to rotate such that a second rotation angle is associated with the first rotation angle, the second rotation angle being the rotation angle for rotating the picture.
If the difference value between the predicted angle and the first rotation angle is greater than the angle difference threshold, step S852 is executed and the picture is not controlled to rotate;
According to the technical scheme shown in this embodiment, when the difference between the predicted angle and the first rotation angle is greater than the angle difference threshold, the rotation is likely the result of the user's misoperation, and in this case the controller does not calculate the first rotation angle from the initial angle and the current angle. With this scheme, on the one hand the data processing amount of the controller is reduced, and on the other hand picture jitter caused by user misoperation is avoided, giving a better user experience.
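The anti-jitter check of steps S82 to S84 can be sketched as follows (our reading of the embodiment: the predicted rotation rate is supplied from outside, e.g. estimated from earlier samples or stored in advance, and the measured angle is compared against the prediction):

```python
def accept_rotation(measured_angle, elapsed_time, predicted_rate,
                    initial_angle, angle_diff_threshold):
    """Steps S82-S84: predict the angle from the rotation rate and accept the
    measured angle only if it is close to the prediction; otherwise the sample
    is treated as jitter/misoperation and the picture is not rotated (S852)."""
    predicted = predicted_rate * elapsed_time + initial_angle       # step S82
    return abs(predicted - measured_angle) <= angle_diff_threshold  # steps S83-S84

# rate 90 deg/s, 0.5 s after an initial angle of 30 deg -> predicted 75 deg:
# a measured 70 deg is accepted; a measured 120 deg is rejected as jitter.
```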
A second aspect of embodiments of the present application shows a display device, including:
a display;
the rotating assembly is used for connecting the display and driving the display to rotate;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
generating a first rotation angle based on a rotation gesture input by a user, wherein the first rotation angle is a rotation angle of the rotation gesture;
if the rotating assembly is determined not to have the condition of driving the display to rotate, drawing a rotating picture according to the first rotating angle so as to enable a second rotating angle to be associated with the first rotating angle, wherein at least two opposite vertexes of the rotating picture are always in contact with a frame of the display, the rotating picture does not exceed the frame of the display, and the second rotating angle is the rotating angle of the rotating picture;
if it is determined that the rotating assembly has the condition of driving the display to rotate, controlling the rotating assembly to drive the display to rotate based on the first rotating angle, so that a third rotating angle is associated with the first rotating angle, at least two opposite vertexes of the rotating picture are always in contact with a frame of the display, the rotating picture does not exceed the frame of the display, and the third rotating angle is the rotating angle of the display.
The implementation manner of judging whether the rotating assembly has the condition of driving the display to rotate may also be:
for example, in practical applications, in order to prevent the display device from rotating due to a user's misoperation, a rotation switch may be disposed in the display device, and the controller may control the rotation assembly to rotate the display device when the rotation switch is in a connected state, and the controller may not control the rotation assembly to rotate when the rotation switch is in a chopping state. In a possible embodiment, if the rotary switch is in the chopping state, the display is controlled to display a prompt message, and the prompt message is used for prompting a user that the rotary switch is in the chopping state in the embodiment.
For example, in one possible embodiment, the controller may read the state of the rotation switch; if the rotation switch is in the disconnected state, the controller controls the display to display a prompt message for prompting the user that the rotation switch is in the disconnected state; if the rotation switch is in the connected state, the controller controls the rotation assembly to drive the display to rotate.
In practical application, the implementation of determining whether the rotating assembly has the condition of driving the display to rotate may be, but is not limited to, the above several manners, which are not described in detail herein.
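The dispatch between rotating the display and rotating the picture can be sketched as follows (all names are ours; in a real device the condition flag would come from reading the rotation switch or the external-interface identifier described above):

```python
def dispatch_rotation(first_angle, can_rotate_display):
    """Second-aspect behavior: if the rotating assembly has the condition to
    drive the display (e.g. its rotation switch is connected), rotate the
    display; otherwise draw the rotated picture instead."""
    if can_rotate_display:
        return ("rotate_display", first_angle)  # third rotation angle
    return ("rotate_picture", first_angle)      # second rotation angle
```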
The display device shown in this embodiment includes a display, a rotating component, and a controller, and the controller may determine to control the picture to rotate or control the display to rotate according to whether the rotating component has a condition for rotating the display. The embodiment shows that the controller of the display device can determine to draw the rotating picture according to the first rotating angle, so that all the rotating pictures obtained each time can be displayed in the display, and the user experience is better.
A third aspect of embodiments of the present application shows a display device including:
a display;
the external interface is used for connecting a rotating assembly so that the rotating assembly can drive the display to rotate;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
generating a first rotation angle based on a rotation gesture input by a user, wherein the first rotation angle is a rotation angle of the rotation gesture;
if the rotating assembly is determined not to have the condition of driving the display to rotate, drawing a rotating picture according to the first rotating angle so as to enable a second rotating angle to be associated with the first rotating angle, wherein at least two opposite vertexes of the rotating picture are always in contact with a frame of the display, the rotating picture does not exceed the frame of the display, and the second rotating angle is the rotating angle of the rotating picture;
if it is determined that the rotating assembly has the condition of driving the display to rotate, controlling the rotating assembly to drive the display to rotate based on the first rotating angle, so that a third rotating angle is associated with the first rotating angle, at least two opposite vertexes of the rotating picture are always in contact with a frame of the display, the rotating picture does not exceed the frame of the display, and the third rotating angle is the rotating angle of the display.
In this embodiment, the implementation of determining whether the rotating assembly has the condition of driving the display to rotate may also be: in a feasible embodiment, the controller may read an identifier of an external interface, where the external interface is used to connect the rotating assembly; when the rotating assembly is inserted into the external interface, the identifier is a first identifier, and when the rotating assembly is separated from the external interface, the identifier is switched to a second identifier. If the identifier is the first identifier, the rotating assembly is connected to the controller, and the controller controls the rotating assembly to drive the display to rotate; if the identifier is the second identifier, the controller controls the display to display prompt information for prompting the user that the rotating assembly is not inserted into the external interface.
The display device shown in this embodiment includes a display, an external interface, and a controller, and the controller can determine to control the image rotation or control the display rotation according to whether the rotation component has a condition of driving the display to rotate. The embodiment shows that the controller of the display device can determine to draw the rotating picture according to the first rotating angle, so that all the rotating pictures obtained each time can be displayed in the display, and the user experience is better.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, and when executed, the program may include some or all of the steps of the embodiments provided by the present invention. The computer storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of software products, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method in the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and these modifications or substitutions do not depart from the scope of the technical solutions of the embodiments of the present application. The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (9)

1. A display device, comprising:
a display;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
generating a first rotation angle based on a rotation gesture formed by at least two fingers of a user contacting a display and at least one finger moving, wherein the first rotation angle is a rotation angle of the rotation gesture;
drawing a rotating picture according to the first rotating angle, so that a second rotating angle is associated with the first rotating angle, at least two opposite vertexes of the rotating picture are always in contact with a display boundary of the display, the rotating picture does not exceed the display boundary of the display, and the second rotating angle is the rotating angle of the rotating picture;
controlling the display to show the rotated picture.
2. The display device of claim 1, wherein the controller is further configured to:
responding to the touch of at least two fingers of a user on the display, and calculating an initial diagonal value, wherein the initial diagonal value is the length of a diagonal of a picture displayed by the display before the fingers of the user rotate;
responding to the rotation of the finger, and calculating a current diagonal value according to the first rotation angle;
calculating a zoom factor of the picture according to the initial diagonal value and the current diagonal value;
and in the process of drawing the rotating picture, controlling the picture to be reduced or enlarged based on the zoom factor.
3. The display device of claim 2, wherein the controller is further configured to:
reading an aspect ratio of the display and an aspect ratio of the picture in response to a user contacting the display with at least two fingers and at least one finger moving;
if the aspect ratio of the display is consistent with that of the picture, calculating the current diagonal value according to the first rotation angle and the height of the display
if the aspect ratio of the display is inconsistent with the aspect ratio of the picture, calculating a first diagonal value according to the first rotation angle and the height of the display, and calculating a second diagonal value according to the first rotation angle and the width of the display;
and selecting a smaller value from the first diagonal value and the second diagonal value as the current diagonal value.
4. The display device of claim 1, wherein the controller is further configured to:
when a display displays a picture, responding to the touch of at least two fingers of a user on the display, and calculating an initial angle, wherein the initial angle is an included angle between a connecting line between the two fingers before the fingers of the user rotate and a preset reference line;
responding to the rotation of the fingers, and calculating a current angle, wherein the current angle is an included angle between a connecting line between the two fingers and a preset reference line when the fingers of the user rotate;
and calculating a first rotation angle according to the initial angle and the current angle.
5. The display device according to claim 1, wherein the controller is further configured to:
when a display displays a picture, responding to the touch of at least two fingers of a user on the display, and acquiring an initial reference line, wherein the initial reference line is a connecting line between the two fingers before the fingers of the user rotate;
responding to the rotation of the fingers, and acquiring a current reference line, wherein the current reference line is a connecting line between the two fingers when the fingers of the user rotate;
and calculating a first rotation angle according to the initial reference line and the current reference line.
6. The display device according to any one of claims 1-5, wherein the controller is further configured to:
generating a rotation identifier, wherein the rotation identifier is generated from the first rotation angle and the absolute value of the rotation angle, and the absolute value of the rotation angle is the absolute value of the first rotation angle;
if the rotation identifier is larger than 0, controlling the picture to rotate clockwise by a first rotation angle;
and if the rotation identifier is less than 0, controlling the picture to rotate in the counterclockwise direction by a first rotation angle.
7. The display device of claim 5, wherein the controller is further configured to:
counting a rotation time in response to a rotation of a finger of a user;
calculating a predicted angle according to the rotation time and the predicted rotation rate;
if the difference value between the predicted angle and the current angle is greater than an angle difference threshold value, not calculating a first rotation angle, wherein the first rotation angle is a rotation angle generated based on a rotation gesture formed by at least two fingers of a user contacting a display and at least one finger moving;
and if the difference value of the predicted angle and the current angle is less than or equal to an angle difference threshold value, calculating the first rotation angle.
8. A display device, comprising:
a display;
the rotating assembly is used for connecting the display and driving the display to rotate;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
generating a first rotation angle based on a rotation gesture input by a user, wherein the first rotation angle is a rotation angle of the rotation gesture;
if it is determined that the rotating assembly does not have the condition of driving the display to rotate, drawing a rotating picture according to the first rotating angle, so that a second rotating angle is associated with the first rotating angle, at least two opposite vertexes of the rotating picture are always in contact with a display boundary of the display, the rotating picture does not exceed the display boundary of the display, and the second rotating angle is the rotating angle of the rotating picture;
determining that the rotating assembly has a condition for driving the display to rotate, and controlling the rotating assembly to drive the display to rotate based on the first rotating angle, so that a third rotating angle is associated with the first rotating angle, at least two opposite vertexes of the rotating picture are always in contact with a display boundary of the display, the rotating picture does not exceed the display boundary of the display, and the third rotating angle is the rotating angle of the display.
9. A display device, comprising:
a display;
an external interface configured to connect a rotating assembly, so that the rotating assembly can drive the display to rotate;
a touch component configured to detect a touch trajectory input by a user;
a controller configured to:
generating a first rotation angle based on a rotation gesture input by a user, wherein the first rotation angle is a rotation angle of the rotation gesture;
if it is determined that the rotating assembly is not in a condition to drive the display to rotate, drawing a rotating picture according to the first rotation angle, so that a second rotation angle is associated with the first rotation angle, at least two opposite vertices of the rotating picture are always in contact with a display boundary of the display, and the rotating picture does not exceed the display boundary of the display, wherein the second rotation angle is the rotation angle of the rotating picture;
if it is determined that the rotating assembly is in a condition to drive the display to rotate, controlling the rotating assembly to drive the display to rotate based on the first rotation angle, so that a third rotation angle is associated with the first rotation angle, at least two opposite vertices of the rotating picture are always in contact with a display boundary of the display, and the rotating picture does not exceed the display boundary of the display, wherein the third rotation angle is the rotation angle of the display.
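The geometric constraint in claims 8 and 9 — the rotating picture stays inside the display while at least two opposite vertices touch the display boundary — can be sketched as a uniform-scale fit of the rotated picture's bounding box, an assumed reading of the claim rather than the patent's actual method:

```python
import math

# Hypothetical sketch: the largest uniform scale at which a picture of
# size (pic_w, pic_h), rotated by angle_deg, fits inside a display of
# size (disp_w, disp_h). At this scale, the two opposite vertices that
# realize the constraining bounding-box extent touch the display boundary.


def fit_scale(pic_w: float, pic_h: float,
              disp_w: float, disp_h: float,
              angle_deg: float) -> float:
    t = math.radians(angle_deg)
    # Axis-aligned bounding box of the rotated picture.
    bbox_w = pic_w * abs(math.cos(t)) + pic_h * abs(math.sin(t))
    bbox_h = pic_w * abs(math.sin(t)) + pic_h * abs(math.cos(t))
    return min(disp_w / bbox_w, disp_h / bbox_h)
```

Under this reading, a 16:9 picture on a 16:9 display shrinks as it rotates (down to height/width = 9/16 of full size at 90 degrees) and grows back as it approaches 180 degrees, with opposite corners riding along the display edge throughout.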
CN202110064704.5A 2021-01-18 2021-01-18 Display device Active CN112650418B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110064704.5A CN112650418B (en) 2021-01-18 2021-01-18 Display device
PCT/CN2021/102319 WO2022151662A1 (en) 2021-01-18 2021-06-25 Display device
US18/348,740 US20230350567A1 (en) 2021-01-18 2023-07-07 Display apparatus and display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110064704.5A CN112650418B (en) 2021-01-18 2021-01-18 Display device

Publications (2)

Publication Number Publication Date
CN112650418A CN112650418A (en) 2021-04-13
CN112650418B true CN112650418B (en) 2022-11-29

Family

ID=75368279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110064704.5A Active CN112650418B (en) 2021-01-18 2021-01-18 Display device

Country Status (1)

Country Link
CN (1) CN112650418B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022151662A1 (en) * 2021-01-18 2022-07-21 海信视像科技股份有限公司 Display device

Citations (6)

Publication number Priority date Publication date Assignee Title
CN103279290A (en) * 2013-05-29 2013-09-04 广东欧珀移动通信有限公司 Mobile device display method and mobile device
CN106648378A (en) * 2017-01-04 2017-05-10 北京奇虎科技有限公司 Image display method, device and mobile terminal
CN109242976A (en) * 2018-08-02 2019-01-18 实野信息科技(上海)有限公司 A method of based on the automatic rotary display of WebGL virtual reality
CN109901778A (en) * 2019-01-25 2019-06-18 湖南新云网科技有限公司 A kind of page object rotation Zoom method, memory and smart machine
CN111309232A (en) * 2020-02-24 2020-06-19 北京明略软件系统有限公司 Display area adjusting method and device
CN111913608A (en) * 2020-07-31 2020-11-10 海信视像科技股份有限公司 Touch screen rotation control interaction method and display device

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8619100B2 (en) * 2009-09-25 2013-12-31 Apple Inc. Device, method, and graphical user interface for touch-based gestural input on an electronic canvas


Also Published As

Publication number Publication date
CN112650418A (en) 2021-04-13

Similar Documents

Publication Publication Date Title
CN112799627B (en) Display apparatus and image display method
CN113810746B (en) Display equipment and picture sharing method
CN112672195A (en) Remote controller key setting method and display equipment
CN112584211B (en) Display equipment
CN112087671B (en) Display method and display equipment for control prompt information of input method control
CN111901646A (en) Display device and touch menu display method
CN112650418B (en) Display device
CN112947783B (en) Display device
CN114157889B (en) Display equipment and touch control assisting interaction method
CN113593488A (en) Backlight adjusting method and display device
CN113490024A (en) Control device key setting method and display equipment
CN114302204A (en) Split-screen playing method and display device
CN116264864A (en) Display equipment and display method
CN112601109A (en) Audio playing method and display device
CN112732120A (en) Display device
CN112860331B (en) Display equipment and voice interaction prompting method
CN111935530B (en) Display equipment
CN113485614A (en) Display apparatus and color setting method
CN113542882A (en) Method for awakening standby display device, display device and terminal
CN114302070A (en) Display device and audio output method
CN114281284B (en) Display apparatus and image display method
CN112668546A (en) Video thumbnail display method and display equipment
CN112926420A (en) Display device and menu character recognition method
CN113064534A (en) Display method and display equipment of user interface
CN114417035A (en) Picture browsing method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant