CN113497965B - Configuration method of rotary animation and display device - Google Patents

Configuration method of rotary animation and display device

Info

Publication number
CN113497965B
CN113497965B (application number CN202010202433.0A)
Authority
CN
China
Prior art keywords
rotation
angle
display
component
controller
Prior art date
Legal status
Active
Application number
CN202010202433.0A
Other languages
Chinese (zh)
Other versions
CN113497965A (en)
Inventor
孟亚州
刘承龙
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202010202433.0A
Publication of CN113497965A
Application granted granted Critical
Publication of CN113497965B


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/64Constructional details of receivers, e.g. cabinets or dust covers
    • H04N5/655Construction or mounting of chassis, e.g. for varying the elevation of the tube
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An embodiment of the present application discloses a rotation animation configuration method and a display device. The display device comprises: a display; a rotation assembly coupled to the display and configured to drive the display to rotate; and a controller configured to: in response to receiving a control instruction input by a user instructing the display to rotate, control the rotation assembly to drive the display to rotate; acquire a component rotation angle and a component rotation rate of the rotation assembly; generate a predicted angle based on the component rotation angle and the component rotation rate; and configure a rotation animation based on the predicted angle.

Description

Configuration method of rotary animation and display device
Technical Field
The present disclosure relates to the technical field of smart televisions, and in particular to a rotation animation configuration method and a display device.
Background
Smart televisions support various applications, such as traditional video applications, social applications such as short-video apps, and reading applications such as comic and e-book readers. These applications can use the screen of the smart television to display their pictures, providing the smart television with rich media resources. Meanwhile, the smart television can also exchange data and share resources with different terminals. For example, the smart television may be connected to a mobile phone through wireless communication such as a local area network or Bluetooth, so as to play resources stored on the phone or directly project the phone's picture onto the television screen.
However, because pictures from different applications or media sources have different aspect ratios, smart televisions are often used to display pictures whose proportions differ from traditional video. For example, video resources shot with a terminal such as a mobile phone are portrait assets with aspect ratios of 9:16, 9:18, 3:4, and the like, while the pages provided by reading applications are portrait resources with an aspect ratio similar to a book. The display screen of a smart television is generally in a landscape ratio such as 16:9, so when the smart television displays portrait media assets such as short videos and comics, the mismatch between the picture ratio and the screen ratio prevents the portrait picture from being displayed normally. The portrait picture usually has to be scaled down to be shown completely, which both wastes screen space and degrades the user experience.
Disclosure of Invention
Based on the above technical problems, an object of the present application is to provide a configuration method of a rotary animation and a display device.
A first aspect of an embodiment of the present application provides a display device, comprising: a display;
a rotation assembly coupled to the display and configured to drive the display to rotate;
a controller configured to:
in response to receiving a control instruction input by a user instructing the display to rotate, control the rotation assembly to drive the display to rotate;
acquire a component rotation angle and a component rotation rate of the rotation assembly;
generate a predicted angle based on the component rotation angle and the component rotation rate;
configure a rotation animation based on the predicted angle.
A second aspect of an embodiment of the present application provides a rotation animation configuration method, comprising:
in response to receiving a control instruction input by a user instructing the display to rotate, controlling a rotation assembly to drive the display to rotate;
acquiring a component rotation angle and a component rotation rate of the rotation assembly;
generating a predicted angle based on the component rotation angle and the component rotation rate;
configuring a rotation animation based on the predicted angle.
As can be seen from the above technical solutions, the embodiments of the present application provide a rotation animation configuration method and a display device. The display device includes: a display; a rotation assembly coupled to the display and configured to drive the display to rotate; and a controller configured to: in response to receiving a control instruction input by a user instructing the display to rotate, control the rotation assembly to drive the display to rotate; acquire a component rotation angle and a component rotation rate of the rotation assembly; generate a predicted angle based on the component rotation angle and the component rotation rate; and configure a rotation animation based on the predicted angle. In the display device shown in the embodiments of the present application, the controller can generate the predicted angle based on the component rotation angle and the component rotation rate, where the predicted angle is the rotation angle the display is expected to have reached at the time the rotation animation frame is actually drawn, that is, when drawing of the frame is completed. Therefore, the display device of this embodiment converts the collected component rotation angle into the predicted angle and draws the rotation animation based on the predicted angle. Drawing the rotation animation in this way ensures that each drawn animation frame matches the user's viewing angle, improving the user experience.
Drawings
In order to more clearly illustrate the technical solution of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1A is an application scenario diagram of a display device according to some embodiments of the present application;
FIG. 1B is a rear view of a display device according to some embodiments of the present application;
FIG. 2 is a block diagram illustrating a hardware configuration of the control device 100 of FIG. 1A according to some embodiments of the present application;
FIG. 3 is a block diagram of a hardware configuration of the display device 200 of FIG. 1A according to some embodiments of the present application;
FIG. 4 is a block diagram of an architecture configuration of an operating system in a memory of a display device 200 according to some embodiments of the present application;
FIG. 5 is a schematic diagram of a display at 20ms according to one embodiment;
FIG. 6 is a schematic diagram of a display interface of a display;
FIG. 7 is a flow chart illustrating operation of a display device according to one embodiment;
FIG. 8A is a schematic diagram illustrating a display interface during rotation of a display according to one embodiment;
FIG. 8B is a schematic diagram illustrating a display interface during rotation of a display according to one embodiment;
FIG. 9A is a schematic diagram of a display interface during rotation of a display;
FIG. 9B is a schematic diagram of a display interface during rotation of the display;
FIG. 10 is a flow chart illustrating operation of a display device according to one embodiment;
FIG. 11 is a schematic diagram of a remote control shown according to an embodiment;
FIG. 12A is a schematic diagram illustrating a display interface during rotation, according to one embodiment;
FIG. 12B is a schematic diagram illustrating a display interface during rotation, according to one embodiment.
Detailed Description
In order to make the technical solution of the present application better understood by those skilled in the art, the technical solution of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, shall fall within the scope of the present application.
The rotary television is a new type of smart television, mainly comprising a display and a rotating assembly. The display is fixed to a wall or a stand through the rotating assembly, and the placement angle of the display can be adjusted by the rotating assembly so that the display can rotate to suit display pictures of different aspect ratios. For example, the display is mostly placed horizontally to show video pictures with aspect ratios of 16:9, 18:9, and the like. When the aspect ratio of a video picture is 9:16, 9:18, and the like, the landscape display has to scale the picture down and show black areas on both sides. In that case, the display can be placed vertically via the rotating assembly to accommodate video pictures in 9:16, 9:18, and similar proportions.
To allow a user to view a target media detail page in both landscape and portrait states of the display and to improve the viewing experience of the display device in different viewing states, embodiments of the present application provide a display device, a detail page display method, and a computer storage medium, where the display device is a rotary television. It should be noted that the method provided in this embodiment is applicable not only to a rotary television but also to other display devices, such as a computer or a tablet computer.
The term "module" as used in various embodiments of the present application may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used in the various embodiments of the present application refers to a component of an electronic device (such as a display device as disclosed herein) that can typically wirelessly control the electronic device over a relatively short range of distances. The assembly may be connected to the electronic device generally using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in a general remote control device with a touch screen user interface.
The term "gesture" as used in embodiments of the present application refers to a user's behavior through a change in hand or motion of the hand, etc., for expressing an intended idea, action, purpose, and/or result.
The term "hardware system" as used in embodiments of the present application may refer to a physical component comprising mechanical, optical, electrical, magnetic devices such as integrated circuits (Integrated Circuit, ICs), printed circuit boards (Printed circuit board, PCBs) with computing, control, storage, input and output functions. In various embodiments of the present application, the hardware system will also be commonly referred to as a motherboard (or a host chip or controller).
Referring to fig. 1A, an application scenario diagram of a display device according to some embodiments of the present application is provided. As shown in fig. 1A, communication between the control apparatus 100 and the display device 200 may be performed in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it can receive operation instructions input by the user, convert them into instructions that the display device 200 can recognize and respond to, and act as an intermediary for interaction between the user and the display device 200. For example, when the user operates the channel up/down keys on the control apparatus 100, the display device 200 responds to the channel up/down operation.
The control device 100 may be a remote control 100A, including an infrared protocol communication or a bluetooth protocol communication, and other short-range communication modes, and the display apparatus 200 is controlled by a wireless or other wired mode. The user may control the display device 200 by inputting user instructions through keys on a remote control, voice input, control panel input, etc. Such as: the user can input corresponding control instructions through volume up-down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, on-off keys, etc. on the remote controller to realize the functions of the control display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, or the like. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user through an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 100B may install a software application with the display device 200, implement connection communication through a network communication protocol, and achieve the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200 to implement functions such as physical buttons arranged by the remote controller 100A by operating various function keys or virtual controls of a user interface provided on the mobile terminal 100B. The audio/video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display device 200 may provide a broadcast receiving function and an intelligent network television function supported by computer functions. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection device. The specific display device type, size, resolution, etc. are not limited.
The display device 200 is also in data communication with the server 300 via a variety of communication means. Display device 200 may be permitted to communicate via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display device 200. By way of example, the display device 200 may send and receive information, such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be one group, may be multiple groups, and may be one or more types of servers. Other web service content such as video on demand and advertising services are provided through the server 300.
In some embodiments, as shown in fig. 1B, the display device 200 includes a controller 250, a display 275, a terminal interface 278 extending from a gap in the back plate, and a rotating component 276 connected to the back plate. The rotating component 276 can rotate the display screen so that, viewed from the front of the display device, the screen is in a portrait state, i.e., a state in which the vertical side of the screen is longer than the horizontal side, or rotate it to a landscape state, i.e., a state in which the horizontal side of the screen is longer than the vertical side.
A block diagram of the configuration of the control apparatus 100 is exemplarily shown in fig. 2. As shown in fig. 2, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, a user output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM) 111, a Read Only Memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation and operation of the control device 100, as well as the communication collaboration between the internal components, external and internal data processing functions.
For example, when an interaction in which a user presses a key arranged on the remote controller 100A or an interaction in which a touch panel arranged on the remote controller 100A is touched is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
The memory 120 stores various operation programs, data, and applications for driving and controlling the control device 100 under the control of the controller 110. The memory 120 may store various control signal instructions input by a user.
The communicator 130 performs communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the control apparatus 100 sends a control signal (e.g., a touch signal or a key-press signal) to the display device 200 via the communicator 130, and the control apparatus 100 may also receive signals sent by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example, when the infrared signal interface is used, a user input instruction is converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then sent to the display device 200 through the radio frequency sending terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, etc., so that a user may input user instructions regarding controlling the display apparatus 200 to the control device 100 through voice, touch, gesture, press, etc.
The user output interface 150 outputs a user instruction received by the user input interface 140 to the display device 200 or outputs an image or voice signal received by the display device 200. Here, the user output interface 150 may include an LED interface 151, a vibration interface 152 generating vibrations, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal of audio, video, or data from the user output interface 150, and display the output signal as an image on the display 154, as an audio at the sound output interface 153, or as a vibration at the vibration interface 152.
A power supply 160 for providing operating power support for the various elements of the control device 100 under the control of the controller 110. May be in the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily shown in fig. 3. As shown in fig. 3, a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, a rotating component 276, a monitoring component 277, an audio processor 280, an audio output interface 285, and a power supply 290 may be included in the display apparatus 200.
The monitoring component 277 may be provided independently or in the controller.
The rotating assembly 276 may include a drive motor, a rotating shaft, and the like. The drive motor may be connected to the controller 250 and output a rotation angle under the control of the controller 250; one end of the rotating shaft is connected to the power output shaft of the drive motor, and the other end is connected to the display 275, so that the display 275 can be fixedly mounted on a wall or a stand through the rotating assembly 276.
The rotating assembly 276 may also include other components, such as a transmission component, a detection component, and the like. Wherein, the transmission component can adjust the rotation speed and torque output by the rotating component 276 through a specific transmission ratio, and can be in a gear transmission mode; the detection means may be constituted by a sensor provided on the rotation shaft, such as an angle sensor, an attitude sensor, or the like. These sensors may detect parameters such as the angle at which the rotating assembly 276 rotates and send the detected parameters to the controller 250 to enable the controller 250 to determine or adjust the status of the display device 200 based on the detected parameters. In practice, the rotating assembly 276 may include, but is not limited to, one or more of the components described above.
A monitoring component 277 for monitoring component rotation information of the rotation component 276 and outputting the component rotation information to the controller.
The modem 210 receives broadcast television signals through a wired or wireless manner, and may perform modulation and demodulation processes such as amplification, mixing, and resonance, for demodulating an audio/video signal carried in a frequency of a television channel selected by a user and additional information (e.g., EPG data) from among a plurality of wireless or wired broadcast television signals.
Under user selection and the control of the controller 250, the tuning demodulator 210 responds to the frequency of the television channel selected by the user and the television signal carried on that frequency.
The tuning demodulator 210 can receive signals in various ways according to broadcasting systems of television signals, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and the analog signal and the digital signal can be demodulated according to the kind of the received television signal.
In other exemplary embodiments, the modem 210 may also be in an external device, such as an external set-top box or the like. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal to the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display device 200 may transmit content data to an external device connected via the communicator 220, or browse and download content data from an external device connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module such as a WIFI module 221, a bluetooth communication protocol module 222, a wired ethernet communication protocol module 223, etc., so that the communicator 220 may receive a control signal of the control device 100 according to the control of the controller 250 and implement the control signal as a WIFI signal, a bluetooth signal, a radio frequency signal, etc.
The detector 230 is a component of the display device 200 for collecting signals from the external environment or from interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive the user's sound, for example a voice signal carrying a control instruction for the display device 200; alternatively, it may collect ambient sounds for identifying the type of environmental scene, so that the display device 200 can adapt to ambient noise.
In other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera, webcam, etc., that may be used to collect external environmental scenes to adaptively change the display parameters of the display device 200; and the function is used for collecting the attribute of the user or interacting gestures with the user so as to realize the interaction between the display equipment and the user.
In other exemplary embodiments, the detector 230 may further include a light receiver for collecting ambient light intensity to adapt to changes in display parameters of the display device 200, etc.
In other exemplary embodiments, the detector 230 may further include a temperature sensor; for example, by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. Illustratively, when the ambient temperature is high, the display device 200 may be adjusted to display the image with a cooler color temperature; when the ambient temperature is low, the display device 200 may be adjusted to display the image with a warmer color temperature.
The external device interface 240 is a component that provides the controller 250 to control data transmission between the display apparatus 200 and an external device. The external device interface 240 may be connected to an external device such as a set-top box, a game device, a notebook computer, etc., in a wired/wireless manner, and may receive data such as a video signal (e.g., a moving image), an audio signal (e.g., music), additional information (e.g., an EPG), etc., of the external device.
The external device interface 240 may include: any one or more of a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a Red Green Blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the user's operations by running various software control programs (e.g., an operating system and various application programs) stored on the memory 260.
As shown in fig. 3, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphic processor 253, a CPU processor 254, a communication interface 255, a communication bus 256, a rotation processor 257, and an animation processor 258. Wherein the RAM251, the ROM252, and the graphic processor 253, the CPU processor 254, the communication interface 255, the rotation processor 257, and the animation processor 258 are connected through a communication bus 256. The functions of the rotation processor 257 and the animation processor 258 will be described in detail in the following embodiments.
The ROM 252 stores various system boot instructions. When the display apparatus 200 begins to start up after receiving a power-on signal, the CPU processor 254 runs the system boot instructions in the ROM 252 and copies the operating system stored in the memory 260 into the RAM 251 to start the operating system. After the operating system has started, the CPU processor 254 copies the various applications in the memory 260 to the RAM 251 and then starts running them.
The graphic processor 253 generates various graphic objects such as icons, operation menus, and user input instruction display graphics, etc. The graphic processor 253 may include an operator for performing an operation by receiving user input of various interactive instructions, thereby displaying various objects according to display attributes; and a renderer for generating various objects based on the operator, and displaying the result of rendering on the display 275.
CPU processor 254 is operative to execute operating system and application program instructions stored in memory 260. And executing processing of various application programs, data and contents according to the received user input instructions so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor performs some initialization operations of the display device 200 in the preloading mode and/or displays pictures in the normal mode. The one or more sub-processors perform operations while the display device is in standby mode and the like.
Communication interface 255 may include a first interface through an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
The object may be any selectable object, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying the linked page, document, or image, or executing the program corresponding to the object. The user input command for selecting the GUI object may be a command input through an input device (e.g., a mouse, keyboard, or touch pad) connected to the display apparatus 200, or a voice command corresponding to speech uttered by the user.
The memory 260 is used to store various types of data, software programs, or applications that drive and control the operation of the display device 200. Memory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes memory 260, RAM251 and ROM252 of controller 250, or a memory card in display device 200.
In some embodiments, the memory 260 is specifically configured to store an operating program that drives the controller 250 in the display device 200; various application programs built in the display device 200 and downloaded from an external device by a user are stored; data for configuring various GUIs provided by the display 275, various objects related to the GUIs, visual effect images of selectors for selecting GUI objects, and the like are stored.
In some embodiments, the memory 260 is specifically configured to store drivers and related data for the modem 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, etc., such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received from the user interface.
In some embodiments, memory 260 specifically stores software and/or programs for representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (such as the middleware, APIs, or application programs); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to implement control or management of system resources.
An architectural configuration block diagram of the operating system in the memory of the display device 200 is exemplarily shown in fig. 4. The operating system architecture is an application layer, a middleware layer and a kernel layer in sequence from top to bottom.
The application layer: both applications built into the system and non-system applications belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a setup application, an electronic post application, a media center application, and the like. These applications are mainly developed on the Android system, and the development language may be Java/C++. They may also be implemented as Web applications executed on a WebKit engine, and in particular may be developed and executed based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML, which is called a hypertext markup language (HyperText Markup Language) in its entirety, is a standard markup language for creating web pages, which are described by markup tags for describing words, graphics, animations, sounds, tables, links, etc., and a browser reads an HTML document, interprets the contents of tags within the document, and displays them in the form of web pages.
CSS, collectively referred to as cascading style sheets (Cascading Style Sheets), is a computer language used to represent the style of HTML files and may be used to define style structures such as fonts, colors, positions, and the like. The CSS style can be directly stored in an HTML webpage or a separate style file, so that the control of the style in the webpage is realized.
JavaScript is a language for Web page programming that can be inserted into HTML pages and interpreted by the browser. The interaction logic of a Web application is implemented through JavaScript. By encapsulating a JavaScript extension interface through the browser, JavaScript can also communicate with the kernel layer.
The middleware layer may provide standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as MHEG (Multimedia and Hypermedia information coding Expert Group) middleware related to data broadcasting, as DLNA middleware related to communication with external devices, as middleware providing the browser environment in which applications within the display device run, and so on.
A kernel layer providing core system services such as: file management, memory management, process management, network management, system security authority management and other services. The kernel layer may be implemented as a kernel based on various operating systems, such as a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware at the same time, providing device driver services for various hardware, such as: providing display drivers for display 275, providing camera drivers for cameras, providing key drivers for remote controls, providing WIFI drivers for WIFI modules, providing audio drivers for audio output interfaces, providing Power Management (PM) drivers for power management modules, etc.
In fig. 3, a user interface 265 receives various user interactions. Specifically, an input signal for a user is transmitted to the controller 250, or an output signal from the controller 250 is transmitted to the user. Illustratively, the remote control 100A may send input signals such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by a user to the user interface 265, and then forwarded by the user interface 265 to the controller 250; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data, which is processed by the controller 250 to be output from the user interface 265, and display the received output signal or output the received output signal in the form of audio or vibration.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input command through the GUI. In particular, the user interface 265 may receive user input commands for controlling the position of a selector in a GUI to select different objects or items. Wherein a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of a user interface is a Graphical User Interface (GUI), which refers to a user interface graphically displayed in connection with computer operations. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a control, a menu, a tab, a text box, a dialog box, a status bar, a channel bar, a Widget, etc.
Alternatively, the user may enter a user command by entering a particular sound or gesture, and the user interface 265 recognizes the sound or gesture through the sensor to receive the user input command.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to a standard codec protocol of an input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
By way of example, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
Wherein, the demultiplexing module is used for demultiplexing the input audio/video data stream, such as the input MPEG-2 stream (based on the compression standard of the digital storage media moving image and voice), and then the demultiplexing module demultiplexes the input audio/video data stream into video signals, audio signals and the like.
And the video decoding module is used for processing the demultiplexed video signal, including decoding, scaling and the like.
And an image synthesis module, such as an image synthesizer, for performing superposition mixing processing on the graphic generator and the video image after the scaling processing according to the GUI signal input by the user or generated by the graphic generator, so as to generate an image signal for display.
The frame rate conversion module is configured to convert a frame rate of an input video, for example, convert a frame rate of an input 60Hz video into a frame rate of 120Hz or 240Hz, and a common format is implemented in an inserting frame manner.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format such as display 275, for example, format converting the signal output by the frame rate conversion module to output an RGB data signal.
And a display 275 for receiving image signals from the video processor 270 and displaying video content, images and menu manipulation interfaces. The video content may be displayed from the broadcast signal received by the modem 210, or may be displayed from the video content input by the communicator 220 or the external device interface 240. And a display 275 for simultaneously displaying a user manipulation interface UI generated in the display device 200 and used to control the display device 200.
The display 275 may include a display screen assembly for presenting pictures and a drive assembly for driving the display of images. Alternatively, if the display 275 is a projection display, it may include a projection device and a projection screen.
The rotating assembly 276: the controller may issue control signals to cause the rotating assembly 276 to rotate the display 275.
A monitoring component 277 for monitoring component rotation information of the rotation component 276 and outputting the component rotation information to the controller.
The audio processor 280 is configured to receive an external audio signal, decompress and decode according to a standard codec of an input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played in the speaker 286.
Illustratively, the audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), etc.
An audio output interface 285 receives the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286, or an external audio output terminal 287 such as a headphone output terminal, for output to a sound-producing device of an external apparatus.
In other exemplary embodiments, video processor 270 may include one or more chip components. Audio processor 280 may also include one or more chip components.
And, in other exemplary embodiments, video processor 270 and audio processor 280 may be separate chips or integrated with controller 250 in one or more chips.
The power supply 290 is used for providing power supply support for the display device 200 by power input by an external power supply under the control of the controller 250. The power supply 290 may be a built-in power supply circuit mounted inside the display device 200 or may be a power supply mounted outside the display device 200.
There is a time difference between the rendering timing inside the controller 250 and the timing at which the component rotation angle is acquired. Typically, rendering occurs once every 33 ms, while the controller acquires the component rotation angle 50 times per second, i.e., every 20 ms. The controller 250 therefore obtains the component rotation angle at 20 ms but actually uses it at 33 ms; that is, the frame drawn at 33 ms shows the rotation angle measured at 20 ms. As a result, the actual rotation is not completely synchronized with the rotation of the animation on the screen.
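To make the compensation implied by this mismatch concrete, the following is a minimal sketch in Java, assuming simple linear extrapolation of the rotation; the class and method names and the numeric values in main are illustrative and are not taken from the patent.

    /**
     * Illustrative sketch: extrapolate the rotation angle the display will have
     * reached at render time, given the most recent sampled angle and rate.
     * Assumes the rotation rate stays roughly constant between sample and render.
     */
    public final class RotationPredictor {

        /**
         * @param sampledAngleDeg   component rotation angle measured at sampleTimeMs (degrees)
         * @param rotationRateDegMs component rotation rate (degrees per millisecond)
         * @param sampleTimeMs      time the angle was sampled (e.g. 20 ms)
         * @param renderTimeMs      time the animation frame will be drawn (e.g. 33 ms)
         * @return predicted rotation angle at render time (degrees)
         */
        public static double predictAngle(double sampledAngleDeg,
                                          double rotationRateDegMs,
                                          long sampleTimeMs,
                                          long renderTimeMs) {
            long deltaMs = renderTimeMs - sampleTimeMs;   // 33 - 20 = 13 ms in the example
            return sampledAngleDeg + rotationRateDegMs * deltaMs;
        }

        public static void main(String[] args) {
            // Example figures from the text: angle sampled at 20 ms, frame drawn at 33 ms.
            double beta1 = 3.0;   // hypothetical sampled angle in degrees
            double rate = 0.15;   // hypothetical rate: 0.15 degrees per millisecond
            double predicted = predictAngle(beta1, rate, 20, 33);
            System.out.println("Predicted angle at render time: " + predicted + " degrees"); // 4.95
        }
    }

With the example figures from the text, the sketch simply adds the 13 ms of rotation that occurs between sampling and drawing to the sampled angle.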
Specifically, referring to fig. 5, which is a schematic view of the display at 20 ms, it can be seen that at 20 ms the display has rotated by an angle β1 relative to the user's viewing angle. It should be noted that, in the present application, the user viewing direction refers to the vertical direction; in fig. 5, dashed line A is the user viewing direction and dashed line B is the direction of the central axis of the display. It can be seen that the display is rotated by the angle β1 relative to the user's viewing angle.
Since the controller renders once every 33 ms, the controller starts rendering the second frame of the rotation animation at 33 ms. If the controller configures the rotation animation based on β1 at that moment, the resulting display interface may be as shown in fig. 6; it can be seen that the central axis of the rotation animation displayed on the display (dotted line C) deviates from the user's viewing direction, which degrades the user experience.
Based on the technical problem, the first aspect of the embodiment of the present application shows a display device, and the structure and functions of each component of the display device can be referred to the above embodiment. The function of the controller 250 is described in detail below.
The operation process of the display device may refer to fig. 7; the operation flow chart of fig. 7 applies to the case where the rotation animation configured by the display device is a picture symmetric about its central axis.
The controller 250 is configured to configure a rotation animation in response to receiving a control instruction from a user to instruct the display to rotate, and control the rotation component to rotate the display.
The rotation animation in this embodiment is a picture symmetric about its central axis. The rotation animation rotates together with the display 275 while the display 275 rotates. However, because the rotation animation is symmetric about its central axis, it does not disturb the user's view during rotation. Therefore, in this embodiment, after receiving the rotation instruction carrying the first rotation information, the controller 250 directly sends a motion command to the rotating assembly 276, and the controller 250 configures the rotation animation after receiving the start instruction.
The operation of the display device will be described in detail with reference to specific examples.
In a possible embodiment, the rotation animation of the display device is a hollow sphere. In the initial state, the display 275 of the display device is in the landscape state. The user wants to watch portrait media on the display, such as short videos, or photos and videos generated by shooting or recording with a mobile phone. In this case, the operation process of the display device may continue with reference to fig. 5.
S101: the user outputs a control instruction to the controller 250. In practical applications, the control instruction may be the user's voice, for example "rotate 90 degrees to the left". In this embodiment, the control instruction may also be an operation instruction: specifically, the user may send an operation instruction to the controller 250 through the remote controller. For example, if the first rotation information corresponding to the volume-down key of the remote controller is "rotate 90 degrees to the left", the user triggers the remote controller to send an operation instruction carrying "rotate 90 degrees to the left" to the controller 250 by touching the volume-down key.
S102: after receiving the control instruction, the controller 250 controls the rotating assembly to drive the display to rotate.
For example, if the user's voice is "rotate 90 degrees to the left", the corresponding first rotation information is "rotate 90 degrees to the left".
S103: the controller 250 configures the rotation animation after receiving the control instruction.
S104: the controller 250 outputs the configured rotation animation.
S105: the display 275 displays the rotation animation output by the controller 250.
In this embodiment, referring to fig. 8A and 8B, it can be seen that the rotation animation is a central axis symmetrical picture, and the rotation animation will not affect the user's vision during rotation.
In another possible embodiment, a display device is shown in which the configured rotation animation is a non-axisymmetric picture. For the structure of the display device and the functions of its components, reference may be made to the above embodiments. The functions of the controller 250 are described in detail below.
A rotation animation that is a non-axisymmetric picture also rotates together with the display 275 while the display 275 rotates; fig. 9A and 9B are schematic diagrams of the display interface of the display 275 during rotation. It can be seen that the rotation animation in the figures is an apple, and during rotation of the display 275 the animation rotates with it, causing the picture the user sees to tilt.
To address the above problem, this embodiment shows a display device whose operation process may refer to fig. 10; the operation flow chart of fig. 10 applies to the case where the rotation animation configured by the display device is a picture that is not symmetric about its central axis.
The operation flow of the display device includes the following steps:
s201 the user outputs a control instruction to the controller 250 of the controller.
The control instruction in this embodiment may be the user's voice, for example "rotate XX degrees to the X". The control instruction may also be an operation instruction: specifically, the user may send an operation instruction to the controller 250 through the remote controller. For example, if the first rotation information corresponding to the volume-down key of the remote controller is "rotate XX degrees to the X", the user triggers the remote controller to send the operation instruction by touching the volume-down key.
For the case where the user interacts with the controller 250 by voice and the control instruction is the user's voice, the controller 250 is correspondingly configured to execute step S202: identify the control instruction and generate corresponding first rotation information according to the identification result, where the first rotation information includes a control direction and a control angle.
S203: control the rotating assembly to rotate by the control angle in the control direction.
Specifically, the controller 250 is further configured to identify the user direction and the user angle θi in the user's voice. If the user angle is less than or equal to 180 degrees, the user direction is determined as the control direction and θi as the control angle; if the user angle is greater than 180 degrees, the reverse of the user direction is determined as the control direction and 360 − θi as the control angle.
The determination process of the first rotation information will be described in detail below with reference to specific examples.
In a feasible embodiment, the user's voice is "rotate 30 degrees to the left". After receiving the control instruction carrying "rotate 30 degrees to the left", the controller 250 identifies that the user direction in the control instruction is "rotate left" and the user angle θi is "30 degrees". The first rotation information is determined to be "rotate 30 degrees to the left".
In a feasible embodiment, the user's voice is "rotate 270 degrees to the left". After receiving the control instruction carrying "rotate 270 degrees to the left", the controller 250 identifies that the user direction in the control instruction is "rotate left" and the user angle θi is "270 degrees". The controller 250 determines that 270 degrees is greater than 180 degrees; in this case, 360 − 270 = 90 degrees is taken as the control angle and the reverse of "rotate left", namely "rotate right", as the control direction, so the finally generated first rotation information is "rotate 90 degrees to the right".
By determining the first rotation information in the manner shown in this embodiment of the application, the display 275 can be rotated to the angle the user wants in the shortest time, shortening the user's waiting time and correspondingly improving the user experience. In the above embodiment, if the control instruction were generated directly from the user's voice as "rotate 270 degrees to the left", then when the controller 250 sends a control command carrying "rotate 270 degrees to the left" to the rotating assembly 276, the rotating assembly 276 rotates the display 275 by 270 degrees under the control of that command. Using the generation method of the first rotation information shown in this application, the first rotation information obtained is "rotate 90 degrees to the right", so when the controller 250 sends a control command carrying "rotate 90 degrees to the right" to the rotating assembly 276, the rotating assembly 276 rotates the display 275 by 90 degrees. The final positions reached by the two control schemes are the same. Clearly, this way of generating the first rotation information ensures that the display 275 is rotated to the desired angle in the shortest time, shortening the user's waiting time and improving the user experience.
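The normalization rule and the two voice examples above can be expressed as a short sketch; the enum, record, and method names below are assumptions made for illustration and do not come from the patent.

    /** Illustrative sketch of the first-rotation-information rule described above. */
    public final class RotationCommandNormalizer {

        enum Direction {
            LEFT, RIGHT;
            Direction reversed() { return this == LEFT ? RIGHT : LEFT; }
        }

        /** Holds the control direction and control angle making up the first rotation information. */
        record FirstRotationInfo(Direction direction, int angleDeg) { }

        /**
         * If the requested angle is at most 180 degrees, keep it as-is; otherwise rotate
         * 360 - angle degrees in the opposite direction, which reaches the same final
         * position over the shorter path.
         */
        static FirstRotationInfo normalize(Direction userDirection, int userAngleDeg) {
            if (userAngleDeg <= 180) {
                return new FirstRotationInfo(userDirection, userAngleDeg);
            }
            return new FirstRotationInfo(userDirection.reversed(), 360 - userAngleDeg);
        }

        public static void main(String[] args) {
            System.out.println(normalize(Direction.LEFT, 30));   // FirstRotationInfo[direction=LEFT, angleDeg=30]
            System.out.println(normalize(Direction.LEFT, 270));  // FirstRotationInfo[direction=RIGHT, angleDeg=90]
        }
    }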
It should be noted that the above process of determining the first rotation information may be performed by the controller 250, or by a rotation processor 257 configured in the controller 250; in some possible embodiments, it may also be performed by a rotation processor 257 configured independently in the display device.
In a possible embodiment, the display device further includes a remote controller through which the user sends control instructions to the controller 250. The remote controller is configured to send a corresponding operation instruction based on a touch operation of the user; the controller 250 is further configured to identify the first rotation information corresponding to the operation instruction.
Specifically, a remote controller is typically configured with a plurality of keys, such as sound adjustment keys, channel keys, signal source keys, and so on. In practical application, a correspondence between each key and a piece of rotation information can be preset. When the user touches the corresponding key, the remote controller sends an operation instruction carrying the rotation information corresponding to that key to the controller 250.
Fig. 11 is a schematic diagram of a remote controller according to an embodiment, in which key 1 corresponds to "increase sound", key 2 corresponds to "adjust up", key 3 corresponds to "adjust down", and key 4 corresponds to "decrease sound". In the application scenario of controlling the rotation of the display 275, key 1, key 2, key 3 and key 4 are all used as rotation keys of the remote controller. Specifically, in the figure, key 1 corresponds to "control the display 275 to rotate 360 degrees", key 2 corresponds to "control the display 275 to rotate 90 degrees to the left", key 3 corresponds to "control the display 275 to rotate 180 degrees to the left", and key 4 corresponds to "control the display 275 to rotate 90 degrees to the right". It should be noted that this embodiment only describes one set of correspondences between rotation information and keys by way of example; in practical application, the correspondence between keys and rotation information may be set according to the habits of the user.
In the process of the user interacting with the controller 250 using the remote controller shown in Fig. 11, the user touches key 2. The remote controller sends an operation instruction carrying "rotate 90 degrees to the left" to the controller 250 in response to the user's operation. The controller 250 recognizes that the first rotation information corresponding to the operation instruction is "rotate 90 degrees to the left", and sends a rotation instruction carrying "rotate 90 degrees to the left" to the rotating assembly 276. The rotating assembly 276 drives the display 275 to rotate 90 degrees to the left under the control of that instruction.
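The preset correspondence between keys and rotation information can be held in a simple lookup table, as in the sketch below. The key identifiers and the map structure are assumptions made for illustration; the embodiment only states that the correspondence is preset and may be changed according to user habits.

    import java.util.Map;

    // Hypothetical key-to-rotation-information table for the layout of Fig. 11.
    public class RemoteKeyMapping {
        static final Map<String, String> KEY_TO_ROTATION = Map.of(
                "KEY_1", "rotate 360 degrees",
                "KEY_2", "rotate 90 degrees to the left",
                "KEY_3", "rotate 180 degrees to the left",
                "KEY_4", "rotate 90 degrees to the right");

        public static void main(String[] args) {
            // The controller looks up the first rotation information for the touched key.
            System.out.println(KEY_TO_ROTATION.get("KEY_2")); // rotate 90 degrees to the left
        }
    }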
S204, the controller 250 obtains the component rotation angle and the component rotation rate of the rotating assembly.
In the present application, the controller 250 detects the component rotation information including the component rotation angle and the component rotation direction in real time.
In a feasible embodiment, the collected component rotation information is reported to the controller 250 at a preset time interval.
It should be noted that the process of collecting the component rotation angle may be performed by the controller 250, or by a monitoring component 277 configured in the controller 250; in some possible embodiments, it may also be performed by a monitoring component 277 configured independently in the display device.
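A possible shape of this periodic reporting is sketched below; the scheduler, the readAngle supplier and the report callback are placeholders chosen for illustration, since the embodiment does not prescribe a concrete interface between the monitoring component 277 and the controller 250.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.function.DoubleConsumer;
    import java.util.function.DoubleSupplier;

    // Minimal sketch of reporting the component rotation angle at a preset interval.
    public class RotationMonitor {
        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        // readAngle: reads the current component rotation angle (placeholder)
        // report:    delivers the value to the controller (placeholder)
        public void start(DoubleSupplier readAngle, DoubleConsumer report, long periodMs) {
            scheduler.scheduleAtFixedRate(
                    () -> report.accept(readAngle.getAsDouble()),
                    0, periodMs, TimeUnit.MILLISECONDS);
        }

        public void stop() {
            scheduler.shutdownNow();
        }
    }

With the 0.2 s interval of the example below, start(...) would be called with periodMs set to 200.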
The data acquisition process of the controller 250 is described in detail below in connection with specific examples.
In a feasible embodiment, the controller 250 collects the increment of the component rotation angle every 0.2 s; the data collected by the controller 250 may refer to Table 1.
TABLE 1
In this embodiment, the data collected by the controller 250 is an angle increment value, and each angle increment value corresponds to a rotation angle of the component.
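Because the collected data are increments, the cumulative component rotation angle is simply their running sum, as in the small sketch below (the class and method names are assumptions made for illustration).

    // Converts collected angle increments into cumulative component rotation angles.
    public class AngleAccumulator {
        static double[] toCumulativeAngles(double[] increments) {
            double[] angles = new double[increments.length];
            double sum = 0;
            for (int i = 0; i < increments.length; i++) {
                sum += increments[i];
                angles[i] = sum;
            }
            return angles;
        }
    }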
In actual use, the display 275 may also be rotated by being moved manually. It is therefore necessary to determine whether a rotation of the rotating assembly 276 is caused by a control instruction sent by the controller 250 or by a mis-operation, and to adopt different processing for the two cases: if the rotation of the rotating assembly 276 is caused by a control instruction sent by the controller 250, the rotation animation is configured based on the component rotation information; if the rotation of the rotating assembly 276 is caused by a mis-operation, the rotation animation is not configured.
Specifically, the embodiment of the present application determines whether the rotation of the rotating assembly 276 is caused by a mis-operation by calculating the rate of change of the component rotation angle (which may also be referred to as the component rotation rate). To this end, the controller 250 is further configured to calculate the rate of change of the component rotation angle.
For example, the angle increment collected by the controller 250 at 0.2 s is 2 degrees, so the controller 250 calculates the rate of change of the component rotation angle to be 2/0.2 = 10 degrees/second. The angle increment collected by the controller 250 at 0.4 s is 0 degrees, so the controller 250 calculates the rate of change of the component rotation angle to be 0/0.2 = 0 degrees/second.
In practice, in order to reduce the amount of computation of the controller 250, the controller 250 may calculate the component rotation rate over a period of time.
In one feasible embodiment, the angle increment collected by the controller 250 at 0.2 s is 2 degrees, and the increments collected at 0.4 s, 0.6 s, 0.8 s and 1 s are all 0 degrees. The controller 250 then calculates the rate of change of the component rotation angle to be (2 + 0 + 0 + 0 + 0)/1 = 2 degrees/second, and the preset rate of change in this embodiment is 5 degrees/second. On this basis, it can be determined that within 1 s the rate of change of the component rotation angle is less than the preset rate of change, so the rotation of the rotating assembly 276 is determined to be caused by a mis-operation.
For another example, in the embodiment shown in Table 1, the angle increment collected by the controller 250 at 0.2 s is 2 degrees, and the controller 250 calculates the rate of change of the component rotation angle to be 2/0.2 = 10 degrees/second. In practice, in order to reduce the amount of computation of the controller 250, the component rotation rate may be taken as the average rate of the component rotation angles acquired N times before the target acquisition time. For example, in the embodiment shown in Table 1, the angle increments collected by the controller 250 at 0.2 s, 0.4 s, 0.6 s, 0.8 s and 1 s are all 2 degrees. If 1 s is the target acquisition time, the controller 250 calculates the average rate of the component rotation angles of the 5 acquisitions before 1 s to be (2 + 2 + 2 + 2 + 2)/1 = 10 degrees/second. The preset rate of change in this embodiment is 5 degrees/second. On this basis, it can be determined that within 1 s the rate of change of the component rotation angle is greater than the preset rate of change, so the rotation of the rotating assembly 276 is determined to be caused by a control instruction sent by the controller 250.
In the embodiment of the present application, the controller 250 may also determine when to stop configuring the rotation animation through the rate of change of the component rotation angle. Specifically, taking the example shown in Table 1, the rate of change of the component rotation angle is 10 degrees/second from 0 s to 9 s, during which time the controller 250 continuously configures the rotation animation. At 9.2 s, the controller 250 calculates that the rate of change of the component rotation angle is 0, which is smaller than the preset rate of change, and the controller 250 terminates configuring the rotation animation.
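The mis-operation check and the stop condition both reduce to comparing a windowed average rate with the preset rate of change. The sketch below illustrates this under the assumptions of the examples above (0.2 s sampling period, 5 degrees/second preset rate); the class and method names are illustrative only.

    // Sketch of the rate-of-change check used to distinguish commanded rotation
    // from mis-operation and to decide when to stop configuring the animation.
    public class RotationRateChecker {
        static final double PRESET_RATE = 5.0;   // preset rate of change, degrees/second
        static final double SAMPLE_PERIOD = 0.2; // seconds between acquisitions

        // increments: the last N collected angle increments, in degrees
        static double averageRate(double[] increments) {
            double total = 0;
            for (double d : increments) {
                total += d;
            }
            return total / (increments.length * SAMPLE_PERIOD);
        }

        // true  -> treat the rotation as driven by a control instruction and
        //          configure the rotation animation
        // false -> treat it as a mis-operation (or as finished) and do not
        //          configure / terminate the rotation animation
        static boolean isCommandedRotation(double[] increments) {
            return averageRate(increments) >= PRESET_RATE;
        }
    }

With the increments {2, 0, 0, 0, 0} the average rate is 2 degrees/second and the check fails; with {2, 2, 2, 2, 2} it is 10 degrees/second and the check passes, matching the two examples above.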
In a feasible embodiment, each component rotation angle corresponds to one acquisition time, and the controller is further configured to calculate the predicted angle according to the following formula:
pA=lA+(CT-lT)*v;
wherein pA is the predicted angle, lA is the target rotation angle, the target rotation angle is the component rotation angle corresponding to the acquisition time closest to the drawing time, CT is the drawing time, the drawing time is the time of drawing the rotation animation, lT is the target acquisition time, the target acquisition time is the acquisition time of the target rotation angle, and v is the component rotation rate.
In a feasible embodiment, each component rotation angle corresponds to one acquisition time, and the controller is further configured to calculate the predicted angle according to the following formula:
pA=lA+(CT+PT-lT)*v;
Wherein pA (predict Angle) is a predicted angle, lA (last Angle) is a target rotation angle, the target rotation angle is a component rotation angle corresponding to one acquisition Time closest to the drawing Time, CT (cur Time) is the drawing Time, the drawing Time is the Time of drawing the rotation animation, lT (last Time) is a target acquisition Time, the target acquisition Time is an acquisition Time of the target rotation angle, v is a component rotation rate, and PT is a Time required for drawing one frame of rotation animation.
The method for calculating the predicted angle will be described in detail with reference to specific examples.
In a feasible embodiment, the controller collects one component rotation angle every 20 ms, each component rotation angle corresponds to one acquisition time, and the controller stores the collected data in a preset list. Table 2 shows such a preset list according to an embodiment.
TABLE 2
The controller draws one frame of the rotation animation every 33 ms, and prepares to draw the tenth frame of the rotation animation at the drawing time (0.297 s). At this time, the controller determines 0.28 s as the latest acquisition time, so the component rotation angle (2.8 degrees) corresponding to 0.28 s is the target rotation angle.
The component rotation rate v is the average rate of the component rotation angles acquired N times before the target acquisition time. With N set to 5 and applied to the above embodiment, v = (2.8 − 1.8)/(0.28 − 0.18) = 10 degrees/second.
pA = lA + (CT − lT) * v = 2.8 + (0.297 − 0.28) * 10 = 2.97 degrees.
In another possible embodiment, pA = lA + (CT + PT − lT) * v, where PT is the time required to draw one frame of the rotation animation. Specifically, in the above embodiment, pA = 2.8 + (0.297 + 0.033 − 0.28) * 10 = 3.3 degrees.
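Both prediction formulas, together with the N-sample average rate used as v, can be written down directly; the sketch below only restates the arithmetic of the examples, with variable names mirroring the symbols pA, lA, lT, CT, PT and v (the class itself is an assumption).

    // Sketch of the predicted-angle formulas pA = lA + (CT - lT) * v and
    // pA = lA + (CT + PT - lT) * v. Times are in seconds, angles in degrees.
    public class AnglePredictor {

        static double predict(double lA, double lT, double cT, double v) {
            return lA + (cT - lT) * v;
        }

        static double predictWithFrameTime(double lA, double lT, double cT,
                                           double pT, double v) {
            return lA + (cT + pT - lT) * v;
        }

        // Average rate over the samples acquired before the target acquisition time,
        // e.g. (2.8 - 1.8) / (0.28 - 0.18) = 10 degrees/second.
        static double averageRate(double firstAngle, double firstTime,
                                  double lastAngle, double lastTime) {
            return (lastAngle - firstAngle) / (lastTime - firstTime);
        }

        public static void main(String[] args) {
            double v = averageRate(1.8, 0.18, 2.8, 0.28);                         // ~10 degrees/second
            System.out.println(predict(2.8, 0.28, 0.297, v));                     // ~2.97 degrees
            System.out.println(predictWithFrameTime(2.8, 0.28, 0.297, 0.033, v)); // ~3.3 degrees
        }
    }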
It should be noted that the above calculation process may be performed by the controller 250, or by an animation processor 258 configured in the controller 250; in some possible embodiments, it may also be performed by an animation processor 258 configured independently in the display device.
S206, the controller 250 configures a rotation animation based on the predicted angle.
In one possible embodiment, the controller 250 drives the rotating assembly 276 so that the display 275 starts rotating, and the detected component rotation information is transmitted to the controller 250 in real time. At 0.297 s, the controller 250 calculates the predicted angle to be "2.97 degrees"; the assembly is rotating to the right.
The controller initializes OpenGL, configures the OpenGL environment, loads the animation resources, and sets the animation parameters. The configuration of the animation content is then started, with each animation model rotated 2.97 degrees to the left. This angle varies during the subsequent configuration, following the variation of the component rotation information.
The animation always rotates around the center of the display 275. For example, if the size of the display 275 is W × H, the corresponding rotation animation rotates around the point (W/2, H/2). The display interface during the rotation process can be seen in Fig. 12A and Fig. 12B. Fig. 12A is a change diagram of the display interface when a display device according to an embodiment of the present application rotates from a portrait mode to a landscape mode; Fig. 12B is a change diagram of the display interface when a display device according to an embodiment of the present application rotates from a landscape mode to a portrait mode.
In another possible embodiment, the controller 250 drives the rotating assembly 276 so that the display 275 starts rotating, and the monitored component rotation information is transmitted to the controller 250 in real time. At 0.297 s, the controller 250 calculates the predicted angle to be "3.3 degrees"; the assembly is rotating to the right.
The controller initializes OpenGL, configures the OpenGL environment, loads the animation resources, and sets the animation parameters. The configuration of the animation content is then started, with each animation model rotated by −3.3 degrees to the right, that is, 3.3 degrees to the left. This angle varies during the subsequent configuration, following the variation of the component rotation information.
The animation always rotates around the center of the display 275. For example, if the size of the display 275 is W × H, the corresponding rotation animation rotates around the point (W/2, H/2). The display interface during the rotation process can be seen in Fig. 12A and Fig. 12B. Fig. 12A is a change diagram of the display interface when a display device according to an embodiment of the present application rotates from a portrait mode to a landscape mode; Fig. 12B is a change diagram of the display interface when a display device according to an embodiment of the present application rotates from a landscape mode to a portrait mode.
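One way to realize the counter-rotation about the display centre is to build the model matrix with a translate–rotate–translate sequence, as in the hedged sketch below. It uses android.opengl.Matrix as one possible choice consistent with the OpenGL environment mentioned above; the method name and the sign convention (positive angle = assembly rotating right) are assumptions, since the embodiment only requires the animation to rotate opposite to the component rotation around (W/2, H/2).

    import android.opengl.Matrix;

    // Sketch: rotate the animation model about the display centre (W/2, H/2)
    // by the predicted angle, in the direction opposite to the component rotation.
    public class RotationAnimationHelper {

        // width/height: display size W x H
        // predictedAngle: predicted component rotation angle in degrees,
        //                 taken as positive when the assembly rotates to the right
        static float[] buildModelMatrix(float width, float height, float predictedAngle) {
            float[] model = new float[16];
            Matrix.setIdentityM(model, 0);
            // Pivot on the display centre so the animation always rotates around (W/2, H/2).
            Matrix.translateM(model, 0, width / 2f, height / 2f, 0f);
            // Counter-rotate: the minus sign gives the direction opposite to the component.
            Matrix.rotateM(model, 0, -predictedAngle, 0f, 0f, 1f);
            Matrix.translateM(model, 0, -width / 2f, -height / 2f, 0f);
            return model;
        }
    }

For the first example above this would be called with predictedAngle = 2.97f, producing the 2.97-degree counter-rotation of each animation model; the matrix is then updated as new component rotation information arrives.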
S207, the controller 250 outputs the rotation animation to the display;
S208, the display displays the rotation animation.
In other words, the controller 250 is further configured to configure the rotation animation based on the component rotation information, the component rotation information being the real-time rotation information of the rotating assembly 276, and to output the rotation animation to the display 275.
It should be noted that the process of configuring the rotation animation may be performed by the controller 250, or by an animation processor 258 configured in the controller 250; in some possible embodiments, it may also be performed by an animation processor 258 configured independently in the display device.
It should be understood that the same reference numerals are used to refer to the same or similar parts throughout the various embodiments of the present disclosure, and the above-described embodiments do not limit the scope of the present disclosure. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (7)

1. A display device, characterized by comprising:
a display;
A rotation assembly coupled to the display and configured to drive the display to rotate;
a controller configured to:
responding to receiving a control instruction input by a user to indicate the display to rotate, and controlling the rotating assembly to drive the display to rotate;
acquiring a component rotation angle and a component rotation rate of the rotation component;
generating a predicted angle based on the assembly rotation angle and the assembly rotation rate, wherein each assembly rotation angle corresponds to one acquisition time, and a calculation formula of the predicted angle is as follows: pA=lA+(CT-lT)*v,
wherein pA is a predicted angle, CT is a drawing time, the drawing time is a time for drawing a rotary animation, lA is a target rotation angle, the target rotation angle is a component rotation angle corresponding to one acquisition time closest to the drawing time, lT is a target acquisition time, the target acquisition time is an acquisition time of the target rotation angle, and v is a component rotation rate;
the rotational animation is rotated by the predicted angle in a direction opposite to the direction of rotation of the component.
2. The display device of claim 1, wherein each component rotation angle corresponds to an acquisition time and the controller is further configured to calculate the predicted angle according to the following formula:
pA=lA+(CT+PT-lT)*v;
wherein pA is a predicted angle, CT is a drawing time, the drawing time is the time for drawing a rotary animation, lA is a target rotation angle, the target rotation angle is a component rotation angle corresponding to the acquisition time nearest to the drawing time, lT is a target acquisition time, the target acquisition time is the acquisition time of the target rotation angle, v is a component rotation rate, and PT is the time required for drawing one frame of the rotary animation.
3. A display device according to claim 1 or 2, wherein the component rotation rate is an average rate of component rotation angles of N acquisitions before a target acquisition time.
4. A display device according to claim 3, wherein if the component rotation rate is less than a preset rate of change, the controller is further configured to terminate configuring the rotation animation.
5. The display device of claim 1, wherein the controller is further configured to:
determining first rotation information in response to receiving a control instruction from a user indicating rotation of the display, the first rotation information including a control direction and a control angle;
And controlling the rotating assembly to rotate in a control direction by a control angle.
6. The display device of claim 5, wherein the control instruction is a user voice, the controller being further configured to recognize a user direction and a user angle θ_ui in the user voice;
if the user angle is less than or equal to 180 degrees, determining the user direction as the control direction and θ_ui as the control angle;
if the user angle is greater than 180 degrees, determining the reverse of the user direction as the control direction and 360−θ_ui as the control angle.
7. An animation configuration method, comprising:
responding to receiving a control instruction input by a user to indicate the display to rotate, and controlling the rotating assembly to drive the display to rotate;
acquiring a component rotation angle and a component rotation rate of the rotation component;
generating a predicted angle based on the assembly rotation angle and the assembly rotation rate, wherein each assembly rotation angle corresponds to one acquisition time, and a calculation formula of the predicted angle is as follows: pA=lA+(CT-lT)*v,
wherein pA is a predicted angle, CT is a drawing time, the drawing time is a time for drawing a rotary animation, lA is a target rotation angle, the target rotation angle is a component rotation angle corresponding to one acquisition time closest to the drawing time, lT is a target acquisition time, the target acquisition time is an acquisition time of the target rotation angle, and v is a component rotation rate;
The rotational animation is rotated by the predicted angle in a direction opposite to the direction of rotation of the component.
CN202010202433.0A 2020-03-20 2020-03-20 Configuration method of rotary animation and display device Active CN113497965B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010202433.0A CN113497965B (en) 2020-03-20 2020-03-20 Configuration method of rotary animation and display device

Publications (2)

Publication Number Publication Date
CN113497965A CN113497965A (en) 2021-10-12
CN113497965B true CN113497965B (en) 2023-09-15

Family

ID=77993674

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010202433.0A Active CN113497965B (en) 2020-03-20 2020-03-20 Configuration method of rotary animation and display device

Country Status (1)

Country Link
CN (1) CN113497965B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101246754A (en) * 2007-02-13 2008-08-20 鸿富锦精密工业(深圳)有限公司 Display device
CN102479065A (en) * 2010-11-26 2012-05-30 Tcl集团股份有限公司 Rotary display and display method thereof
CN103475823A (en) * 2013-08-29 2013-12-25 Tcl光电科技(惠州)有限公司 Display device and image processing method for same
CN110740364A (en) * 2019-11-14 2020-01-31 四川长虹电器股份有限公司 intelligent rotary television device, system and working method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102413657B1 (en) * 2015-11-05 2022-06-28 삼성전자주식회사 Method for sensing a rotation of rotation member and an electronic device thereof

Also Published As

Publication number Publication date
CN113497965A (en) 2021-10-12

Similar Documents

Publication Publication Date Title
CN113395558B (en) Display equipment and display picture rotation adaptation method
CN112565839B (en) Display method and display device of screen projection image
CN111970550B (en) Display device
CN111787388B (en) Display device
CN114827707B (en) Display equipment and startup animation display method
CN112165644A (en) Display device and video playing method in vertical screen state
CN112565861A (en) Display device
CN113395554B (en) Display device
CN113395600B (en) Interface switching method of display equipment and display equipment
CN113556593B (en) Display device and screen projection method
CN113556591A (en) Display equipment and projection screen image rotation display method
CN113542824B (en) Display equipment and display method of application interface
CN114501087B (en) Display equipment
CN113573118B (en) Video picture rotating method and display equipment
CN113630639B (en) Display device
CN113497958B (en) Display equipment and picture display method
CN113497965B (en) Configuration method of rotary animation and display device
CN113497962B (en) Configuration method of rotary animation and display device
CN113556590A (en) Method for detecting effective resolution of screen-projected video stream and display equipment
CN113542823B (en) Display equipment and application page display method
CN111913608B (en) Touch screen rotation control interaction method and display device
CN113473192B (en) Display device and starting signal source display adaptation method
US11501411B2 (en) Animation configuration method and display device
CN115396704B (en) Display equipment and power-on signal source display adaptation method
CN115697771A (en) Display device and display method of application interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant