CN114296542B - Display apparatus and control method thereof - Google Patents


Info

Publication number
CN114296542B
CN114296542B (application CN202110499421.3A)
Authority
CN
China
Prior art keywords
display
gesture
user
lifting
finger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110499421.3A
Other languages
Chinese (zh)
Other versions
CN114296542A (en)
Inventor
唐高明
程晋
华峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202110499421.3A priority Critical patent/CN114296542B/en
Priority to PCT/CN2021/096003 priority patent/WO2022227159A1/en
Publication of CN114296542A publication Critical patent/CN114296542A/en
Application granted granted Critical
Publication of CN114296542B publication Critical patent/CN114296542B/en

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a display device and a control method thereof. The display of the display device can rotate and/or lift, driven by a rotation driving device and/or a lifting driving device. The controller of the display device is configured to: receive an input user gesture; if the user gesture is a preset rotation gesture, control the rotation driving device to drive the display to rotate according to the user gesture; and if the user gesture is a preset lifting gesture, control the lifting driving device to drive the display to ascend or descend according to the user gesture. With the display device and the control method provided by the embodiments of the application, a user can rotate or lift the display device simply by inputting the corresponding preset gesture, which makes operating the rotation and lifting of the display more convenient, saves user operations, and improves the user experience.

Description

Display apparatus and control method thereof
Technical Field
The present application relates to the field of display devices, and in particular, to a display device and a control method thereof.
Background
A smart TV is a display device that can present audio, video, pictures, and other content to the user. In addition to live TV program content received through data broadcasting, it can provide various applications and services, such as online video programs and online games.
Compared with a traditional smart TV, a rotatable and liftable smart TV additionally supports rotation and lifting: driven by the corresponding driving devices, the TV can rotate, rise, and descend to assume different postures. For example, it can rotate from the landscape posture to the portrait posture, or from the portrait posture back to the landscape posture. As another example, it can rise 10 cm above, or descend 5 cm below, its current height.
Disclosure of Invention
The application provides a display device and a control method thereof, aiming to solve the technical problem of how to control the rotation and lifting of a display.
In a first aspect, the present application provides a display apparatus comprising:
A display configured to be rotatable and/or liftable;
A rotation driving device for driving the display to rotate and/or a lifting driving device for driving the display to lift;
a controller configured to:
receiving an input user gesture;
if the user gesture is a preset rotation gesture, controlling the rotation driving device to drive the display to rotate according to the user gesture;
and if the user gesture is a preset lifting gesture, controlling the lifting driving device to drive the display to ascend or descend according to the user gesture.
In a second aspect, the present application also provides a display device control method, applied to a display device that can rotate and/or lift; the method comprises the following steps:
receiving an input user gesture;
if the user gesture is a preset rotation gesture, controlling the display device to rotate according to the user gesture;
and if the user gesture is a preset lifting gesture, controlling the display device to ascend or descend according to the user gesture.
According to the technical solutions above, with the display device and the control method provided by the embodiments of the application, a user can rotate or lift the display device simply by inputting the corresponding preset gesture, which makes operating the rotation and lifting of the display more convenient, saves user operations, and improves the user experience.
Drawings
To illustrate the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below. It will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a use scenario of a display device of the present application shown in some embodiments;
fig. 2 is a block diagram of the hardware configuration of the control apparatus 100 shown in some embodiments of the present application;
fig. 3 is a hardware configuration block diagram of a display device 200 of the present application shown in some embodiments;
FIG. 4 is a diagram of the software configuration in a display device 200 according to the present application shown in some embodiments;
FIG. 5 is a schematic rear view of a display device 200 of the present application shown in some embodiments;
FIG. 6 is a use scenario of a display device of the present application shown in some embodiments;
FIG. 7 is a schematic illustration of a predetermined rotation gesture and display device posture of the present application shown in some embodiments;
FIG. 8 is a schematic diagram of a contact trajectory corresponding to a predetermined rotation gesture shown in some embodiments of the present application;
FIG. 9 is a schematic diagram of a contact trajectory corresponding to a predetermined rotation gesture shown in some embodiments of the present application;
FIG. 10 is a schematic illustration of another predetermined rotational gesture and display device pose of the present application shown in some embodiments;
FIG. 11 is a schematic illustration of a predetermined lift gesture and display device gesture of the present application shown in some embodiments;
FIG. 12 is a schematic view of a contact trajectory corresponding to a predetermined lift gesture according to some embodiments of the present application;
FIG. 13 is a schematic illustration of a predetermined lift gesture and display device gesture of the present application shown in some embodiments;
FIG. 14 is a schematic illustration of another predetermined lift gesture and display device pose of the present application shown in some embodiments;
Fig. 15 is a flow chart of a display device control method of the present application in some embodiments.
Detailed Description
For the purposes of making the objects and embodiments of the present application clearer, exemplary embodiments of the present application are described in detail below with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief description of the terminology in the present application is for the purpose of facilitating understanding of the embodiments described below only and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms first, second, third, and the like in the description, the claims, and the above figures are used to distinguish between similar objects or entities, and do not necessarily describe a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the function associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display device 200 is also in data communication with a server 400, and a user can operate the display device 200 through the smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes at least one of infrared protocol communication or bluetooth protocol communication, and other short-range communication modes, and the display device 200 is controlled by a wireless or wired mode. The user may control the display apparatus 200 by inputting a user instruction through at least one of a key on a remote controller, a voice input, a control panel input, and the like.
In some embodiments, the smart device 300 may include any one of a mobile terminal, tablet, computer, notebook, AR/VR device, etc.
In some embodiments, the display device 200 may also be controlled in ways other than through the control apparatus 100 and the smart device 300. For example, a module configured inside the display device 200 for obtaining voice commands may directly receive the user's voice command control; the user's voice commands may be received through a voice control apparatus set up outside the display device 200; the user's air gestures may be received through a camera connected to the display device 200 that collects user gestures; or the user's touch gestures may be received through a touch component on the display device.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be allowed to communicate via a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
In some embodiments, software steps performed by one step execution body may migrate on demand to be performed on another step execution body in data communication therewith. For example, software steps executed by the server may migrate to be executed on demand on a display device in data communication therewith, and vice versa.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction of a user, and convert the operation instruction into an instruction recognizable and responsive to the display device 200, and may perform an interaction between the user and the display device 200.
In some embodiments, the communication interface 130 is configured to communicate with the outside, including at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, keys, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display device 200 in accordance with an exemplary embodiment.
In some embodiments, display apparatus 200 includes at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, memory, a power supply, a user interface.
In some embodiments, the controller comprises a central processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first through nth input/output interfaces.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display. It receives image signals output from the controller and displays video content, image content, menu manipulation interfaces, user manipulation UI interfaces, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the modem 210 receives broadcast television signals via wired or wireless reception and demodulates audio-video signals, such as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals from the external environment or from interaction with the outside. For example, the detector 230 may include a light receiver, a sensor for capturing the intensity of ambient light; or an image collector, such as a camera, which may be used to collect external environment scenes, user attributes, or user interaction gestures; or a sound collector, such as a microphone, for receiving external sounds; or a touch data detector, such as a touch-sensitive component, for collecting user touch data.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, or the like. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command to select a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), random access memory (RAM), read-only memory (ROM), first through nth input/output interfaces, a communication bus, and the like.
The CPU is used to execute the operating system and application program instructions stored in the memory, and to execute various applications, data, and content according to the interaction instructions received from the outside, so as to finally display and play various audio and video content. The CPU may include a plurality of processors, such as one main processor and one or more sub-processors.
In some embodiments, a graphics processor is used to generate various graphical objects, such as: at least one of icons, operation menus, and user input instruction display graphics. The graphic processor comprises an arithmetic unit, which is used for receiving various interactive instructions input by a user to operate and displaying various objects according to display attributes; the device also comprises a renderer for rendering various objects obtained based on the arithmetic unit, wherein the rendered objects are used for being displayed on a display.
In some embodiments, the video processor is configured to receive an external video signal and perform at least one of decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image composition, and the like, according to the standard codec protocol of the input signal, to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the audio processor is configured to receive an external audio signal and, according to the standard codec protocol of the input signal, perform decompression and decoding, as well as at least one of noise reduction, digital-to-analog conversion, and amplification, to obtain a sound signal that can be played through the speaker.
In some embodiments, a user may input a user command through a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the GUI. Alternatively, the user may input a command with a specific sound or gesture; the user input interface recognizes the sound or gesture through a sensor and receives the command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of a user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a graphically displayed user interface that is related to computer operations. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include at least one of a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, the user interface 280 is an interface (e.g., physical keys on a display device body, or the like) that may be used to receive control inputs.
In some embodiments, the display device 200 may be a touch display device, in which the display is a touch display formed by a touch component and a screen. The touch display device supports touch interaction: a user can operate the host simply by touching the display lightly with a finger, which eliminates the keyboard, mouse, and remote controller and makes human-machine interaction more direct. On the touch display, the user can input different control instructions through touch operations. For example, a user may input touch instructions such as clicking, sliding, long pressing, and double clicking, and different touch instructions may represent different control functions.
To achieve the different touch actions described above, the touch sensitive assembly may generate different electrical signals when the user inputs the different touch actions, and transmit the generated electrical signals to the controller 250. The controller 250 may perform feature extraction on the received electrical signal to determine a control function to be performed by the user based on the extracted features. For example, when a user enters a click touch action at any program icon location in the application program interface, the touch component will sense the touch action and thereby generate an electrical signal. After receiving the electrical signal, the controller 250 may determine the duration of the level corresponding to the touch action in the electrical signal, and recognize that the user inputs the click command when the duration is less than the preset time threshold. The controller 250 then extracts the location features generated by the electrical signals to determine the touch location. When the touch position is within the application icon display range, it is determined that a click touch instruction is input by the user at the application icon position. Accordingly, the click touch instruction is used to perform a function of running a corresponding application program in the current scenario, and thus the controller 250 may start running the corresponding application program.
For another example, when the user inputs a sliding motion in the media presentation page, the touch assembly also sends the sensed electrical signal to the controller 250. The controller 250 first determines the duration of the signal corresponding to the touch action in the electrical signal. When the duration time is determined to be longer than the preset time threshold value, the position change condition generated by the signals is judged, and obviously, the generation position of the signals changes for the interactive touch action, so that the user is determined to input a sliding touch instruction. The controller 250 then determines the sliding direction of the sliding touch command according to the change condition of the signal generating position, and controls the page turning of the display screen in the media display page so as to display more media options. Further, the controller 250 may further extract characteristics such as a sliding speed and a sliding distance of the sliding touch instruction, and perform a picture control of turning pages according to the extracted characteristics, so as to achieve a following effect.
Similarly, for the touch instructions such as double-click and long-press, the controller 250 may extract different features, determine the type of the touch instruction through feature judgment, and execute corresponding control functions according to a preset interaction rule. In some embodiments, the touch component 276 also supports multi-touch so that a user can enter touch actions on the touch screen via multiple fingers, e.g., multi-finger clicks, multi-finger long presses, multi-finger swipes, etc.
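The duration-and-displacement based classification described above can be sketched as follows. The thresholds, names, and return values here are illustrative assumptions for a single-finger trace, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    t_ms: float  # timestamp in milliseconds
    x: float
    y: float

CLICK_MAX_MS = 200.0  # hypothetical preset time threshold for a click
MOVE_MIN_PX = 10.0    # hypothetical minimum displacement to count as a slide

def classify_touch(samples):
    """Classify a single-finger touch trace as 'click', 'long_press', or 'slide'.

    A slide additionally reports its dominant direction and average speed,
    matching the feature extraction (duration, position change, sliding
    speed/distance) described in the text.
    """
    duration = samples[-1].t_ms - samples[0].t_ms
    dx = samples[-1].x - samples[0].x
    dy = samples[-1].y - samples[0].y
    moved = (dx * dx + dy * dy) ** 0.5
    if moved >= MOVE_MIN_PX:
        direction = "horizontal" if abs(dx) >= abs(dy) else "vertical"
        speed = moved / duration if duration > 0 else 0.0  # px per ms
        return ("slide", direction, speed)
    if duration < CLICK_MAX_MS:
        return ("click",)
    return ("long_press",)
```

A real touch stack would of course classify from the raw electrical signal and handle multi-touch; this only illustrates the duration/displacement decision rule.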
The touch actions can cooperate with specific application programs to realize specific functions. For example, after the user opens a "demonstration whiteboard" application, the display 260 may present a drawing area; the user can draw a specific touch action track in the drawing area through sliding touch commands, and the controller 250 determines the touch action pattern from the touch actions detected by the touch component and controls the display 260 to render it in real time, achieving the demonstration effect. As another example, controlling the displayed picture by rotating the fingers touching the display is a basic function of a touch-screen display device. In the current interaction mode, after several fingers rotate on the screen, the picture immediately snaps to a horizontal or vertical angle according to the rotation direction of the fingers; there is no interaction process, and the user experience is poor.
In some embodiments, a system of display devices may include a Kernel (Kernel), a command parser (shell), a file system, and an application program. The kernel, shell, and file system together form the basic operating system architecture that allows users to manage files, run programs, and use the system. After power-up, the kernel is started, the kernel space is activated, hardware is abstracted, hardware parameters are initialized, virtual memory, a scheduler, signal and inter-process communication (IPC) are operated and maintained. After the kernel is started, shell and user application programs are loaded again. The application program is compiled into machine code after being started to form a process.
Referring to FIG. 4, in some embodiments, the system is divided into four layers, from top to bottom: an application layer (referred to as the "application layer"), an application framework layer (Application Framework, referred to as the "framework layer"), an Android runtime and system library layer (referred to as the "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, or the like; or may be an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. It acts as a processing center that decides how the applications in the application layer act. Through the API, an application can access system resources and obtain system services during execution.
As shown in fig. 4, the application framework layer in the embodiment of the present application includes managers (Managers), a content provider (Content Provider), and the like, where the managers include at least one of the following modules: an activity manager (Activity Manager) used to interact with all activities running in the system; a location manager (Location Manager) used to provide system services or applications with access to the system location service; a package manager (Package Manager) for retrieving various information about the application packages currently installed on the device; a notification manager (Notification Manager) for controlling the display and clearing of notification messages; and a window manager (Window Manager) used to manage icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the individual applications as well as the usual navigation rollback functions, such as controlling the exit, opening, fallback, etc. of the applications. The window manager is used for managing all window programs, such as obtaining the size of the display screen, judging whether a status bar exists or not, locking the screen, intercepting the screen, controlling the change of the display window (for example, reducing the display window to display, dithering display, distorting display, etc.), etc.
In some embodiments, the system runtime layer provides support for the upper framework layer. When the framework layer is in use, the Android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer contains at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, pressure sensor, etc.), and power supply drive, etc.
Based on the display device described above, the rotation and/or lifting functions can be supported by adding a driving assembly and a posture detection assembly. Typically, the driving assembly includes a rotating assembly and/or a lifting assembly. The controller 250 may communicate with the driving assembly to control the rotating assembly to drive the display to rotate when the display needs to rotate, and to control the lifting assembly to drive the display to rise or descend when the display needs to rise or descend.
In a possible implementation, the rotating assembly and/or the lifting assembly is provided with a GPIO interface, and the controller changes the state of the GPIO interface of the rotating assembly and/or the lifting assembly. When the state of the GPIO interface changes, the rotating assembly and/or the lifting assembly drives the display to rotate and/or lift according to the changed state of the GPIO interface.
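As a rough illustration of this handshake, the GPIO interface can be modeled as shared state that the controller writes and the drive assembly polls. All names and value encodings below are hypothetical; real hardware would use actual GPIO pins rather than a dictionary:

```python
# In-memory stand-in for the GPIO pins shared between the controller
# and the drive assembly (0 = idle; nonzero = a pending command).
gpio_state = {"rotate": 0, "lift": 0}

def controller_request_rotation(direction):
    """Controller side: change the rotation GPIO state.

    Hypothetical encoding: 1 = clockwise, 2 = counter-clockwise.
    """
    gpio_state["rotate"] = 1 if direction == "cw" else 2

def drive_assembly_poll():
    """Drive-assembly side: act on any changed GPIO state, then clear it."""
    actions = []
    if gpio_state["rotate"]:
        actions.append("rotate_cw" if gpio_state["rotate"] == 1 else "rotate_ccw")
        gpio_state["rotate"] = 0
    if gpio_state["lift"]:
        actions.append("lift_up" if gpio_state["lift"] == 1 else "lift_down")
        gpio_state["lift"] = 0
    return actions
```

The point of the sketch is only the division of labor: the controller changes the state, and the assembly reacts to the changed state.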
In a possible implementation, the rotating assembly and/or the lifting assembly includes an MCU chip with an integrated Bluetooth module, so that the assembly supports Bluetooth functions such as Bluetooth Low Energy (BLE); the controller 250 may then communicate with the rotating assembly and/or the lifting assembly over the Bluetooth protocol.
In some embodiments, the detection assembly includes a sensor for detecting the rotation state of the display and a sensor for detecting the lifting state of the display. While the display rotates or lifts, the controller monitors its rotation or lifting state in real time from the data detected by the posture detection assembly. For example, while controlling rotation, information such as the rotation angle and angular speed is obtained by monitoring the sensor data; while controlling lifting, information such as the lifting distance and lifting speed is obtained by monitoring the sensor data.
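The angle and height monitoring can be illustrated with a minimal sketch that derives speeds from two timestamped sensor readings. This is a simplification with hypothetical function names; a real implementation would filter a continuous stream of samples:

```python
def angular_speed(samples):
    """Estimate angular speed in deg/s from (t_seconds, angle_deg) readings."""
    (t0, a0), (t1, a1) = samples[0], samples[-1]
    return (a1 - a0) / (t1 - t0) if t1 > t0 else 0.0

def lift_speed(samples):
    """Estimate lifting speed in cm/s from (t_seconds, height_cm) readings."""
    (t0, h0), (t1, h1) = samples[0], samples[-1]
    return (h1 - h0) / (t1 - t0) if t1 > t0 else 0.0
```

For instance, a display that rotates 90 degrees over two seconds reports an angular speed of 45 deg/s, which the controller can feed back to applications as a parameter update.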
In some embodiments, the detection assembly is included in the driving assembly. For example, the sensor for detecting the rotation state of the display is included in the rotating assembly and, together with it, constitutes the rotation driving device; the sensor for detecting the lifting state of the display is included in the lifting assembly and, together with it, constitutes the lifting driving device.
In some embodiments, the rotating assembly is referred to as a rotary drive device and the lifting assembly is referred to as a lifting drive device.
Fig. 5 is a schematic rear view of a display device according to some exemplary embodiments of the present application. As shown in fig. 5, the display device includes a display 260 and a lift driving device 511. The lift driving device 511 and the lift rail 512 are fixed to a bracket. The rotation driving device, not shown in fig. 5, is arranged inside the lifting driving device, i.e. between the lifting driving device and the display.
In some embodiments, a rotational lift control system is deployed in the operating system of the display device, which may be deployed at the application framework layer, or may be deployed cross-layer, for example, with a portion of the functional code deployed at the application framework layer and another portion of the functional code deployed at the runtime layer. The rotational lift control system may communicate with the drive assembly and the detection assembly to uniformly, cooperatively, and safely control the rotation and lift of the display device.
In a specific implementation, the rotation lifting control system provides each application with the same control interface, which comprises only control entries such as clockwise rotation, counterclockwise rotation, stop rotation, short-distance lifting, long-distance lifting and stop lifting, so that lifting operations and rotation operations are prevented from being performed simultaneously and the safety of rotation lifting control is ensured. In addition, the rotation lifting control system performs unified logic processing and judgment on the control instructions issued by each application. For example, before controlling rotation, it determines whether the display is in the process of being raised or lowered, and only controls rotation if it is not. When rotation starts, each application is notified through a broadcast, ensuring that every application receives the same notification information. In the process of controlling rotation, the data detected by the sensor is monitored to acquire and update parameters such as the rotation angle and the angular speed, and this parameter information can be fed back to each application. The system performs a rotation-end detection at a preset interval (e.g., every 200 ms), and when the rotation ends, sends a broadcast to notify the applications.
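The serialization logic described above (refuse a rotation while a lift is in progress, and vice versa, then notify everyone on completion) can be sketched as follows; the class and method names are hypothetical, not from the source:

```python
class RotateLiftControlSystem:
    """Sketch of a unified control entry point that never allows rotation
    and lifting to run at the same time (all names hypothetical)."""

    def __init__(self):
        self.rotating = False
        self.lifting = False

    def request_rotation(self, clockwise: bool) -> bool:
        # Refuse the command while a lift is in progress.
        if self.lifting or self.rotating:
            return False
        self.rotating = True
        # ... forward "rotate clockwise/counterclockwise" to the drive device ...
        return True

    def request_lift(self, upward: bool) -> bool:
        # Symmetric check: no lifting while the display is rotating.
        if self.rotating or self.lifting:
            return False
        self.lifting = True
        # ... forward "raise/lower" to the lift drive device ...
        return True

    def motion_finished(self):
        # Called when the drive reports completion; this is also where a
        # broadcast would notify every application, per the text above.
        self.rotating = False
        self.lifting = False
```

An application whose request is refused can simply retry after receiving the completion broadcast.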
In some embodiments, the controller 250 turns on the Bluetooth module in response to a Bluetooth switch turn-on instruction, and scans for the lift driving device and/or the rotation driving device through the Bluetooth module according to preset device identifiers, where the preset device identifiers may be the device names or the universally unique identifiers (UUIDs) of the lift driving device and the rotation driving device. When a plurality of lift driving devices are scanned according to the preset device identifier, the target lift driving device is determined according to the signal strength of each lift driving device; when only one lift driving device is scanned according to the preset device identifier, that lift driving device is determined as the target lift driving device. Likewise, when a plurality of rotation driving devices are scanned according to the preset device identifier, the target rotation driving device is determined according to the signal strength of each rotation driving device; when only one rotation driving device is scanned, that rotation driving device is determined as the target rotation driving device.
After the target rotation driving device or the target lifting driving device is determined, communication connection is established with the target rotation driving device or the target lifting driving device, so that when the display lifting needs to be controlled, the target lifting driving device is controlled to drive the display to lift based on the communication connection with the target lifting driving device, and when the display rotation needs to be controlled, the target rotation driving device is controlled to drive the display to rotate based on the communication connection with the target rotation driving device. In this way, when a plurality of lifting driving devices or rotating driving devices with the same type and/or the same identification exist in the environment where the display device is located, driving devices matched with the display device can be screened from the plurality of devices according to the signal intensity of each device, so that the display device is prevented from being connected to other driving devices. The driving device matched with the display device refers to a driving device matched with the mechanical structure of the display device or a driving device mechanically connected with the display device.
For example, fig. 6 shows an application scenario of the present application in some exemplary embodiments, which may be, in particular, a simplified illustration of a shopping mall. As shown in fig. 6, in this scenario there are a plurality of smart televisions with a lifting function, each of which is equipped with at least one lift driving device for driving the screen of the smart television up and down. It should be appreciated that, in this scenario, a lift driving device that is powered on and not yet connected to a smart television will continuously send Bluetooth broadcast information to the outside, so that a smart television can discover it after turning on scanning. In such a scenario, after the Bluetooth module of a certain smart television starts scanning, it may scan a plurality of lift driving devices with the same type and/or identifier, including the lift driving device matched with this smart television and the lift driving devices matched with the other smart televisions in the environment. As can be seen from the above-described embodiments, in this case, the smart television determines the target lift driving device according to the signal strength of each lift driving device and establishes a communication connection with it, so that connecting to the wrong lift driving device can be avoided.
In addition, if the user changes the lifting driving device or the rotating driving device of the display device, after the change is completed, the Bluetooth switch of the display device is closed and opened again, and then the pairing and connection flow of the new lifting driving device or the new rotating driving device can be triggered.
In some embodiments, when the controller detects that the Bluetooth switch is turned on, it sends a scan instruction to the Bluetooth module, where the scan instruction includes the preset device identifiers. After receiving the scan instruction, the Bluetooth module starts scanning. It should be understood that, after the Bluetooth module starts scanning, it may receive the Bluetooth broadcast information sent by external devices, where the broadcast information includes data such as the device identifier and the signal strength RSSI (Received Signal Strength Indication) value. The Bluetooth module matches the device identifier of each external device against the preset device identifiers, determines an external device whose identifier matches the first preset device identifier as a lift driving device, and determines an external device whose identifier matches the second preset device identifier as a rotation driving device. Here, a device identifier matching a preset device identifier includes, but is not limited to, being identical to the preset device identifier. In this way, the display device can filter out other external devices according to the preset device identifiers, and keep only the scan results of the lift driving devices and the rotation driving devices.
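A minimal sketch of this identifier-based filtering step, with made-up identifier strings standing in for the first and second preset device identifiers:

```python
LIFT_DEVICE_ID = "LIFT-DRV"    # hypothetical first preset identifier
ROTATE_DEVICE_ID = "ROT-DRV"   # hypothetical second preset identifier

def classify_scan_results(adverts):
    """Split Bluetooth advertisements into lift / rotation drive devices,
    dropping every other external device.  Each advert is a dict with
    'id' (device identifier) and 'rssi' (signal strength)."""
    lifts, rotators = [], []
    for ad in adverts:
        if ad["id"] == LIFT_DEVICE_ID:
            lifts.append(ad)
        elif ad["id"] == ROTATE_DEVICE_ID:
            rotators.append(ad)
        # anything else (phones, speakers, ...) is silently discarded
    return lifts, rotators
```

Exact-string matching is only one instance of "matching"; the text notes that other matching rules are possible.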
In some embodiments, when the bluetooth module scans a plurality of lift drive devices according to the first preset device identification, a target lift drive device is determined according to the signal strength of each lift drive device. And when the Bluetooth module scans a plurality of rotary driving devices according to the second preset device identification, determining a target rotary driving device according to the signal intensity of each rotary driving device.
In one possible implementation, the lift driving device with the highest signal strength is determined as the target lift driving device. For example, the signal strength of a lift driving device may be evaluated based on the average of its RSSI values over a period of time, and the signal strength of a rotation driving device may likewise be evaluated based on the average of its RSSI values over a period of time. For example, if the Bluetooth module scans a plurality of lift driving devices according to the preset device identifier, the Bluetooth module continues scanning, obtaining a preset number of RSSI values for each lift driving device by continuously receiving the Bluetooth broadcast information they send. After a preset number of RSSI values has been obtained for each lift driving device, the Bluetooth module stops scanning and feeds back the scanned device information of each lift driving device, together with its preset number of RSSI values, to the controller. The controller calculates the average RSSI value of each lift driving device from its preset number of RSSI values, and determines the lift driving device with the largest average RSSI value as the target lift driving device.
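The selection rule (average a preset number of RSSI samples per device, pick the device with the largest average) can be expressed compactly; the device addresses and sample counts below are illustrative:

```python
def pick_target_device(rssi_samples):
    """rssi_samples maps a device address to the list of RSSI readings
    collected for it; the device with the highest mean RSSI (i.e. the
    strongest, presumably nearest, signal) is chosen as the target."""
    def mean(xs):
        return sum(xs) / len(xs)
    return max(rssi_samples, key=lambda dev: mean(rssi_samples[dev]))
```

RSSI values are negative dBm figures, so "largest average" means closest to zero.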
In some embodiments, Bluetooth Low Energy (BLE) connections are established over the GATT protocol, and the display device communicates with the rotation driving device and/or the lift driving device based on GATT. GATT, short for Generic Attribute Profile, is a general specification over Bluetooth connections for transmitting and receiving very short data segments, referred to as attributes. GATT defines that two BLE devices communicate through Services and Characteristics. A Characteristic can be understood as a tag through which the desired content can be read or written. In the profile deployed on a BLE slave, each Service represents a capability of the slave, such as a power information service or a system information service; each Service contains a plurality of Characteristics, and each specific Characteristic is a subject of BLE communication. For example, if the current battery level is 80% and the slave's profile contains a Characteristic indicating the current battery level, the host can read through that Characteristic that the current battery level is 80%.
It should be noted that, in the GATT protocol, both the Services and the Characteristics of a BLE slave device are identified by UUIDs, so if two BLE devices are to establish a GATT connection, the BLE slave device needs to open the UUID service. Based on this, in the process of scanning for the rotation driving device and/or the lift driving device, after determining that a scanned external device is a rotation driving device or a lift driving device, it is further necessary to determine whether each rotation driving device or lift driving device has opened the UUID service, so as to filter out those that have not, and to determine the target driving device from the rotation driving devices or lift driving devices that have opened the UUID service.
In some embodiments, after determining the target driving device, the controller 250 establishes a GATT communication connection with the target driving device, so as to control the target driving device to drive the display to rotate or lift based on the GATT communication connection. In a specific implementation, the controller 250 first pairs with the target driving device and, if pairing succeeds, establishes the GATT communication connection with the target driving device according to the GATT procedure. If pairing fails or the GATT communication connection fails, the rotation driving device or the lift driving device is scanned again according to the preset device identifier, and the subsequent steps are executed according to the scan result.
The rotation lifting control system acts as the GATT client, and the rotation driving device and the lift driving device act as GATT servers. When the switch of the Bluetooth module of the display device is turned on, the rotation lifting control system performs Bluetooth pairing with the rotation driving device and the lift driving device, and establishes GATT communication connections with them after pairing succeeds. After the connection succeeds, the rotation lifting control system, as the GATT client, can periodically query information such as the state, position and faults of the rotation driving device and the lift driving device acting as GATT servers; meanwhile, when the rotation driving device or the lift driving device fails, or lifts to a limit switch, it can actively feed its state back to the rotation lifting control system, ensuring that each application reads the latest rotation and lifting state.
It should be noted that, in a GATT-based network topology, one BLE master (also referred to as a central device) may be connected to a plurality of BLE slaves (also referred to as peripheral devices), while one BLE slave may be connected to only one BLE master. It should be understood that, in the present application, after the display device establishes the GATT communication connection with the target driving device, the display device is the BLE master and the target driving device is the BLE slave. Once the target driving device is successfully connected to the display device, it stops sending Bluetooth broadcast information to the outside.
It should be noted that the two sides of GATT communication are in a client/server relationship: the peripheral device is defined as the GATT server, on which the Services and Characteristics are defined, while the central device acts as the GATT client and initiates requests to the server. All communication events are initiated by the client and completed by receiving a response from the server. It should be understood that, in the present application, after the display device establishes the GATT communication connection with the target driving device, the display device, as the GATT client, initiates requests to the server, for example requests to periodically query information such as the rotation and lifting state, position and faults of the target driving device; the target driving device, as the GATT server, returns responses to the client.
As can be seen from the above embodiments, according to the display device provided by the embodiment of the present application, when a plurality of rotation driving devices and lifting driving devices with the same type and/or the same identifier exist in an environment where the display device is located, a target rotation driving device and a target lifting driving device matched with the display device can be selected from the plurality of rotation driving devices and lifting driving devices according to signal intensities of the rotation driving devices and the lifting driving devices, so that the display device is prevented from being connected to other driving devices. In addition, if the user changes the rotation driving device or the lifting driving device of the display device, after the change is completed, the Bluetooth switch of the display device is turned off and turned on again, and the pairing and connection flow of the new rotation driving device or the lifting driving device can be triggered.
In some embodiments, a user may control the display to rotate or lift by operating a remote control: the user opens a system setup application, accesses an interface for operating the rotation and lifting of the display, and operates the corresponding items in that interface.
For a touch display device, in order to save user operations and let the user operate the rotation and lifting of the display more conveniently, the user can also input a user gesture by touching the touch display; if the input user gesture is a predetermined gesture for controlling rotation or lifting, the controller 250 controls the rotation driving device or the lift driving device to drive the display to rotate or lift.
In some embodiments, the controller 250 is configured to: an input user gesture is received. Whether the user gesture corresponds to a predetermined gesture is determined. If the user gesture is a preset rotation gesture, controlling a rotation driving device according to the user gesture so as to drive a display to rotate through the rotation driving device; if the user gesture is a preset lifting gesture, the lifting driving device is controlled according to the user gesture so as to drive the display to ascend or descend through the lifting driving device.
It should be understood that in the above embodiments, the user gesture refers to a contact track of the user with the display screen. The controller 250 determines whether the input user gesture is a predetermined rotation gesture or a predetermined lifting gesture by matching the characteristics of the contact trajectory with the characteristics corresponding to the predetermined rotation gesture and the predetermined lifting gesture. The matching process is the process of recognizing the gesture of the user.
In some embodiments, if the user wants to input a predetermined rotation gesture or a predetermined lifting gesture, the user must first input a predetermined trigger gesture. In the process of recognizing the user gesture, if the predetermined trigger gesture is recognized, a preset picture is displayed on the top layer of the user interface as a mask over the graphical interactive objects in the user interface, preventing the user from touching those objects, and thus falsely triggering them, while inputting the predetermined rotation gesture or the predetermined lifting gesture. While the preset picture is displayed, the subsequently input contact track is continuously recognized; if the complete contact track corresponds to neither the predetermined rotation gesture nor the predetermined lifting gesture, the preset picture is withdrawn to restore the original user interface, and if the complete contact track corresponds to the predetermined rotation gesture or the predetermined lifting gesture, the preset picture is withdrawn when the user's contact with the display is detected to be disconnected, restoring the original user interface.
In some embodiments, if continuous multi-finger contact is identified and its duration exceeds a third preset time, recognition of the predetermined trigger gesture is confirmed. Illustratively, when the user's five fingers are detected to be simultaneously in contact with the touch display for more than 2 s, receipt of the predetermined trigger gesture is confirmed.
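A sketch of the trigger-gesture check, using the five-finger / 2 s example values from the text as defaults:

```python
def is_trigger_gesture(finger_count, duration_s,
                       required_fingers=5, min_duration_s=2.0):
    """The predetermined trigger gesture: the required number of fingers
    held in continuous contact for longer than the third preset time."""
    return finger_count >= required_fingers and duration_s > min_duration_s
```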
In some embodiments, if the input user gesture includes a single-finger continuous fixed contact and a multi-finger arc sliding contact that are synchronized to input, the user gesture is determined to be a predetermined rotation gesture.
For example, as shown in fig. 7, the user may simultaneously touch five fingers at any position on the screen, hold the thumb stationary, and slide the other four fingers in a clockwise or counterclockwise arc with the thumb as reference, to input the predetermined rotation gesture. In this example, the contact of the thumb is the single-finger continuous fixed contact, and the contact of the other four fingers is the multi-finger arc sliding contact; since all five fingers are in contact with the screen at the same time throughout, the single-finger continuous fixed contact and the multi-finger arc sliding contact are input synchronously.
FIG. 8 is a schematic diagram of a user contact trajectory in some exemplary embodiments of the application. Wherein A1, A2, A3, A4 and A5 are respectively initial contacts of five-finger contact, and B1, B2, B3, B4 and B5 are respectively disconnection points of the five-finger contact. A1 and B1 are coincident, namely single-point continuous contact. The arcs formed by connecting A2 with B2, A3 with B3, A4 with B4, A5 with B5 are arc contact tracks corresponding to sliding of other four fingers respectively, the starting point of each contact track is a corresponding initial contact point, and the end point is a corresponding disconnection point. The four contact tracks enclose a single point contact point A1 (B1). In connection with fig. 8, the display device receives a user gesture by detecting a contact trajectory of a user with a screen contact. If a single point contact and four arc contact trajectories around the point of contact of the single point contact are detected simultaneously, the user gesture is determined to be a predetermined rotation gesture.
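One way to approximate the detection described above (one stationary contact plus several tracks curving around it) is to require exactly one near-stationary track and to check that every moving track keeps a roughly constant radius from the fixed contact; the tolerances below are illustrative assumptions, not values from the source:

```python
import math

def is_rotation_gesture(tracks, fix_tol=10.0):
    """tracks: list of ((x0, y0), (x1, y1)) contact trajectories, start to
    end, in screen pixels.  Requires exactly one (near-)stationary contact
    (the thumb, A1 == B1 in fig. 8) plus at least two sliding contacts
    whose start and end points lie at roughly the same distance from the
    fixed point, i.e. they moved along an arc around it."""
    fixed = [t for t in tracks if math.dist(t[0], t[1]) <= fix_tol]
    moving = [t for t in tracks if math.dist(t[0], t[1]) > fix_tol]
    if len(fixed) != 1 or len(moving) < 2:
        return False
    center = fixed[0][0]
    for start, end in moving:
        r0 = math.dist(start, center)
        r1 = math.dist(end, center)
        # An arc around the fixed point keeps its radius roughly constant.
        if abs(r0 - r1) > 0.25 * max(r0, r1):
            return False
    return True
```

A production recognizer would also sample intermediate points of each track, not just its endpoints.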
It should be appreciated that the user may also touch the thumb, index finger and middle finger simultaneously at any position on the screen, hold the thumb stationary, and slide the other two fingers in a clockwise or counterclockwise arc with the thumb as reference, to input the predetermined rotation gesture. In this example, the contact of the thumb is the single-finger continuous fixed contact, and the contact of the other two fingers is the multi-finger arc sliding contact; the two are input synchronously. The user may also use another finger instead of the thumb to input the single-finger continuous fixed contact.
In other embodiments, if the input contact is a synchronously input single-point continuous fixed contact and multi-finger arc sliding contact, and the angle corresponding to the multi-finger arc sliding contact is greater than a preset angle, the contact is determined to correspond to the predetermined rotation gesture. Note that the criterion for comparing the angle of the multi-finger arc sliding contact with the preset angle can be set in advance as required. For example, if the angle corresponding to any one arc sliding contact is greater than the preset angle, the angle corresponding to the multi-finger arc sliding contact is determined to be greater than the preset angle; or, if the angle corresponding to every arc sliding contact is greater than the preset angle, the angle corresponding to the multi-finger arc sliding contact is determined to be greater than the preset angle.
For example, in the example shown in fig. 7, if the angle of rotation of the other four fingers clockwise or counterclockwise with reference to the thumb exceeds the preset angle, the input contact corresponds to a predetermined rotation gesture.
FIG. 9 is a schematic diagram of a user contact trajectory in some exemplary embodiments of the application. C1, C2, C3, C4 and C5 are the initial contacts of the five-finger contact, and D1, D2, D3, D4 and D5 are the corresponding disconnection points. C1 and D1 coincide, i.e. the single-point continuous contact. The arcs connecting C2 with D2, C3 with D3, C4 with D4, and C5 with D5 are the arc contact tracks corresponding to the other four fingers; the starting point of each contact track is the corresponding initial contact, and the end point is the corresponding disconnection point. The four contact tracks surround the single-point contact C1 (D1). The included angle 1 between C1C2 and D1D2 is the sliding angle of the index finger, the included angle 2 between C1C3 and D1D3 is the sliding angle of the middle finger, the included angle 3 between C1C4 and D1D4 is the sliding angle of the ring finger, and the included angle 4 between C1C5 and D1D5 is the sliding angle of the little finger. In connection with fig. 9, the display device receives a user gesture by detecting the contact trajectory of the user with the screen. If a single-point contact and four contact tracks surrounding its contact point are detected at the same time, and the angle corresponding to each contact track is greater than the preset angle, the user gesture is determined to be the predetermined rotation gesture.

In some embodiments, if the input contact corresponds to the predetermined rotation gesture, the rotation driving device is controlled to drive the display to rotate 90° in the direction of the multi-finger arc sliding. The direction of the multi-finger arc sliding is the direction from the start point to the end point of the contact track, and it can be determined from any one of the contact tracks. In the example shown in fig. 8, the direction from the start point to the end point is clockwise; in the example shown in fig. 9, it is counterclockwise.
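Determining whether a finger swept clockwise or counterclockwise around the fixed contact can be done with the sign of a 2-D cross product; this sketch assumes screen coordinates with y growing downward, as is typical for touch input:

```python
def arc_direction(center, start, end):
    """Sign of the 2-D cross product of (start - center) and (end - center)
    tells which way the finger swept around the fixed contact.  With screen
    coordinates (y grows downward), a positive cross product corresponds to
    a clockwise sweep as seen on screen."""
    (cx, cy), (sx, sy), (ex, ey) = center, start, end
    cross = (sx - cx) * (ey - cy) - (sy - cy) * (ex - cx)
    return "clockwise" if cross > 0 else "counterclockwise"
```

For example, a finger moving from the right of the thumb to below it has swept clockwise on screen.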
In a specific implementation manner, an application layer recognizes an input user gesture, and when a preset rotation gesture is recognized and the sliding direction of the input multi-finger arc sliding is clockwise, a clockwise rotation instruction is sent to a rotation lifting control system; and when the preset rotation gesture is recognized and the sliding direction of the input multi-finger arc sliding is a counterclockwise direction, sending a counterclockwise rotation instruction to the rotation lifting control system. After receiving the clockwise rotation instruction or the anticlockwise rotation instruction, the rotary lifting control system judges whether the lifting driving equipment is in a working state, if the lifting driving equipment is not in the working state, the rotary lifting control system sends the clockwise rotation instruction or the anticlockwise rotation instruction to the rotary driving equipment, and if the lifting driving equipment is in the working state, after the lifting driving equipment stops, the rotary lifting control system sends the clockwise rotation instruction or the anticlockwise rotation instruction to the rotary driving equipment. The rotary drive device will control the display to rotate 90 deg. clockwise in response to a clockwise rotation command and to rotate 90 deg. counterclockwise in response to a counterclockwise rotation command.
In some embodiments, in order to control the rotation of the display according to the complete contact track, when the contact of the user with the touch display is detected to be disconnected under the condition that a preset rotation gesture is recognized, the rotation of the display is controlled according to the input contact.
For example, referring to fig. 7, if, while the display is in the landscape orientation, the user touches five fingers to the display simultaneously, holds the thumb still, rotates the other four fingers clockwise by more than 30° with the thumb as reference, and then lifts all five fingers, the display will rotate 90° clockwise under the drive of the rotation driving device to assume the portrait orientation shown in fig. 10. Referring to fig. 10, if, while the display is in the portrait orientation, the user touches five fingers to the display simultaneously, holds the thumb still, rotates the other four fingers counterclockwise by more than 30° with the thumb as reference, and then lifts all five fingers, the display will rotate 90° counterclockwise under the drive of the rotation driving device to return to the landscape orientation shown in fig. 7.
In some embodiments, if the contact trajectory corresponding to the input user gesture is a multi-finger linear sliding contact and the direction of the multi-finger sliding contact matches the vertical direction, then it is determined that the contact corresponds to the predetermined lift gesture.
For example, as shown in fig. 11, the user may touch five fingers at any position of the screen while sliding up or down to input a predetermined lifting gesture.
FIG. 12 is a schematic diagram of a user contact trajectory in some exemplary embodiments of the application. E1, E2, E3, E4 and E5 are the initial contacts of the five-finger contact, and F1, F2, F3, F4 and F5 are the corresponding disconnection points. The lines connecting E1 with F1, E2 with F2, E3 with F3, E4 with F4, and E5 with F5 are the straight-line contact tracks corresponding to the five fingers; the starting point of each contact track is the corresponding initial contact, and the end point is the corresponding disconnection point. The direction of each contact track matches the vertical direction. In connection with fig. 12, the display device receives a user gesture by detecting the contact trajectory of the user with the screen. If five straight contact tracks matching the vertical direction are detected at the same time, the user gesture is determined to be the predetermined lifting gesture.
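A sketch of the straight-slide check: each track's deviation from the vertical axis is measured as an angle, with a tolerance that is an illustrative assumption rather than a value from the source:

```python
import math

def is_lift_gesture(tracks, min_fingers=3, max_tilt_deg=30.0):
    """tracks: list of ((x0, y0), (x1, y1)) straight contact trajectories
    in screen pixels (y grows downward).  Requires several roughly vertical
    slides; min_fingers reflects the text's three-to-five-finger variants."""
    if len(tracks) < min_fingers:
        return False
    for (x0, y0), (x1, y1) in tracks:
        dx, dy = x1 - x0, y1 - y0
        if dy == 0:
            return False  # purely horizontal, cannot match vertical
        # Angle between the slide and the vertical axis.
        tilt = math.degrees(math.atan2(abs(dx), abs(dy)))
        if tilt > max_tilt_deg:
            return False
    return True
```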
It should be appreciated that the user may also touch any position of the screen with three or four fingers while sliding up or down to enter a predetermined lift gesture.
In other embodiments, if the contact track corresponding to the user gesture is multi-finger linear sliding contact and the contact duration is greater than the first preset time, it is determined that the user gesture corresponds to the predetermined lifting gesture. For example, in the example shown in fig. 11, when the user touches the five fingers at any position of the screen while sliding up or down, and the contact duration exceeds a first preset time, the user gesture is determined to be a predetermined lift gesture.
In other embodiments, if the input user gesture is a continuous input multi-finger linear sliding contact and a multi-finger continuous fixed contact, wherein the contact point of the multi-finger continuous contact is a sliding termination point of the multi-finger linear sliding, and the duration of the multi-finger continuous fixed contact is greater than a second preset time, the contact is determined to correspond to the predetermined lifting gesture. For example, in the example shown in fig. 11, when the user touches the five fingers at an arbitrary position of the screen while sliding up or down, and after the sliding is terminated, the user gesture is determined to be a predetermined lifting gesture by staying at the sliding termination point for more than a second preset time.
In some embodiments, if the user gesture corresponds to a predetermined lift gesture, the lift driving device is controlled according to the user gesture to drive the display up or down by the lift driving device.
In a specific implementation manner, an application layer recognizes an input user gesture, and when a preset lifting gesture is recognized and the sliding direction of the input multi-finger linear sliding is matched with the upward direction, a lifting instruction is sent to a rotary lifting control system; and when the preset lifting gesture is recognized and the sliding direction of the input multi-finger linear sliding is matched with the downward direction, sending a descending instruction to the rotary lifting control system. After receiving the ascending instruction or the descending instruction, the rotary lifting control system judges whether the rotary driving equipment is in a working state, if the rotary driving equipment is not in the working state, the ascending instruction or the descending instruction is sent to the lifting driving equipment, and if the rotary driving equipment is in the working state, the ascending instruction or the descending instruction is sent to the lifting driving equipment after the rotary driving equipment is stopped. The lifting driving device will drive the display up in response to the up command and drive the display down in response to the down command.
The sliding direction matches the upward direction when the component of the sliding direction in the vertical direction points upward; it matches the downward direction when that component points downward.
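The direction-matching rule and the arbitration between the two driving devices can be sketched as follows. This is a minimal Python sketch: the device classes, method names, and the string commands are illustrative stand-ins, not the patent's actual interfaces.

```python
def matches_upward(dx: float, dy: float) -> bool:
    """A slide 'matches the upward direction' when the vertical component
    of its direction points up (dy > 0), whatever the horizontal part."""
    return dy > 0

class RotationDevice:
    def __init__(self):
        self.working = False
    def is_working(self) -> bool:
        return self.working
    def stop(self):
        self.working = False

class LiftDevice:
    def __init__(self):
        self.last_command = None
    def execute(self, command: str):
        self.last_command = command   # "ascend" drives the display up

class RotationLiftControlSystem:
    """Arbitration sketch: a lift instruction is forwarded to the lifting
    device only after the rotation device has stopped, mirroring the
    'stop rotation first, then lift' rule in the embodiment."""
    def __init__(self, rotation: RotationDevice, lift: LiftDevice):
        self.rotation, self.lift = rotation, lift
    def on_lift_gesture(self, dx: float, dy: float):
        command = "ascend" if matches_upward(dx, dy) else "descend"
        if self.rotation.is_working():
            self.rotation.stop()      # rotation must finish before lifting
        self.lift.execute(command)
```

Note that a diagonal slide still counts as upward or downward: only the sign of the vertical component is consulted.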
In some embodiments, so that the display is controlled according to the complete contact track, when the predetermined lifting gesture is recognized, the display is driven to ascend or descend only after the contact between the user and the touch display is detected to be broken.
In some embodiments, the user may control how far the display ascends or descends by adjusting the sliding distance of the multi-finger linear sliding contact. Specifically, a correspondence between the sliding distance and the lifting distance may be established in advance, and the lifting distance corresponding to a given sliding distance is then determined from this preset correspondence. Optionally, in the preset correspondence, the larger the sliding distance, the larger the corresponding lifting distance.
For example, if the sliding distance of the multi-finger linear sliding contact is greater than a preset distance (e.g., 15 cm), the lifting assembly is controlled to drive the display to ascend or descend by a first distance; if it is not greater than the preset distance, the lifting assembly is controlled to drive the display to ascend or descend by a second distance, where the second distance is smaller than the first distance.
It is noted that the criterion for comparing the sliding distance with the preset distance may be set as required. For example, the sliding distance of the multi-finger linear sliding contact may be determined to be greater than the preset distance if the sliding distance of any one of the linear sliding contacts exceeds it; alternatively, this may be determined only if the sliding distance of every linear sliding contact exceeds it.
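The two-tier distance mapping and the "any one finger" versus "every finger" criteria can be sketched as follows. The threshold values are taken from the worked example in this section and are illustrative, and the function names are the author's own.

```python
# Illustrative values from the worked example; not fixed by the embodiment.
PRESET_DISTANCE_CM = 15.0
FIRST_DISTANCE_MM = 200.0    # lift for slides longer than the preset distance
SECOND_DISTANCE_MM = 50.0    # lift for shorter slides

def slide_length(start: tuple, end: tuple) -> float:
    """Euclidean length of one finger's slide, in cm."""
    return ((end[0] - start[0]) ** 2 + (end[1] - start[1]) ** 2) ** 0.5

def lifting_distance(finger_slides, require_all: bool = False) -> float:
    """finger_slides: one (start, end) point pair per finger, in cm.
    require_all=False implements the 'any one finger exceeds' criterion;
    require_all=True implements the 'every finger exceeds' criterion."""
    lengths = [slide_length(s, e) for s, e in finger_slides]
    if require_all:
        exceeded = all(length > PRESET_DISTANCE_CM for length in lengths)
    else:
        exceeded = any(length > PRESET_DISTANCE_CM for length in lengths)
    return FIRST_DISTANCE_MM if exceeded else SECOND_DISTANCE_MM
```

With these values, a 10 cm five-finger slide maps to a 50 mm lift and a 20 cm slide to a 200 mm lift, matching the worked example in this section.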
Referring to fig. 12, the distances between E1 and F1, E2 and F2, E3 and F3, E4 and F4, and E5 and F5 are the sliding distances of the respective linear sliding contacts. Illustratively, if, while the display is at the first height shown in fig. 11, the user touches the display with five fingers simultaneously, slides up 10 cm, and lifts the fingers after resting at the termination point for more than 10 s (or after the total contact time exceeds 10 s), the display rises 50 mm under the drive of the lifting assembly, reaching the second height shown in fig. 13: second height = first height + 50 mm. If, while the display is at the second height shown in fig. 13, the user touches the display with five fingers simultaneously, slides down 20 cm, and lifts the fingers in the same manner, the display descends 200 mm under the drive of the lifting assembly, reaching the third height shown in fig. 14: third height = second height − 200 mm. In this example, the preset distance is 15 cm, the first distance is 200 mm, and the second distance is 50 mm.
According to the display device provided by the embodiments of the application, the user can control the rotation or lifting of the display by inputting the corresponding preset gesture. This makes operating the rotation and lifting of the display more convenient, saves user operations, and improves the user experience.
It should be noted that user gestures include touch gestures collected by the touch-sensitive component and mid-air gestures recognized from user images collected by the camera. Based on the idea of controlling the rotation and/or lifting of the display device through touch gestures in the embodiments of the application, a person skilled in the art can derive, without creative labor, a technical scheme for controlling the rotation and/or lifting of the display device through mid-air gestures.
Based on the display device provided by the embodiments of the application, the embodiments further provide a display device control method. As shown in fig. 15, the method may include:
S151: receiving an input user gesture.
S152: if the user gesture is a predetermined rotation gesture, controlling a rotation driving device to drive a display to rotate according to the user gesture.
In some embodiments, the user gesture is determined to be a predetermined rotation gesture if it is a synchronously input single-finger continuous fixed contact and multi-finger arc sliding contact, for example the user gestures described in the embodiments corresponding to figs. 7-9.
In other embodiments, if the user gesture is a synchronously input single-point continuous fixed contact and multi-finger arc sliding contact, and the angle corresponding to the multi-finger arc sliding contact is greater than a preset angle, the user gesture is determined to be a predetermined rotation gesture.
In some embodiments, if the sliding direction of the multi-finger arc sliding contact is the clockwise direction, the rotation driving device is controlled to drive the display to rotate 90° in the clockwise direction; if the sliding direction is the counterclockwise direction, the rotation driving device is controlled to drive the display to rotate 90° in the counterclockwise direction.
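The 90° stepping rule above can be expressed as a small orientation function. This Python sketch is illustrative; the direction constants and function name are assumptions, not part of the embodiment.

```python
CLOCKWISE, COUNTERCLOCKWISE = "cw", "ccw"

def next_orientation(current_deg: int, slide_direction: str) -> int:
    """Return the display orientation after one rotation gesture:
    a clockwise arc slide rotates the display 90° clockwise, a
    counterclockwise arc slide 90° counterclockwise."""
    step = 90 if slide_direction == CLOCKWISE else -90
    return (current_deg + step) % 360
```

Starting from landscape (0°), one clockwise gesture yields portrait (90°); a counterclockwise gesture from 0° wraps to 270°.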
S153: if the user gesture is a predetermined lifting gesture, controlling the lifting driving device to drive the display to ascend or descend according to the user gesture.
In some embodiments, if the user gesture is a multi-finger linear sliding contact and the sliding direction of the multi-finger linear sliding contact matches the vertical direction, the user gesture is determined to be a predetermined lifting gesture, for example the user gestures described in the embodiments corresponding to figs. 11-12.
In other embodiments, if the user gesture is a multi-finger linear sliding contact and the contact duration is greater than a first preset time, the user gesture is determined to be a predetermined lifting gesture.
In other embodiments, if the user gesture is a consecutively input multi-finger linear sliding contact and multi-finger continuous fixed contact, where the contact point of the multi-finger continuous fixed contact is the sliding termination point of the multi-finger linear slide and the duration of the fixed contact is greater than a second preset time, the user gesture is determined to be a predetermined lifting gesture.
In some embodiments, if the sliding direction of the multi-finger linear sliding contact matches the upward direction, the lifting driving device is controlled to drive the display to ascend; if it matches the downward direction, the lifting driving device is controlled to drive the display to descend.
In addition, the lifting distance corresponding to the sliding distance of the multi-finger linear sliding contact may be determined according to the preset correspondence between sliding distance and lifting distance, and the lifting driving device controlled to drive the display to ascend or descend by that lifting distance.
In some embodiments, the driving device is controlled to drive the display to rotate, ascend, or descend according to the contact only when the contact is detected to be broken.
In some embodiments, if the contact is a multi-finger continuous contact and its duration is greater than a third preset time, the contact is determined to be a predetermined trigger gesture. If the contact corresponds to the predetermined trigger gesture, a preset picture is displayed on the top layer of the user interface so as to cover the graphical interaction objects in the user interface; when the user gesture is determined not to be the predetermined rotation gesture or the predetermined lifting gesture, or when the contact between the user and the display is detected to be broken, the preset picture is withdrawn.
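The trigger-gesture overlay described above amounts to a small state machine. The following Python sketch is illustrative: the class, method names, and the 1.5 s threshold are assumptions standing in for the embodiment's actual implementation.

```python
THIRD_PRESET_TIME = 1.5  # assumed "third preset time", seconds

class GestureOverlay:
    """Sketch of the trigger-gesture overlay: a preset picture covers the
    UI's interactive objects while a rotation/lift gesture is expected,
    and is withdrawn when the gesture is invalid or contact is broken."""
    def __init__(self):
        self.overlay_shown = False

    def on_multi_finger_hold(self, duration: float):
        if duration > THIRD_PRESET_TIME:
            self.overlay_shown = True   # predetermined trigger gesture

    def on_gesture_classified(self, is_rotate: bool, is_lift: bool):
        if not (is_rotate or is_lift):
            self.overlay_shown = False  # not a control gesture: withdraw

    def on_contact_broken(self):
        self.overlay_shown = False      # fingers lifted: withdraw
```

Covering the interactive objects while the overlay is shown prevents the multi-finger control gesture from being misinterpreted as clicks on the underlying UI.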
According to the display device control method provided by the embodiments of the application, the user can input preset gestures for controlling the rotation or lifting of the display by touching the touch display. This makes operating the rotation and lifting of the display more convenient, saves user operations, and improves the user experience.
In a specific implementation, the present invention further provides a computer storage medium, which may store a program; when executed, the program may perform some or all of the steps of each embodiment of the display device control method provided by the present invention. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random-access memory (RAM), or the like.
It will be apparent to those skilled in the art that the techniques of the embodiments of the present invention may be implemented by software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions of the embodiments of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium such as a ROM/RAM, a magnetic disk, or an optical disc, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present invention.
The various embodiments in this specification may refer to one another for the same or similar parts. In particular, since the display device embodiments are substantially similar to the method embodiments, their description is relatively brief; for relevant details, refer to the description of the method embodiments.
The embodiments of the present invention described above do not limit the scope of the present invention.

Claims (11)

1. A display device, characterized by comprising:
A display rotatable and/or liftable;
A rotation driving device for driving the display to rotate and/or a lifting driving device for driving the display to lift;
a controller configured to:
receiving an input user-triggered gesture;
if the user-triggered gesture is a preset trigger gesture, displaying a preset picture on the top layer of the user interface;
continuing to receive an input user gesture, the user gesture comprising a contact trajectory;
if the user gesture is a synchronously input single-finger continuous fixed contact and multi-finger arc sliding contact, controlling the rotation driving device to drive the display to rotate according to the user gesture;
if the user gesture is a multi-finger linear sliding contact and the sliding direction of the multi-finger linear sliding contact matches the vertical direction, controlling the lifting driving device to drive the display to ascend or descend according to the user gesture;
and if the user gesture is not a preset rotation gesture or a preset lifting gesture, or when the contact between the user's finger and the display is detected to be broken, withdrawing the preset picture.
2. The display device of claim 1, wherein the controller is configured to:
when the contact between the user and the display is detected to be broken, control the rotation driving device to drive the display to rotate according to the user gesture, or control the lifting driving device to drive the display to ascend or descend.
3. The display device according to claim 1, wherein:
if the user gesture is a synchronously input single-point continuous fixed contact and multi-finger arc sliding contact, and the angle corresponding to the multi-finger arc sliding contact is larger than a preset angle, determining that the user gesture is a predetermined rotation gesture;
and if the user gesture is a multi-finger linear sliding contact and the contact duration is longer than a first preset time, determining that the user gesture is a predetermined lifting gesture.
4. The display device of claim 1, wherein if the user gesture is a consecutively input multi-finger linear sliding contact and multi-finger continuous fixed contact, the contact point of the multi-finger continuous fixed contact is the sliding termination point of the multi-finger linear slide, and the duration of the multi-finger continuous fixed contact is greater than a second preset time, the user gesture is determined to be a predetermined lifting gesture.
5. The display device of claim 1, wherein if the user gesture is a multi-finger continuous contact and the contact duration is greater than a third preset time, the user gesture is determined to be a predetermined trigger gesture.
6. The display device of claim 3 or 4, wherein controlling the rotation driving device to drive the display to rotate according to the user gesture comprises:
if the sliding direction of the multi-finger arc sliding contact is the clockwise direction, controlling the rotation driving device to drive the display to rotate 90° in the clockwise direction;
and if the sliding direction of the multi-finger arc sliding contact is the counterclockwise direction, controlling the rotation driving device to drive the display to rotate 90° in the counterclockwise direction.
7. The display device according to claim 3 or 4, wherein controlling the lift driving device to drive the display up or down according to the user gesture comprises:
if the sliding direction of the multi-finger linear sliding contact matches the upward direction, controlling the lifting driving device to drive the display to ascend;
and if the sliding direction of the multi-finger linear sliding contact matches the downward direction, controlling the lifting driving device to drive the display to descend.
8. The display device according to claim 3 or 4, wherein controlling the lift driving device to drive the display up or down according to the user gesture comprises:
determining the lifting distance corresponding to the sliding distance of the multi-finger linear sliding contact according to the preset corresponding relation between the sliding distance and the lifting distance;
and controlling the lifting driving device to drive the display to ascend or descend by the lifting distance.
9. The display apparatus according to claim 8, wherein determining the lifting distance corresponding to the sliding distance of the multi-finger linear sliding contact according to the preset correspondence of the sliding distance and the lifting distance, comprises:
if the sliding distance of the multi-finger linear sliding contact is greater than a preset distance, determining the lifting distance as a first distance;
if the sliding distance of the multi-finger linear sliding contact is not greater than the preset distance, determining that the lifting distance is a second distance;
wherein the first distance is greater than the second distance.
10. The display device of claim 1, wherein the controller is further configured to:
in response to a Bluetooth switch-on instruction, scanning for the lifting driving device and/or the rotation driving device according to a preset device identifier;
when a plurality of lifting driving devices are scanned, determining a target lifting driving device according to their signal intensities, and when a plurality of rotation driving devices are scanned, determining a target rotation driving device according to their signal intensities;
and establishing a communication connection with the target lifting driving device and/or the target rotation driving device, so as to control, based on the communication connection, the target lifting driving device to drive the display to lift and/or the target rotation driving device to drive the display to rotate.
11. A display device control method, characterized by being applied to a display device that is rotatable and/or liftable; the method comprises the following steps:
receiving an input user-triggered gesture;
if the user-triggered gesture is a preset trigger gesture, displaying a preset picture on the top layer of the user interface;
continuing to receive an input user gesture, the user gesture comprising a contact trajectory;
if the user gesture is a synchronously input single-finger continuous fixed contact and multi-finger arc sliding contact, controlling the display device to rotate according to the user gesture;
if the user gesture is a multi-finger linear sliding contact and the sliding direction of the multi-finger linear sliding contact matches the vertical direction, controlling the display device to ascend or descend according to the user gesture;
and if the user gesture is not a preset rotation gesture or a preset lifting gesture, or when the contact between the user's finger and the display is detected to be broken, withdrawing the preset picture.
CN202110499421.3A 2021-04-30 2021-04-30 Display apparatus and control method thereof Active CN114296542B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110499421.3A CN114296542B (en) 2021-04-30 2021-04-30 Display apparatus and control method thereof
PCT/CN2021/096003 WO2022227159A1 (en) 2021-04-30 2021-05-26 Display device and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110499421.3A CN114296542B (en) 2021-04-30 2021-04-30 Display apparatus and control method thereof

Publications (2)

Publication Number Publication Date
CN114296542A CN114296542A (en) 2022-04-08
CN114296542B true CN114296542B (en) 2024-05-17

Family

ID=80963810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110499421.3A Active CN114296542B (en) 2021-04-30 2021-04-30 Display apparatus and control method thereof

Country Status (1)

Country Link
CN (1) CN114296542B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008109298A (en) * 2006-10-24 2008-05-08 Seiko Epson Corp Remote controller and device and system for displaying information
CN103902026A (en) * 2012-12-25 2014-07-02 鸿富锦精密工业(武汉)有限公司 System and method for automatically adjusting display screen
CN104461335A (en) * 2013-09-25 2015-03-25 联想(北京)有限公司 Data processing method and electronic instrument
CN105786349A (en) * 2016-02-26 2016-07-20 广东欧珀移动通信有限公司 Method for controlling screen picture to rotate and electronic device
CN106873765A (en) * 2016-12-27 2017-06-20 比亚迪股份有限公司 The switching method and apparatus of the screen state of car-mounted terminal
CN206497399U (en) * 2017-01-25 2017-09-15 吉首大学张家界学院 A kind of computer display based on gesture control
US10082902B1 (en) * 2016-07-07 2018-09-25 Rockwell Collins, Inc. Display changes via discrete multi-touch gestures
CN109027627A (en) * 2018-08-23 2018-12-18 广州视源电子科技股份有限公司 Show equipment and its spinning solution, device, system and wall hanging frame
CN109254657A (en) * 2018-08-23 2019-01-22 广州视源电子科技股份有限公司 The spinning solution and device of interactive intelligence equipment
CN111885406A (en) * 2020-07-30 2020-11-03 深圳创维-Rgb电子有限公司 Smart television control method and device, rotatable television and readable storage medium
CN111913608A (en) * 2020-07-31 2020-11-10 海信视像科技股份有限公司 Touch screen rotation control interaction method and display device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150026255A (en) * 2013-09-02 2015-03-11 삼성전자주식회사 Display apparatus and control method thereof
CN116600157A (en) * 2020-03-13 2023-08-15 海信视像科技股份有限公司 Display apparatus


Also Published As

Publication number Publication date
CN114296542A (en) 2022-04-08

Similar Documents

Publication Publication Date Title
CN113810746B (en) Display equipment and picture sharing method
CN113784200B (en) Communication terminal, display device and screen projection connection method
WO2022048203A1 (en) Display method and display device for manipulation prompt information of input method control
CN114157889B (en) Display equipment and touch control assisting interaction method
CN111970549A (en) Menu display method and display device
CN111901646A (en) Display device and touch menu display method
CN113630656B (en) Display device, terminal device and communication connection method
WO2022028060A1 (en) Display device and display method
CN111818654B (en) Channel access method and display device
CN112269668A (en) Application resource sharing and display equipment
CN114296542B (en) Display apparatus and control method thereof
CN112947783B (en) Display device
CN112650418B (en) Display device
WO2022227159A1 (en) Display device and control method
CN111913622B (en) Screen interface interactive display method and display equipment
CN113810747B (en) Display equipment and signal source setting interface interaction method
CN114007128A (en) Display device and network distribution method
CN114302377A (en) Display device and communication connection method
CN113485614A (en) Display apparatus and color setting method
CN114007129A (en) Display device and network distribution method
CN112732120A (en) Display device
CN113542882A (en) Method for awakening standby display device, display device and terminal
CN114079827A (en) Menu display method and display device
WO2022001635A1 (en) Display device and display method
CN114281284B (en) Display apparatus and image display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant