CN114157889B - Display equipment and touch control assisting interaction method - Google Patents

Display equipment and touch control assisting interaction method

Info

Publication number
CN114157889B
Authority
CN
China
Prior art keywords
touch
display
mapping
mapping window
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010831749.6A
Other languages
Chinese (zh)
Other versions
CN114157889A (en)
Inventor
马晓燕
庞秀娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202010831749.6A
Publication of CN114157889A
Application granted
Publication of CN114157889B
Legal status: Active
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438Window management, e.g. event handling following interaction with the user interface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a display device and a touch assistance interaction method, where the method can be applied to the display device to implement touch interaction. After acquiring a touch instruction for invoking touch assistance, the method responds to the touch instruction by generating a mapping window from the current user interface and displaying the mapping window in the area associated with the touch instruction. By inputting a specific touch instruction, the user can have the mapping window displayed in an area that is convenient to operate and perform touch operations inside the mapping window, thereby operating on the whole user interface or on a part of it. The user does not need to move around during operation and can complete the interaction conveniently.

Description

Display equipment and touch control assisting interaction method
Technical Field
The present application relates to the technical field of smart televisions, and in particular to a display device and a touch assistance interaction method.
Background
A smart television is a television product based on Internet application technology that integrates video, audio, entertainment, data and other functions. It has an open operating system and chip, provides an open application platform, supports two-way human-computer interaction, and is intended to meet users' diversified and personalized needs. To further satisfy users' interaction requirements, some smart televisions are equipped with a touch component so that they support touch interaction.
The touch component detects the user's touch position in real time and responds to the touch action according to the control located at that position, displaying preset content. Touch operations at different positions therefore trigger different responses and display different content through different controls. For example, when a user inputs a touch action at the position of a media asset link control, the smart television may access the media asset link in response and play the corresponding media asset content.
As technology develops, the screen size of smart televisions keeps growing. A larger screen can display more content and provide a better viewing experience, but for touch operation it also creates positions the user cannot reach. For example, when a user stands near the left side of the screen, the area near the right side of the screen cannot be touched, so the user has to walk over to the right side to operate it. This makes it inconvenient to complete interactive operations and degrades the user's interactive experience.
Disclosure of Invention
The application provides a display device and a touch assistance interaction method to solve the problem that conventional touch televisions make it inconvenient for users to complete interactive operations.
In a first aspect, the present application provides a display device including a display, a touch assembly, and a controller. Wherein the display is configured to display a user interface, the touch assembly is configured to detect a touch action entered by a user, and the controller is configured to perform the following program steps:
acquiring a touch instruction input by a user and used for executing touch assistance;
responding to the touch instruction, generating a mapping window according to the current user interface, wherein the mapping window comprises all or part of controls in the current user interface, and the controls in the mapping window have an association relation with the controls in the current user interface;
and displaying the mapping window in the association area of the touch instruction.
Based on the display device, the first aspect of the application further provides a touch assistance interaction method, which includes:
acquiring a touch instruction input by a user and used for executing touch assistance;
responding to the touch instruction, generating a mapping window according to the current user interface, wherein the mapping window comprises all or part of controls in the current user interface, and the controls in the mapping window have an association relation with the controls in the current user interface;
and displaying the mapping window in the association area of the touch instruction.
As can be seen from the above technical solutions, the first aspect of the present application provides a display device and a touch assistance interaction method, where the method can be applied to the display device to implement touch interaction. After acquiring a touch instruction for invoking touch assistance, the method responds to the touch instruction by generating a mapping window from the current user interface and displaying the mapping window in the area associated with the touch instruction. By inputting a specific touch instruction, the user can have the mapping window displayed in an area that is convenient to operate and perform touch operations inside the mapping window, thereby operating on the whole user interface or on a part of it without moving around, which makes it convenient to complete the interaction.
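For concreteness, the sketch below shows one way the geometry of such a mapping window might be computed: a scaled-down copy of the current user interface is placed next to the touch position carried by the touch-assist instruction and clamped so it stays on screen. This is a minimal sketch under assumed conventions, not code from the patent; every name and the 1/3 scale factor are hypothetical.

    public final class MappingWindowGeometry {

        /** Simple rectangle holder: position (x, y) plus width and height in pixels. */
        public static final class Rect {
            public final int x, y, width, height;
            public Rect(int x, int y, int width, int height) {
                this.x = x; this.y = y; this.width = width; this.height = height;
            }
            @Override public String toString() {
                return "Rect(" + x + ", " + y + ", " + width + "x" + height + ")";
            }
        }

        /**
         * Computes where a scaled-down copy of the current user interface could be
         * shown so that it sits next to the touch position that triggered touch
         * assistance. The scale factor (e.g. 1/3 of the screen) is an assumption.
         */
        public static Rect placeNearTouch(int screenW, int screenH,
                                          int touchX, int touchY, double scale) {
            int w = (int) Math.round(screenW * scale);
            int h = (int) Math.round(screenH * scale);

            // Centre the mapping window on the touch point, then clamp it so the
            // whole window stays inside the screen (the user's reachable area).
            int x = clamp(touchX - w / 2, 0, screenW - w);
            int y = clamp(touchY - h / 2, 0, screenH - h);
            return new Rect(x, y, w, h);
        }

        private static int clamp(int value, int min, int max) {
            return Math.max(min, Math.min(max, value));
        }

        public static void main(String[] args) {
            // A user near the lower-left corner of a 4K panel triggers touch assist.
            System.out.println(placeNearTouch(3840, 2160, 300, 1600, 1.0 / 3.0));
        }
    }

In this hypothetical example the mapping window hugs the left edge of the screen, close to where the gesture was made, which is the "association area" behaviour described above.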
In a second aspect, the present application provides a display device including a display, a touch assembly, and a controller. Wherein the display is configured to display a user interface, the touch assembly is configured to detect a touch action entered by a user, and the controller is configured to perform the following program steps:
acquiring an action instruction input by a user on a mapping window, wherein the mapping window comprises all or part of controls in a current user interface, and the controls in the mapping window have an association relationship with the controls in the current user interface;
responding to the action instruction, and extracting the touch position of the action instruction in a mapping window;
and executing the control action corresponding to the touch position in the current user interface according to the association relation.
Based on the display device, the second aspect of the application further provides a touch assistance interaction method, which includes:
acquiring an action instruction input by a user on a mapping window, wherein the mapping window comprises all or part of controls in a current user interface, and the controls in the mapping window have an association relationship with the controls in the current user interface;
responding to the action instruction, and extracting the touch position of the action instruction in a mapping window;
and executing the control action corresponding to the touch position in the current user interface according to the association relation.
According to the above technical solution, after the mapping window is displayed, the user can input an action instruction on the mapping window. The controller of the display device responds to the action instruction by extracting the corresponding touch position and then executing, according to the association relation, the control action corresponding to that position in the current user interface. Through the association relation between the controls in the mapping window and those in the current user interface, an operation on the mapping window is applied equivalently to the current user interface, so the user can operate the whole user interface from a region that is convenient to reach, which improves the user experience.
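As an illustration of this second aspect, a touch position inside the mapping window can be converted back into the corresponding position in the full user interface, where the control action is then dispatched. The sketch below assumes a simple linear association relation and hypothetical names; it is not the patent's actual implementation.

    public final class MappingWindowTouchRelay {

        // Mapping-window rectangle on the screen (x, y, width, height).
        private final int winX, winY, winW, winH;
        // Size of the full user interface being mirrored.
        private final int screenW, screenH;

        public MappingWindowTouchRelay(int winX, int winY, int winW, int winH,
                                       int screenW, int screenH) {
            this.winX = winX; this.winY = winY; this.winW = winW; this.winH = winH;
            this.screenW = screenW; this.screenH = screenH;
        }

        /** Returns {x, y} in full-screen coordinates for a touch at (touchX, touchY). */
        public int[] toScreen(int touchX, int touchY) {
            double fx = (touchX - winX) / (double) winW;   // fraction 0..1 inside the window
            double fy = (touchY - winY) / (double) winH;
            return new int[] { (int) (fx * screenW), (int) (fy * screenH) };
        }

        public static void main(String[] args) {
            MappingWindowTouchRelay relay =
                    new MappingWindowTouchRelay(100, 900, 1280, 720, 3840, 2160);
            int[] p = relay.toScreen(740, 1260);           // tap near the window centre
            // The control action would then be executed at (p[0], p[1]) in the
            // current user interface, via the stored association relation.
            System.out.println(p[0] + ", " + p[1]);
        }
    }
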
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control device in an embodiment of the present application;
fig. 2 is a hardware configuration block diagram of a display device in an embodiment of the present application;
fig. 3 is a hardware configuration block diagram of a control device in an embodiment of the present application;
fig. 4 is a schematic diagram of a software configuration of a display device according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an icon control interface display of a display device application in an embodiment of the present application;
FIG. 6 is a schematic diagram of a control home page in an embodiment of the present application;
fig. 7 is a flowchart of a touch assistance interaction method in an embodiment of the present application;
FIG. 8 is a schematic diagram of a control home map window in an embodiment of the present application;
FIG. 9 is a flowchart of a touch instruction acquisition process by monitoring a touch event in an embodiment of the present application;
FIG. 10 is a schematic diagram of a multi-finger touch command according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an edge slide touch command in an embodiment of the disclosure;
FIG. 12 is a schematic diagram of a gesture touch command for two-handed expansion in an embodiment of the present application;
FIG. 13 is a schematic diagram of a single-hand spread gesture touch command in an embodiment of the present application;
FIG. 14 is a schematic diagram of a rotating gesture touch command in an embodiment of the present application;
FIG. 15 is a schematic diagram of a multi-finger co-sliding gesture touch command in an embodiment of the present application;
FIG. 16 is a schematic diagram illustrating the input of a touch command through a touch menu in an embodiment of the present application;
FIG. 17 is a diagram of a map window display indirect touch area screen according to an embodiment of the present application;
FIG. 18 is a diagram of a mapping window display custom region screen according to an embodiment of the present application;
FIG. 19 is a schematic view of a touch pad display according to an embodiment of the present disclosure;
FIG. 20 is a flowchart illustrating a mapping window in an embodiment of the present application;
FIG. 21 is a schematic view of touch point locations in an embodiment of the present application;
FIG. 22 is a flowchart illustrating another touch-assisted interaction method according to an embodiment of the present disclosure;
FIG. 23 is a schematic diagram of a refresh map window in an embodiment of the present application;
fig. 24 is a schematic diagram of a refresh display application detail interface in an embodiment of the present application.
Detailed Description
To make the purposes, embodiments and advantages of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only some, not all, of the embodiments of the present application.
Based on the exemplary embodiments described herein, all other embodiments that may be obtained by one of ordinary skill in the art without making any inventive effort are within the scope of the claims appended hereto. Furthermore, while the disclosure is presented in the context of an exemplary embodiment or embodiments, it should be appreciated that the various aspects of the disclosure may, separately, comprise a complete embodiment.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first", "second", "third" and the like in the description, in the claims and in the figures above are used to distinguish between similar objects or entities and do not necessarily describe a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can, for example, be practiced in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as a display device as disclosed in this application) that can typically be controlled wirelessly over a relatively short distance. Typically, the electronic device is connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in a general remote control device with a touch screen user interface.
The term "gesture" as used herein refers to a user behavior by which a user expresses an intended idea, action, purpose, and/or result through a change in hand shape or movement of a hand, etc.
A schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment is exemplarily shown in fig. 1. As shown in fig. 1, a user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
In some embodiments, the control apparatus 100 may be a remote control. Communication between the remote control and the display device includes infrared protocol communication, Bluetooth protocol communication and other short-range communication modes, and the display device 200 is controlled wirelessly or by other wired means. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power key and the like on the remote control to operate the functions of the display device 200.
In some embodiments, mobile terminals, tablet computers, notebook computers, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 may install a software application with the display device 200, implement connection communication through a network communication protocol, and achieve the purpose of one-to-one control operation and data communication. Such as: it is possible to implement a control command protocol established between the mobile terminal 300 and the display device 200, synchronize a remote control keyboard to the mobile terminal 300, and implement a function of controlling the display device 200 by controlling a user interface on the mobile terminal 300. The audio/video content displayed on the mobile terminal 300 can also be transmitted to the display device 200, so as to realize the synchronous display function.
As also shown in fig. 1, the display device 200 is also in data communication with the server 400 via a variety of communication means. The display device 200 may be permitted to make communication connections via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. By way of example, display device 200 receives software program updates, or accesses a remotely stored digital media library by sending and receiving information, as well as Electronic Program Guide (EPG) interactions. The server 400 may be a cluster, or may be multiple clusters, and may include one or more types of servers. Other web service content such as video on demand and advertising services are provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide a smart network television function of a computer support function, including, but not limited to, a network television, a smart television, an Internet Protocol Television (IPTV), etc., in addition to the broadcast receiving television function.
A hardware configuration block diagram of the display device 200 according to an exemplary embodiment is illustrated in fig. 2.
In some embodiments, at least one of the controller 250, the modem 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the first processor output, and to display video content and images and components of the menu manipulation interface.
In some embodiments, display 275 includes a display screen assembly for presenting pictures, and a drive assembly for driving the display of images.
In some embodiments, the video content is displayed from broadcast television content, or alternatively, from various broadcast signals that may be received via a wired or wireless communication protocol. Alternatively, various image contents received from the network server side transmitted from the network communication protocol may be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of display 275, a drive assembly for driving the display is also included.
In some embodiments, display 275 is a projection display and may further include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver.
In some embodiments, the display device 200 may establish control signal and data signal transmission and reception between the communicator 220 and the external control device 100 or the content providing device.
In some embodiments, the user interface 265 may be used to receive infrared control signals from the control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component used by the display device 200 to collect signals from the external environment or to interact with the external environment.
In some embodiments, the detector 230 includes an optical receiver, i.e., a sensor for capturing the intensity of ambient light, so that display parameters can be changed adaptively according to the collected ambient light, and so on.
In some embodiments, the detector 230 may further include an image collector, such as a camera, a video camera, etc., which may be used to collect external environmental scenes, collect attributes of a user or interact with a user, adaptively change display parameters, and recognize a user gesture to realize an interaction function with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like for sensing the ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image. For example, when the ambient temperature is relatively high, the display device 200 may be adjusted to display images with a colder color temperature; when the ambient temperature is relatively low, it may be adjusted to display images with a warmer color tone.
In some embodiments, the detector 230 may also include a sound collector such as a microphone, which may be used to receive the user's voice. Illustratively, it receives a voice signal containing a control instruction for the user to control the display device 200, or collects environmental sounds used to recognize the type of environmental scene, so that the display device 200 can adapt to the environmental noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to enable data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, command instruction data, or the like.
In some embodiments, external device interface 240 may include, but is not limited to, the following: any one or more interfaces of a high definition multimedia interface HDMI interface, an analog or data high definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like can be used. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the modem 210 is configured to receive the broadcast television signal by a wired or wireless receiving manner, and may perform modulation and demodulation processes such as amplification, mixing, and resonance, and demodulate the audio/video signal from a plurality of wireless or wired broadcast television signals, where the audio/video signal may include a television audio/video signal carried in a television channel frequency selected by a user, and an EPG data signal.
In some embodiments, the frequency point demodulated by the modem 210 is controlled by the controller 250, and the controller 250 may send a control signal according to the user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to a broadcasting system of the television signal. Or may be differentiated into digital modulation signals, analog modulation signals, etc., depending on the type of modulation. Or it may be classified into digital signals, analog signals, etc. according to the kind of signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like. In this way, the set-top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command to select to display a UI object on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation of connecting to a hyperlink page, a document, an image, or the like, or executing an operation of a program corresponding to the icon. The user command for selecting the UI object may be an input command through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice uttered by the user.
As shown in fig. 2, the controller 250 includes at least one of a random access memory 251 (RAM), a read-only memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processing unit (GPU)), a central processing unit 254 (CPU), a communication interface, and a communication bus 256 that connects the respective components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other on-the-fly programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, ROM 252 is used to store a Basic Input Output System (BIOS), which completes the power-on self-test of the system, the initialization of each functional module in the system and the basic input/output of the system, and boots the operating system.
In some embodiments, upon receiving a power-on signal, the display device 200 powers up and the CPU runs the system boot instructions in ROM 252, copying the operating system stored in memory into RAM 251 in order to start or run the operating system. After the operating system is started, the CPU copies the various applications in memory into RAM 251 so that the applications can then be started or run.
In some embodiments, processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some example embodiments, the processor 254 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. A main processor for performing some operations of the display apparatus 200 in the pre-power-up mode and/or displaying a picture in the normal mode. One or more sub-processors for one operation in a standby mode or the like.
In some embodiments, the graphics processor 253 is configured to generate various graphical objects, such as: icons, operation menus, user input instruction display graphics, and the like. The device comprises an arithmetic unit, wherein the arithmetic unit is used for receiving various interaction instructions input by a user to carry out operation and displaying various objects according to display attributes. And a renderer for rendering the various objects obtained by the arithmetic unit, wherein the rendered objects are used for being displayed on a display.
In some embodiments, the video processor 270 is configured to receive an external video signal and, according to the standard codec protocol of the input signal, perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion and image composition, to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used to demultiplex the input audio/video data stream, e.g. an input MPEG-2 stream, into video signals, audio signals and the like.
The video decoding module is used to process the demultiplexed video signal, including decoding, scaling and the like.
The image synthesis module, e.g. an image synthesizer, superimposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image to generate an image signal for display.
The frame rate conversion module is configured to convert the input video frame rate, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, usually by means of frame interpolation.
The display formatting module is used to convert the frame-rate-converted video signal into a video output signal conforming to the display format, for example an RGB data signal.
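Taken together, these modules form a processing chain from the input stream to the displayable signal. The toy sketch below only illustrates that ordering; the Frame type and stage names are hypothetical placeholders, not the video processor 270's real interfaces.

    import java.util.List;

    public final class VideoPipelineSketch {

        /** Placeholder for whatever data a real stage would carry. */
        public static final class Frame {
            public final String description;
            public Frame(String description) { this.description = description; }
        }

        public interface Stage {
            Frame process(Frame input);
        }

        /** Runs the stages in order, mirroring the module sequence described above. */
        public static Frame run(Frame input, List<Stage> stages) {
            Frame current = input;
            for (Stage stage : stages) {
                current = stage.process(current);
            }
            return current;
        }

        public static void main(String[] args) {
            // Demultiplex -> decode/scale -> compose with GUI -> frame rate conversion -> display format.
            List<Stage> stages = List.of(
                    f -> new Frame(f.description + " -> demultiplexed"),
                    f -> new Frame(f.description + " -> decoded/scaled"),
                    f -> new Frame(f.description + " -> composed with GUI"),
                    f -> new Frame(f.description + " -> 60Hz to 120Hz"),
                    f -> new Frame(f.description + " -> RGB output"));
            System.out.println(run(new Frame("MPEG-2 stream"), stages).description);
        }
    }
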
In some embodiments, the graphics processor 253 may be integrated with the video processor or configured separately. When integrated, it can process the graphics signals output to the display; when configured separately, the two can perform different functions, for example a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 may include one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output receives, under the control of the controller 250, the sound signal output by the audio processor 280. It includes the speaker 286, and an external sound output terminal that can output to a sound-producing device of an external device, other than the speaker carried by the display device 200 itself, such as an external sound interface or an earphone interface. It may also include a short-range communication module in the communication interface, for example a Bluetooth module for outputting sound to a Bluetooth speaker.
The power supply 290 supplies power input from an external power source to the display device 200 under the control of the controller 250. The power supply 290 may include a power supply circuit built into the display device 200, or it may be a power supply installed outside the display device 200, with a power interface provided in the display device 200 for connecting the external power supply.
The user interface 265 is used to receive an input signal from a user and then transmit the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
In some embodiments, a user inputs a user command through the control apparatus 100 or the mobile terminal 300, the user input interface is then responsive to the user input through the controller 250, and the display device 200 is then responsive to the user input.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a base module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is used for collecting various information from various sensors or user input interfaces and carrying out digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is used for controlling the display to display the image content, and can be used for playing the multimedia image content, the UI interface and other information. And the communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing data communication between the browsing servers. And the service module is used for providing various services and various application programs. Meanwhile, the memory 260 also stores received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
Fig. 3 exemplarily shows a block diagram of a configuration of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 3, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply.
The control device 100 is configured to control the display device 200, and may receive an input operation instruction of a user, and convert the operation instruction into an instruction recognizable and responsive to the display device 200, to function as an interaction between the user and the display device 200. Such as: the user responds to the channel addition and subtraction operation by operating the channel addition and subtraction key on the control apparatus 100, and the display apparatus 200.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications for controlling the display apparatus 200 according to user's needs.
In some embodiments, as shown in fig. 1, a mobile terminal 300 or other intelligent electronic device may function similarly to the control device 100 after installing an application that manipulates the display device 200. Such as: the user may implement the functions of controlling the physical keys of the device 100 by installing various function keys or virtual buttons of a graphical user interface available on the mobile terminal 300 or other intelligent electronic device.
The controller 110 includes a processor 112 and RAM 113 and ROM 114, a communication interface 130, and a communication bus. The controller is used to control the operation and operation of the control device 100, as well as the communication collaboration among the internal components and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display device 200. The communication interface 130 may include at least one of a WiFi chip 131, a bluetooth module 132, an NFC module 133, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touchpad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can implement a user instruction input function through actions such as voice, touch, gesture, press, and the like, and the input interface converts a received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the corresponding instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, an infrared interface or a radio frequency interface may be used. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmission terminal.
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an input-output interface 140. The control device 100 is provided with a communication interface 130 such as: the WiFi, bluetooth, NFC, etc. modules may send the user input instruction to the display device 200 through a WiFi protocol, or a bluetooth protocol, or an NFC protocol code.
A memory 190 is used to store various operation programs, data and applications for driving and controlling the control device 100 under the control of the controller. The memory 190 may store various control signal instructions input by the user.
A power supply 180 provides operating power support for the various elements of the control device 100 under the control of the controller, and may be a battery and associated control circuitry.
In some embodiments, the system may include a kernel, a command parser (shell), a file system, and applications. The kernel, shell and file system together form the basic operating system architecture that allows users to manage files, run programs and use the system. After power-up, the kernel is started, the kernel space is activated, the hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals and inter-process communication (IPC) are run and maintained. After the kernel is started, the shell and user applications are then loaded. An application is compiled into machine code after being started, forming a process.
Referring to FIG. 4, in some embodiments, the system is divided into four layers, from top to bottom: an application layer (referred to as the "application layer"), an application framework layer (referred to as the "framework layer"), an Android runtime and system library layer (referred to as the "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, a camera application, and the like; and may be an application program developed by a third party developer, such as a hi-see program, a K-song program, a magic mirror program, etc. In particular implementations, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which are not limited in this embodiment of the present application.
The framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application can access the resources in the system and obtain the services of the system during execution.
As shown in fig. 4, the application framework layer in the embodiment of the present application includes a Manager, a Content Provider, and the like, where the Manager includes at least one of the following modules: an Activity Manager, which interacts with all activities running in the system; a Location Manager, which provides system services or applications with access to the system location service; a Package Manager, which retrieves various information about the application packages currently installed on the device; a Notification Manager, which controls the display and clearing of notification messages; and a Window Manager, which manages the icons, windows, toolbars, wallpaper and desktop components on the user interface.
In some embodiments, the activity manager is to: the lifecycle of each application program is managed, as well as the usual navigation rollback functions, such as controlling the exit of the application program (including switching the currently displayed user interface in the display window to the system desktop), opening, backing (including switching the currently displayed user interface in the display window to the previous user interface of the currently displayed user interface), etc.
In some embodiments, the window manager is configured to manage all window procedures, such as obtaining the display screen size, determining whether there is a status bar, locking the screen, capturing screenshots, controlling display window changes (e.g., scaling the display window down, dithering or distorting it, etc.), and so on.
In some embodiments, the system runtime layer provides support for the upper layer, the framework layer, and when the framework layer is in use, the android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer contains at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and the like.
In some embodiments, the kernel layer further includes a power driver module for power management.
In some embodiments, the software programs and/or modules corresponding to the software architecture in fig. 4 are stored in the first memory or the second memory shown in fig. 2 or fig. 3.
In some embodiments, taking a magic mirror application (photographing application) as an example, when the remote control receiving device receives an input operation of the remote control, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the input operation into a raw input event (including the value of the input operation, the timestamp of the input operation, etc.), and the raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer, identifies the control corresponding to the event according to the current position of the focus, and treats the input operation as a confirmation operation. If the control corresponding to the confirmation operation is the control of the magic mirror application icon, the magic mirror application calls the interface of the application framework layer to start the magic mirror application, which in turn starts the camera driver by calling the kernel layer, so that a still image or video is captured through the camera.
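The toy dispatcher below is only a simplified illustration of that idea; it is not the Android input pipeline, and all names and the key value 23 are assumptions. A raw event carrying a key value and a timestamp is resolved against whichever control currently holds the focus and treated as a confirmation operation.

    import java.util.HashMap;
    import java.util.Map;

    public final class ToyInputDispatcher {

        /** Raw input event as described above: the key value plus a timestamp. */
        public static final class RawInputEvent {
            public final int keyValue;
            public final long timestampMs;
            public RawInputEvent(int keyValue, long timestampMs) {
                this.keyValue = keyValue;
                this.timestampMs = timestampMs;
            }
        }

        /** A control that can be confirmed, e.g. an application icon. */
        public interface Control {
            void onConfirm();
        }

        public static final int KEY_CONFIRM = 23;  // assumed key value for the "OK" key

        private final Map<String, Control> controlsByFocusId = new HashMap<>();
        private String currentFocusId;

        public void register(String focusId, Control control) {
            controlsByFocusId.put(focusId, control);
        }

        public void setFocus(String focusId) {
            currentFocusId = focusId;
        }

        /** Resolves the raw event against the currently focused control. */
        public void dispatch(RawInputEvent event) {
            Control focused = controlsByFocusId.get(currentFocusId);
            if (focused != null && event.keyValue == KEY_CONFIRM) {
                focused.onConfirm();  // e.g. launch the application behind the icon
            }
        }

        public static void main(String[] args) {
            ToyInputDispatcher dispatcher = new ToyInputDispatcher();
            dispatcher.register("camera_app_icon", () -> System.out.println("start camera application"));
            dispatcher.setFocus("camera_app_icon");
            dispatcher.dispatch(new RawInputEvent(KEY_CONFIRM, System.currentTimeMillis()));
        }
    }
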
In some embodiments, for a display device with a touch function, taking a split screen operation as an example, the display device receives an input operation (such as a split screen operation) acted on a display screen by a user, and the kernel layer may generate a corresponding input event according to the input operation and report the event to the application framework layer. The window mode (e.g., multi-window mode) and window position and size corresponding to the input operation are set by the activity manager of the application framework layer. And window management of the application framework layer draws a window according to the setting of the activity manager, then the drawn window data is sent to a display driver of the kernel layer, and the display driver displays application interfaces corresponding to the window data in different display areas of the display screen.
In some embodiments, as shown in fig. 5, the application layer contains at least one icon control that the application can display in the display, such as: a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using inputs from cable television, radio broadcast, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, the video on demand application may provide video from different storage sources. Unlike live television applications, video-on-demand provides video displays from some storage sources. For example, video-on-demand may come from the server side of cloud storage, from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various multimedia content playing applications. For example, a media center may be a different service than live television or video on demand, and a user may access various images or audio through a media center application.
In some embodiments, an application center may be provided to store various applications. The application may be a game, an application, or some other application associated with a computer system or other device but which may be run in a smart television. The application center may obtain these applications from different sources, store them in local storage, and then be run on the display device 200.
In some embodiments, the display device 200 may also include a touch component 276 through which the user can interact with the display device 200 by touch. The touch component 276 may be a layer of touch-sensing elements added to the display screen of the display 275, and the specific touch-sensing principle can be chosen according to the actual interaction requirements. For example, a capacitive touch screen, a resistive touch screen, an infrared touch screen, a surface acoustic wave touch screen, or the like may be employed according to the actual application scene of the display device 200.
The touch assembly 276 may include a sensing unit disposed on the display 275 and a signal processing unit disposed in the display device. The sensing unit can be used for sensing touch operation of a user and converting the touch operation into an electric signal; the signal processing unit may process the generated electrical signal, including feature extraction, noise reduction, amplification, etc.
Taking a capacitive touch screen as an example, the sensing unit may be a layer of transparent special metal conductive material attached to the surface of the display screen glass of the display 275. When the finger or palm of the user touches the conductive substance layer, the capacitance value of the touch point is changed, so that a touch signal is generated. The signal processing unit may receive the touch signal and process the touch signal, and convert the touch signal into a digital command readable by the controller 250.
In general, interactions performed by a user on a touch screen may include clicking, long pressing, sliding, and the like. In order to support more interaction modes, the touch screen can also support multi-touch. The more points the touch screen supports touch, the more corresponding interactive actions can be implemented. For example, multi-finger clicking, multi-finger long pressing, multi-finger sliding, and the like may be implemented.
For different interaction actions, characteristics of the generated touch signal, such as touch point positions, the number of touch points, and the touch area, can be acquired. The type of the touch signal is judged according to the signal characteristics generated at the touch points, so that a touch instruction is generated. The position touched by the user, namely the position where the interaction operation is executed, can be detected from the touch point position; the number of fingers used in the touch interaction can be determined from the number of touch points; whether the user performs a click operation or a long-press operation can be determined by judging the duration of the touch signal; and a sliding operation performed by the user can be determined from the change in position of the touch points.
For example, after detecting a touch signal, the touch component 276 may extract its characteristics. If the number of touch points in the touch signal is equal to 1, the duration of the touch signal is less than 0.5 s, and the position of the touch point is unchanged, it is determined that the interaction action input by the current user is a single-click action, and a touch instruction corresponding to the single-click action may be generated accordingly.
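The feature-based classification described above can be illustrated with a short sketch. The fragment below is purely illustrative and is expressed against the Android MotionEvent interface referenced later in this application; the 0.5 s limit follows the example above, while the class name and the position tolerance are assumptions.

import android.view.MotionEvent;

public class TapClassifier {
    private static final long MAX_TAP_DURATION_MS = 500; // "< 0.5 s" from the example above
    private static final float POSITION_SLOP_PX = 20f;   // assumed tolerance for "position unchanged"

    private float downX, downY;

    /** Returns true when a finished gesture matches the single-click features. */
    public boolean isSingleClick(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();
                downY = event.getY();
                return false;
            case MotionEvent.ACTION_UP:
                boolean onePointer = event.getPointerCount() == 1;
                boolean shortPress = (event.getEventTime() - event.getDownTime()) < MAX_TAP_DURATION_MS;
                boolean stationary = Math.abs(event.getX() - downX) < POSITION_SLOP_PX
                        && Math.abs(event.getY() - downY) < POSITION_SLOP_PX;
                return onePointer && shortPress && stationary;
            default:
                return false;
        }
    }
}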
The touch assembly 276 may be coupled to the controller 250 to send the generated touch instructions to the controller 250. Since the interaction process is a continuous process, the touch assembly 276 continuously sends touch commands to the controller 250 to form a data stream. To distinguish between different touch commands, the touch component 276 may generate touch commands according to the law of one or more touch actions so that the controller 250 may receive complete, identifiable touch commands.
According to the touch actions input by the user, the controller 250 may execute different control programs according to the interaction mode of the operating system, and control the display 275 to display different user interfaces. In the embodiment of the present application, the user interface generally refers to a specific interface displayed on the display device 200, and may include a control interface, a play interface, and the like. A user may implement interactive control with display device 200 through control apparatus 100 and/or touch assembly 276 via operating system interaction rules built into display device 200 to enable different user interfaces to be presented on display 275. Different user interfaces may have different UI layouts, i.e. include different control compositions.
The control refers to specific display content for realizing user interaction control. For example: buttons, progress bars, scroll bars, text boxes, radio boxes, check boxes, links, and the like. The control can be displayed in different shapes, patterns and sizes according to the UI style characteristics of the operating system, and can also form functional areas at different positions in the interface according to the corresponding functions. For example, as shown in fig. 6, a status bar at the top, a menu switching area at the middle upper part, and content areas at the middle and lower parts may be included in the control homepage. A plurality of commonly used status controls or icons may be included in the status bar, such as a mode switch button, a search box, a VIP purchase/login status button, a message bar, and the like. A plurality of controls for indicating different menus, such as a my menu, a channel menu, etc., may be included in the menu switching area. A plurality of media asset play link controls may be included in the content zone.
The user may perform a corresponding control action through the control. For example, the user may jump from the currently displayed control home interface to the corresponding media playback interface by clicking on any of the media playback link controls. For the display device 200 in the embodiment of the present application, because the touch component 276 is built in, interaction can be achieved through touch actions.
In practical applications, a user may input various touch actions through the touch component 276, and the touch component 276 further forms a touch instruction by detecting the input touch actions. The touch command formed is different according to the input touch actions, and the touch command can be of various types. For example, a single click touch command, a multiple click touch command, a long press touch command, a slide touch command, and the like. The multi-finger touch instruction, such as a multi-finger click instruction, a multi-finger sliding instruction, a multi-finger long press instruction, and the like, can be further generated according to the corresponding touch point number in the touch action.
The touch command generated by the touch component 276 can be acquired by the controller 250 to perform the corresponding interactive control action according to the rules set in the operating system. Different touch control instructions can correspond to different control actions. For example, a single click touch instruction on a media playback link control may be used to open the playback link to obtain media data; the long-press touch instruction on one function icon can be used for representing that the editing state is entered, so that the position of the function icon can be adjusted by sliding the touch instruction in the editing state.
It should be noted that, in practical application, the same touch instruction may have different control actions in different controls. For example, a click touch instruction input on the media link control may implement a control action of opening a link, that is, jumping to a media detail page or a playing interface, and a click instruction input on the volume control may implement setting the current volume of the display device 200 to a volume corresponding to the click position.
Touch actions are typically input by a user with a finger on the display 275. Accordingly, the area in which the user can conveniently perform touch operations is limited by factors such as the user's arm length and height. For example, the arm length of an average adult user is about 650 mm, so for single-arm operation the area convenient for touch is a region of no more than 650×650 mm on the screen, whereas for double-arm operation it is a region of no more than 1600×650 mm on the screen.
However, as the screen size of the display 275 in the display device 200 gradually increases, it is often difficult for the user to perform touch control over an area covering the entire display area. For example, the screen size of a 65-inch smart television is 1459×841 mm; if, owing to height and standing position, the user operates within a 650×650 mm region near the upper left corner, the lower right corner region cannot be reached, so touch input cannot be conveniently completed there.
In order to facilitate user touch input, some embodiments of the present application provide a touch-assisted interaction method, which may be applied to a display device 200, where the display device 200 includes at least a display 275, a touch component 276, and a controller 250, and the display 275 and the touch component 276 are connected to the controller 250. Wherein, the touch component 276 may communicate the detected touch action to the controller 250 to cause the controller 250 to control the display content in the display 275.
In the process of controlling the display 275 to display specific contents, the touch assistance interaction method may be implemented by configuring a control program in the controller 250 and executing the control program by the controller 250. As shown in fig. 7, the touch assistance interaction method includes the following steps:
s1: and acquiring a touch instruction input by a user and used for executing touch assistance.
The user may input a touch action on the touch component 276, i.e., the touch screen. As the touch action is input, the touch component 276 generates a corresponding electrical signal by which the input touch action is detected, and then sends the generated electrical signal to the controller 250 or a built-in microprocessor for judgment. The controller 250 or the built-in microprocessor can determine the specific touch action by analyzing the values and change patterns of the electrical signal, so as to form a touch instruction and send it to the controller 250.
By analyzing the electrical signal, the parameters of the touch action can be determined, including the touch point position, the number of touch points, the duration of the touch operation, and so on. The position where the user touches can be determined from the touch point position, and whether the user inputs a sliding instruction can be determined from changes in the touch point position. The number of touch points indicates whether the user input is made with one finger or several fingers. The touch duration can distinguish touch actions such as clicking, long pressing, and multiple clicking. Obviously, the above parameters can be analyzed comprehensively to determine more complex touch instructions.
The controller 250 may receive various touch instructions, such as clicks, long presses, swipes, etc., input by a user through the touch assembly 276. In order to implement touch assistance, a preset judgment rule may be built in the operating system, and a touch instruction may be set as a touch instruction for implementing touch assistance. Therefore, after receiving the touch command, the controller 250 can determine the touch action corresponding to the touch command, so as to determine whether the touch command input by the user is used for executing the touch assistance interaction.
Touch instructions are used with different frequencies depending on usage habits and operation difficulty, and some touch instructions are rarely used. Therefore, in practical applications, the touch instruction for executing touch assistance should be different from the touch instructions used for other purposes, such as those for basic functions like opening and adjusting, and can be set to a rarely used touch instruction. For example, the touch instruction for performing touch assistance may be a multi-click instruction, a long-press instruction, a multi-finger touch instruction, or the like.
S2: and responding to the touch instruction, and generating a mapping window according to the current user interface.
After acquiring the touch instruction for performing touch assistance, the controller 250 may generate a mapping window according to the current user interface. The mapping window may present the content of the area of the current user interface that is inconvenient for the user to operate, so that the user can operate through the mapping window without moving to another position. Accordingly, the mapping window includes all or part of the controls in the current user interface, and the controls in the mapping window have an association relationship with the controls in the current user interface.
To generate the mapping window, after acquiring the touch command, the controller 250 may acquire the display content of the current user interface, thereby generating a mapping window according to the acquired display content. The content in the mapping window may include all of the content in the current user interface. For example, if the current user interface is a control home page, the content of the entire user interface may be included in the mapping window, i.e., the screen content in the mapping window is the same as the user interface content, and only the window size is smaller than the size of the current user interface.
In some embodiments, in order to fully display the content of the current user interface, the aspect ratio of the mapping window may be the same as that of the current user interface, for example both 16:9.
The content in the mapping window may also include only the portion of the interface located in the region where it is inconvenient for the user to perform interactive operations. For example, when the current user interface is the control homepage and it is determined that the user's current operation position is close to the left area of the screen, the content of the right area, where touch interaction is inconvenient for the user, may be displayed in the mapping window. Obviously, the mapping window only needs to fully display the area that is inconvenient for the user to operate, so it does not need to keep the same aspect ratio as the current user interface.
It should be noted that, in order to enable the user to perform the touch interaction operation in the mapping window without moving the position, the size of the mapping window should be controlled within a range of the area convenient for the user to operate. For example, the mapping windows are each less than 650mm wide and tall so that a user can complete a one-handed operation within the mapping window.
S3: and displaying the mapping window in the association area of the touch instruction.
After generating the mapping window, the controller 250 may control the display 275 to display the mapping window in the associated area of the touch command. The associated area of the touch command may be an area related to an input position of the touch command. For example, the current display screen range may be divided into a plurality of preset partitions according to the display content, and the associated area is the same preset partition where the touch instruction is located. Obviously, the associated area is also in an area where the user is convenient to perform touch operation.
In order to enable a touch interactive operation in the mapping window, the mapping window may be displayed at the topmost layer of the screen. For example, as shown in fig. 8, the mapping window may be displayed as a floating window in the screen, so as to avoid the original content of the screen from blocking the mapping window. When the mapping window maintains the state of the floating window, the display position can be adjusted through further touch interaction operation. For example, a sliding touch command may be input through the top of the mapping window to drag the mapping window to other positions in the display screen.
As can be seen from the above technical solution, in the touch assistance interaction method provided by the present application, when the user needs to operate in a distant region, a touch instruction for touch assistance interaction is input through a specific touch action. After acquiring the touch instruction, the display device 200 generates a mapping window according to the current user interface and displays it in an associated region convenient for the user to operate, so that the user can complete the touch action through the mapping window. This better matches the user's operating habits and enables touch operation over the entire user interface without requiring the user to move.
Based on the above embodiment, the user may display the mapping window by inputting a specific touch instruction at any time of the touch operation. Accordingly, the controller 250 may determine whether it is an instruction for performing touch assistance interaction for each touch instruction received. To this end, in some embodiments, as shown in fig. 9, the step of acquiring the touch instruction for executing the touch assistance input by the user further includes:
s101: monitoring a touch action input by a user;
s102: and if the touch action is the same as the preset touch action, generating a touch instruction for executing touch assistance.
With the touch action input by the user, a touch event can be generated in the operating system. The controller 250 may monitor the generated touch events in real time to determine the touch actions entered by the user. The touch actions include a long press action, a multi-click action, a sliding action, a multi-finger action, a click action inputted at a preset position, and the like, and may also include other types of touch actions according to the type of the touch component 276. By detecting, if it is determined that the touch action is the same as the preset touch action, a touch instruction for performing touch assistance is generated to trigger the controller 250 to perform a program related to generating the mapping window.
In general, a touch event may include touch information such as a press point and a lift point, together with related information such as the press time, press position, lift time, and lift position. The touch information contained in a touch event is obtained from the electrical signals generated by the touch component 276. For example, a touch action detected by the touch component 276 causes a voltage change at the touch position, forming an electrical signal. After the electrical signal is sent to the controller 250, the controller 250 can identify the voltage change, determine its position and time, and form a touch event. A formed touch event may be represented by a specific event code, for example "MotionEvent.ACTION_DOWN" for the touch event corresponding to a press point and "MotionEvent.ACTION_UP" for the touch event corresponding to a lift point.
The preset touch actions can be set uniformly according to the interaction strategy in the operating system, that is, different triggering modes can be set in different display devices 200 to display the mapping window. The following describes how to determine whether the touch action is the same as the preset touch action by several specific examples:
For example, the preset touch action may be a long-press touch action, that is, triggered by a long-press manner.
The controller 250 may listen for touch events: when the user presses down, "MotionEvent.ACTION_DOWN" is monitored and the press position and time are recorded; when the user lifts the finger, "MotionEvent.ACTION_UP" is monitored and the lift position and time are recorded. By setting a long-press time threshold and a distance threshold, if the distance between the two recorded positions is smaller than the distance threshold and the time interval between them exceeds the time threshold, the touch action input by the user is determined to be a long-press action, and therefore the input action is the same as the preset touch action, i.e., the input touch instruction is a touch instruction for executing touch assistance.
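A minimal sketch of this long-press trigger, assuming Android MotionEvent handling; the class name is hypothetical, the time threshold reuses the system long-press timeout, and the distance threshold is an assumed value.

import android.view.MotionEvent;
import android.view.ViewConfiguration;

public class LongPressTrigger {
    private static final long LONG_PRESS_MS = ViewConfiguration.getLongPressTimeout(); // time threshold
    private static final float DISTANCE_THRESHOLD_PX = 30f; // assumed distance threshold

    private float downX, downY;
    private long downTime;

    /** Returns true when the lift event completes a long press at (nearly) the press position. */
    public boolean isTouchAssistTrigger(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();            // record the press position
                downY = event.getY();
                downTime = event.getEventTime(); // record the press time
                return false;
            case MotionEvent.ACTION_UP:
                double moved = Math.hypot(event.getX() - downX, event.getY() - downY);
                long held = event.getEventTime() - downTime;
                return moved < DISTANCE_THRESHOLD_PX && held >= LONG_PRESS_MS;
            default:
                return false;
        }
    }
}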
For example, the preset touch action may be a double-click touch action, that is, triggered by a double-click manner.
The controller 250 may record the first and second click events through a control program and set a time threshold for consecutive clicks. By listening for touch events, "MotionEvent.ACTION_DOWN" is monitored when the user presses for the first time, marking the start of the first click event; when the user lifts the finger, the "MotionEvent.ACTION_UP" event is monitored, marking the end of the first click event, and the end time is recorded. The second click event is then monitored: "MotionEvent.ACTION_DOWN" starts the second click and its start time is recorded, and the second "MotionEvent.ACTION_UP" ends the second click event. The time difference between the two click events is then calculated, and if it is smaller than the set threshold, it is determined that the user has input a double-click touch action, so that the input action is the same as the preset touch action, i.e., the input touch instruction is a touch instruction for executing touch assistance.
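A minimal sketch of the double-click trigger under the same assumptions; here the interval is measured between two successive lift events and compared against the system double-tap timeout, which is one possible choice of threshold rather than the patented one.

import android.view.MotionEvent;
import android.view.ViewConfiguration;

public class DoubleTapTrigger {
    private static final long DOUBLE_TAP_MS = ViewConfiguration.getDoubleTapTimeout(); // click interval threshold

    private long lastLiftTime = -1;

    /** Returns true when two lift events arrive within the double-tap interval. */
    public boolean isTouchAssistTrigger(MotionEvent event) {
        if (event.getActionMasked() != MotionEvent.ACTION_UP) {
            return false;
        }
        long now = event.getEventTime();
        boolean doubleClick = lastLiftTime >= 0 && (now - lastLiftTime) <= DOUBLE_TAP_MS;
        lastLiftTime = now;
        return doubleClick;
    }
}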
In example three, the preset touch action may be a multi-finger touch action, that is, triggered by a multi-finger manner.
As shown in fig. 10, the detection routine in the controller 250 may be triggered after a "MotionEvent.ACTION_POINTER_DOWN" event is monitored. When this event fires, the number of touching fingers can be obtained through the "getPointerCount()" method. The required number is defined according to the configuration of the touch component 276; if five fingers are defined, then when "getPointerCount()" returns 5 it is determined that the user has input a multi-finger touch action, so that the input action is the same as the preset touch action, and the input touch instruction is determined to be a touch instruction for executing touch assistance.
For multi-finger touch actions, the corresponding touch condition on each touch point can be further judged, so that more complex gesture actions are determined. Such as pinch gestures in which two fingers slide inward simultaneously, spread gestures in which two fingers slide outward simultaneously, rotation gestures in which two fingers slide in a circular manner in the same direction, etc.
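A minimal sketch of the basic five-finger trigger described above, assuming Android MotionEvent handling; recognition of the more complex pinch, spread, and rotation gestures is omitted, and the class name is hypothetical.

import android.view.MotionEvent;

public class MultiFingerTrigger {
    private static final int REQUIRED_FINGERS = 5; // number defined by the touch component configuration

    public boolean isTouchAssistTrigger(MotionEvent event) {
        // ACTION_POINTER_DOWN is reported each time an additional finger touches down.
        return event.getActionMasked() == MotionEvent.ACTION_POINTER_DOWN
                && event.getPointerCount() >= REQUIRED_FINGERS;
    }
}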
In the fourth example, the preset touch action may be a sliding touch action, that is, triggered by a sliding manner.
As shown in fig. 11, the sliding trigger may be given a specific input-action restriction to distinguish it from sliding touches used for other functions, for example triggering the mapping window only by sliding at a screen boundary position. The controller 250 again listens for touch events, records the coordinate points in the "MotionEvent.ACTION_DOWN" and "MotionEvent.ACTION_UP" events of, for example, a single-finger slide, determines the current sliding direction (left, right, up, or down) according to the rules, and, when the recorded coordinate points lie within the area near the screen boundary, determines that the user input is the same as the preset touch action, i.e., that the input touch instruction is a touch instruction for executing touch assistance.
The sliding touch action may also be completed by inputting a specific path with multiple fingers. For example, as shown in fig. 12 and fig. 13, the input may be completed by a two-finger pinch or spread gesture; alternatively, as shown in fig. 14, the input may be completed by a two-finger rotating slide; or, as shown in fig. 15, the input may be completed by sliding multiple fingers in the same direction.
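A minimal sketch of the single-finger edge-swipe trigger from example four, assuming Android MotionEvent handling; the edge margin and minimum travel are assumed values, and the class name is hypothetical.

import android.view.MotionEvent;

public class EdgeSwipeTrigger {
    private static final float EDGE_MARGIN_PX = 60f; // assumed "area near the screen boundary"
    private static final float MIN_SWIPE_PX = 120f;  // assumed minimum travel to count as a slide

    private float downX, downY;

    public boolean isTouchAssistTrigger(MotionEvent event, int screenWidthPx) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();
                downY = event.getY();
                return false;
            case MotionEvent.ACTION_UP:
                boolean startedAtEdge = downX <= EDGE_MARGIN_PX
                        || downX >= screenWidthPx - EDGE_MARGIN_PX;
                float dx = event.getX() - downX;
                boolean horizontalSwipe = Math.abs(dx) >= MIN_SWIPE_PX
                        && Math.abs(dx) > Math.abs(event.getY() - downY);
                return startedAtEdge && horizontalSwipe;
            default:
                return false;
        }
    }
}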
In the fifth example, the preset touch action may be a click action input at a preset position, that is, a touch instruction for executing touch assistance is input through a partial control of the UI interface.
As shown in fig. 16, a floating touch menu may be displayed on the user interface, in which a control-assistance shortcut key control may be included; the user may input a click touch action through this shortcut key. Similarly, the controller 250 records a coordinate point and a time in the "MotionEvent.ACTION_DOWN" and "MotionEvent.ACTION_UP" events, respectively, by listening for touch events; if both recorded coordinate points are within the control-assistance key range and the interval between the two recorded times is less than the preset time threshold, it is determined that the user has input a touch instruction for performing touch assistance.
It should be noted that, since the touch component 276 enables the controller 250 to monitor touch events, all of the trigger manners in the above examples can be implemented, and the trigger manner may also be customized, i.e., defined according to the requirements of the display device 200 and of the user; the form is not limited to the above.
According to the technical scheme, the touch event corresponding to the touch operation of the user can be monitored, the touch action of the user is detected according to the touch event, and when the touch action is the same as the preset touch action, the touch instruction for executing touch assistance is determined to be input by the user. Therefore, the user can trigger the generation of the mapping window on any interface through specific touch actions, and the user operation is facilitated.
After acquiring the touch instruction for performing touch assistance, the controller 250 may generate a mapping window according to the current user interface. That is, in actual application, after the controller 250 monitors a touch event, the touch event is parsed, and if it is determined that an action identical to the preset touch action has been input, a mapping window is drawn. The drawn mapping window may take different forms in different application environments, including but not limited to the following.
For example, the mapping window may include the entire content of the current user interface. In some embodiments, the step of generating a mapping window from the current user interface includes:
s211: performing screenshot on the whole picture of the current user interface;
s212: scaling the screenshot results to generate a global thumbnail;
s213: traversing the positions of all the controls in the current user interface;
s214: and establishing an association relation between each control and the pixel points in the global thumbnail according to the position of the control in the current user interface, and generating the mapping window.
To generate the mapping window, the controller 250 may first take a screenshot of the current user interface by executing a screen capture program. The captured image includes all of the display content of the current user interface. For example, if the current user interface is the control homepage, all the graphics in the status bar, menu bar, and content area are included in the captured image. After the screenshot, the controller 250 may scale down the captured image to form a global thumbnail. The reduction ratio may be determined according to the size of the mapping window, i.e., the reduced global thumbnail size is consistent with the preset mapping window size.
After the global thumbnail is obtained, the association relation between the thumbnail and the corresponding control on the user interface can be further established, namely, the association relation between each control and the pixel point in the global thumbnail is established by traversing the positions of the controls in the current user interface, so that a mapping window is generated. After the association relation is established, if the user inputs the touch action at the pixel point position corresponding to the mapping window, the input touch action can be associated to the user interface according to the established association relation, which is equivalent to inputting a touch instruction on the user interface.
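A minimal sketch of steps S211–S214 under the assumption of an Android view hierarchy whose root view can be drawn into a bitmap; the class and field names (MappingWindowBuilder, MappingWindow) are hypothetical, and clickability is used here as a stand-in for "control".

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Rect;
import android.view.View;
import android.view.ViewGroup;
import java.util.HashMap;
import java.util.Map;

public class MappingWindowBuilder {

    /** The scaled screenshot plus an association from each control to its rectangle in the thumbnail. */
    public static class MappingWindow {
        public final Bitmap thumbnail;
        public final Map<View, Rect> controlBounds;
        MappingWindow(Bitmap thumbnail, Map<View, Rect> controlBounds) {
            this.thumbnail = thumbnail;
            this.controlBounds = controlBounds;
        }
    }

    public MappingWindow build(View rootView, int windowWidth, int windowHeight) {
        // S211/S212: capture the whole current interface and scale it to the mapping-window size.
        Bitmap screenshot = Bitmap.createBitmap(rootView.getWidth(), rootView.getHeight(),
                Bitmap.Config.ARGB_8888);
        rootView.draw(new Canvas(screenshot));
        Bitmap thumbnail = Bitmap.createScaledBitmap(screenshot, windowWidth, windowHeight, true);

        // S213/S214: traverse the controls and record where each one lands in the thumbnail.
        float scaleX = (float) windowWidth / rootView.getWidth();
        float scaleY = (float) windowHeight / rootView.getHeight();
        Map<View, Rect> controlBounds = new HashMap<>();
        collectControls(rootView, scaleX, scaleY, controlBounds);
        return new MappingWindow(thumbnail, controlBounds);
    }

    private void collectControls(View view, float sx, float sy, Map<View, Rect> out) {
        if (view.isClickable()) { // clickability used as a stand-in for "control"
            int[] loc = new int[2];
            view.getLocationInWindow(loc); // window coordinates of the control
            out.put(view, new Rect(
                    Math.round(loc[0] * sx), Math.round(loc[1] * sy),
                    Math.round((loc[0] + view.getWidth()) * sx),
                    Math.round((loc[1] + view.getHeight()) * sy)));
        }
        if (view instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) view;
            for (int i = 0; i < group.getChildCount(); i++) {
                collectControls(group.getChildAt(i), sx, sy, out);
            }
        }
    }
}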
According to the technical scheme, the method for generating the mapping window in the embodiment can construct the basic pattern of the mapping window by intercepting the global thumbnail, so that the user can conveniently implement touch operation. And then, through establishing an association relation between the pixel points in the mapping window and the controls in the user interface, the mapping window can support touch operation of a user. Therefore, the method for generating the mapping window can realize multi-window display and support touch assistance operation under the condition of occupying smaller processing capacity of the controller 250.
In some embodiments, if the mapping window includes the entire content of the current user interface, the step of generating the mapping window from the current user interface may include:
S221: acquiring the size of a display area of a mapping window;
s222: and executing the display program which is the same as that of the current user interface according to the display area size, and generating the mapping window.
In the operating system of the display apparatus 200, a display program of each user interface may be built in, including a set display UI frame, display contents, rendering program, and the like. These display programs are called up with the user's interaction to construct a specific display. Thus, in this embodiment, a specific display window may be generated by the display program to form the mapping window.
Obviously, the generated mapping window is small relative to the size of the current user interface so that the user can perform touch interactive operation. Therefore, after acquiring the touch instruction for performing touch assistance, the controller 250 may acquire the display area size of the mapping window first, and then perform the same display procedure as the current user interface according to the display area size, so as to generate a mapping window with the same content as the current user interface but a smaller size.
For example, when the current display screen is a control homepage, after receiving a touch instruction for executing touch assistance, a preset mapping window size is obtained to be 650×365mm. Extracting relevant display programs of the control homepage from the operating system, namely extracting a UI layout frame of the control homepage; and then the corresponding display content is acquired, wherein the display content can be acquired from the server 400, or the acquired content of the display device 200 can be directly extracted. Finally, the controller 250 may execute a rendering program to generate a display screen having a size of 650×365mm.
It can be seen that, in the method for generating a mapping window provided in the foregoing embodiment, a user interface with the same content as the current user interface but a smaller size may be generated on the mapping window by using a display program in the operating system, so that a user may directly perform an interaction on the mapping window, and directly execute a related control program, such as a jump interface, adjust operation data, etc., after performing the interaction.
It should be noted that, since the mapping window and the current user interface are two identical interfaces, when the interaction performed on the mapping window causes a change in the display content, the user interface outside the mapping window also changes. In order to reduce the computational load of the controller 250, the display duration of the mapping window may be set, so that the mapping window may be automatically closed after a certain time is displayed. For example, the preset duration may be set to 20s, that is, after the user invokes the mapping window or within 20s after the last operation in the mapping window, the mapping window is automatically closed if the touch operation is not performed any more.
Because the display range of the mapping window is generally small, when the current user interface contains a lot of content or the control graphics are small, the content displayed in the mapping window tends to be unclear or inconvenient to operate because the graphics become even smaller. Therefore, only part of the user interface may be displayed, as shown in fig. 17, so that the content in the mapping window is clearer and easier for the user to operate. That is, in some embodiments, the mapping window may display only a portion of the content of the current interface, and the step of generating the mapping window from the current user interface further includes:
S231: acquiring a direct operation area corresponding to the touch instruction;
s232: performing screenshot on an indirect operation area of a current user interface;
s233: scaling the screenshot results to generate a local thumbnail;
s234: traversing the positions of the controls in the indirect operation area;
s235: and establishing an association relation between each control and the pixel points in the local thumbnail according to the position of the control in the indirect operation area, and generating the mapping window.
For the display screen of the display device 200, the user interface may be divided into a plurality of partitions in advance, and different partitions may correspond to different sites of the user so that the user performs touch interactive operations at different positions. To perform the operation, specific partitions may be divided according to the screen size of the display device 200. For example, for a large screen display 275 of 65 inches or more, the display screen may be divided into two in the middle of the screen, i.e., including a left side section and a right side section.
Some display devices 200 may also assume different rotation states in use, such as a landscape state and a portrait state. Therefore, when dividing the preset partitions, different partitioning schemes can be provided for the different rotation states. For example, in the portrait state, the display screen may be divided at its middle position into an upper partition and a lower partition.
For this reason, after receiving a touch instruction for executing touch assistance, the direct operation area corresponding to the touch instruction is obtained, where the direct operation area is the preset partition to which the finger lift position in the touch instruction belongs, and the other preset partitions outside the direct operation area serve as indirect operation areas. Obviously, the indirect operation area is far from the user's operating position, so it is inconvenient to perform touch operations there. For example, if the controller 250 determines by listening for touch events that the position of "MotionEvent.ACTION_UP" is within the left partition, the left partition is the direct operation area and the corresponding right area is the indirect operation area.
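A minimal sketch of resolving the direct and indirect operation areas from the finger lift position, assuming the two-partition (left/right) division described above; the class name is hypothetical.

import android.graphics.Rect;
import android.view.MotionEvent;

public class OperationAreaResolver {

    /** Returns the indirect operation area, i.e. the half of the screen the finger did NOT lift in. */
    public Rect resolveIndirectArea(MotionEvent upEvent, int screenWidth, int screenHeight) {
        if (upEvent.getActionMasked() != MotionEvent.ACTION_UP) {
            throw new IllegalArgumentException("expected an ACTION_UP event");
        }
        boolean liftedInLeftHalf = upEvent.getRawX() < screenWidth / 2f;
        return liftedInLeftHalf
                ? new Rect(screenWidth / 2, 0, screenWidth, screenHeight)  // right half is indirect
                : new Rect(0, 0, screenWidth / 2, screenHeight);           // left half is indirect
    }
}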
And similarly, according to the generation method of the mapping window, screenshot is carried out on the display content in the indirect operation area, and the intercepted image is scaled to generate the local thumbnail. And generating a mapping window by traversing the positions of the controls in the indirect operation area and establishing an association relation between each control and the pixel points in the local thumbnail according to the positions of the controls in the indirect operation area.
It can be seen that the generated mapping window can include the content in the indirect operation area, so that the content in the indirect operation area is moved to the mapping window convenient for the user to operate, the displayed content is not excessively reduced, and the user can conveniently execute interactive operation on the mapping window.
In some embodiments, when the mapping window only displays a portion of the content in the current interface, the content displayed in the mapping window may also be customized. I.e. in the step of generating a mapping window from the current user interface, the controller is further configured to:
s241: acquiring the interface type of a current user interface;
s242: extracting a custom mapping area according to the interface type;
s243: executing screenshot on a custom mapping area of a current user interface;
s244: scaling the screenshot results to generate a local thumbnail;
s245: traversing the positions of all the controls in the custom mapping area;
s246: and establishing an association relation between each control and the pixel points in the region thumbnail according to the position of the control in the custom mapping region, and generating the mapping window.
In order to be able to perform a touch operation, the content displayed in the mapping window should include controls, and the content and number of controls contained thereon are also different for different interfaces. For example, the number of controls on the control home page is large, and the content of the controls covers a variety of situations, including icons on status bars, menu entries, media asset links, and the like. The number of controls on the playing interface is small, and buttons related to playing control, such as pause/play key, fast forward/fast backward key, volume control bar, selection button, etc., are generally included. Thus, different custom mapping regions may be set for different types of user interfaces.
The controller 250 may acquire an interface type of the current user interface and determine the custom mapping region according to the user interface type. For custom mapping regions, the settings may be unified automatically by the operating system or server 400, or manually by the user. For example, the custom region of the playback interface may not include a specific playback screen that is displayed only with little interaction, but only include a control button region at the bottom of the playback screen.
After determining the custom map region, controller 250 may generate the map window in the same manner as in the other embodiments described above. The mapping window can be generated by intercepting the region thumbnail of the custom mapping region and traversing the positions of the controls in the custom mapping region to establish the association relationship between each control and the pixel points in the region thumbnail.
For example, as shown in fig. 18, after a user inputs a touch instruction for touch assistance on a playback interface, the controller 250 may detect the type of the current user interface, thereby obtaining that the interface type of the current user interface is the playback interface. For a playback interface, users typically only interoperate with controls such as pause/play keys, fast forward/rewind keys, volume control bars, selection buttons, and the like. Thus, the mapping window may include only the display content of the area where the controls are located. The screenshot is carried out on the region where the controls are located, and the association relationship between the pattern in the mapping window and the current user interface is established by traversing the positions of the controls in the screenshot region.
Because the playing interface generally has an independent touch interaction strategy, for example, pause/play can be completed by double-clicking the display content area, fast forward/fast backward can be completed by sliding, and the like, the display content of the mapping window can also not comprise the controls and only comprise the selection button area, so that the display content in the mapping window is further simplified, and the user can conveniently complete interactive operation.
It can be seen that in the above embodiment, the user-defined mapping area may be determined by detecting the interface type of the current user interface, so that the picture in the user-defined mapping area is displayed in the mapping window, and the touch operation on the mapping window may be associated to the user interface by establishing the association relationship, so as to implement the touch assistance operation. In addition, because the custom area for mapping can be only aimed at the part with the control, the mapping window can be displayed through a smaller area, and the excessive shielding of the mapping window to the user interface can be relieved.
It should be noted that the mapping window may also be generated by associating a touch area with a specific user interface, or by combining it with other local or custom areas, so as to form a mapping window that is more convenient for the user to operate. For example, as shown in fig. 19, the lower region of the current user interface is a thumbnail list that supports sliding and clicking events, while the region above the thumbnail list is a picture detail display area that supports sliding and rotation operations. During display, the detail display area does not restrict where control operations may be performed and can be operated at any position. Thus, two corresponding partitions may be formed in the mapping window, i.e. a touch pad area and a list mapping area.
Similarly, after the mapping window is generated, the user can slide left and right or click in the list mapping area; the mapping window detects the user's operation and issues the corresponding instruction, thereby switching the list. An operation performed in the touch pad area, such as a rightward rotation, is likewise detected by the touch pad, which issues an instruction so that the picture on the foreground full-screen page is rotated to the right once the rotation is received.
After the mapping window is generated, the mapping window may be displayed on the top layer of the user interface by the display 275, and in order to facilitate the user to perform further touch operation, the display position of the mapping window should be in an area convenient for the user to operate. In order to display the mapping window in the area convenient for operation, in some embodiments of the present application, the displaying position of the mapping window may be controlled according to the position of the touch instruction input by the user, that is, the step of displaying the mapping window in the associated area of the touch instruction further includes:
s310: monitoring touch actions input by a user, and recording the lifting position of a finger in the touch instruction;
s320: and displaying the mapping window by taking the finger lifting position as a reference.
The controller 250 may monitor the touch event in real time when the user inputs the touch instruction, and record the finger lift position corresponding to the touch event, that is, the position of "MotionEvent.ACTION_UP". Typically, the position where the user's finger is lifted lies within an area convenient for the user to operate. Accordingly, the controller 250 may display the mapping window on the display 275 with reference to the finger lift position.
The map window may be displayed in the vicinity of the finger lifting position with reference to the finger lifting position. The displayed mapping window may be caused to overlay the finger lift position. For example, the center or any point of the mapping window is made the same as the coordinates of the finger-raised position; the displayed mapping window may also be located in a specific area near the finger lift position.
That is, as shown in fig. 20, in some embodiments, the step of displaying the mapping window based on the finger lifting position further includes:
s3211: calculating the distance between the finger lifting position and the edge position of the display screen;
s3212: if the distance is greater than or equal to a judgment threshold, displaying the mapping window by taking the finger lifting position as a center point;
S3213: and if the distance is smaller than the judging threshold value, translating the mapping window by taking the finger lifting position as a starting point so as to completely display the mapping window.
In this embodiment, the display of the mapping window is normally performed centering on the finger-lifted position. Before the mapping window is displayed, a judgment can be made for the finger lifting position to determine whether the region corresponding to the finger lifting position has enough display space to display the mapping window completely. If the display is complete, the display of the mapping window is directly performed by taking the lifting position of the finger as the center; if the full display is not possible, the mapping window is translated with the finger lifting position as a starting point until the full display is possible.
Specifically, the finger lift position coordinates may be represented in pixels. For example, as shown in fig. 21, a plane rectangular coordinate system is constructed with the upper left corner of the display screen as the origin and the number of pixels (or the distance) from the left and top bezels as the unit lengths, so that the finger lift position has coordinates P(x0, y0).
After acquiring the finger lift position coordinates P(x0, y0), the distance between the finger lift position and each screen edge of the display 275 may be calculated in combination with the screen size (W×H) of the current display 275, including: the distance from the finger lift position to the left bezel, LL = x0; the distance to the right bezel, LR = W - x0; the distance to the top bezel, LT = y0; and the distance to the bottom bezel, LB = H - y0.
According to the calculated distance, the calculated distance can be respectively compared with a judging threshold value to determine whether the mapping window can be completely displayed when the current finger lifting position corresponds to the central area. Wherein the judgment threshold value may be determined according to the position of the mapping window, and may have different judgment threshold values in the lateral and longitudinal directions. For example, the horizontal judgment threshold is 1/2 of the width of the mapping window, and the vertical judgment threshold is 1/2 of the height of the mapping window.
By comparing the calculated distances in each direction with the judgment thresholds in each direction, the display mode of the mapping window can be determined. If the distances in all directions are larger than or equal to the judging threshold value, the mapping window can be completely displayed, namely, the mapping window is displayed by taking the lifting position of the finger as a center point. If the distance in any direction is less than the judgment threshold, the mapping window is determined to be unable to be completely displayed in the corresponding direction, and the mapping window can be translated to the opposite direction of the direction so as to completely display the mapping window. For example, if the distance LL between the finger-raised position and the left side frame is smaller than the lateral judgment threshold LX, the mapping window is shifted rightward by a distance greater than or equal to LX-LL.
According to the above technical solution, this display mode normally draws the mapping area centered on the touch operation, and, when the position is close to an edge, draws the surrounding area with reference to the edge point. It can not only display the mapping window completely, but also correlate the display position of the mapping window with the user's touch action, which is convenient for the user to perform subsequent operations.
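A minimal sketch of the placement rule in steps S3211–S3213, assuming pixel coordinates with the origin at the upper left corner as in fig. 21; the class name and the LY label for the longitudinal threshold are illustrative additions.

import android.graphics.Point;
import android.graphics.Rect;

public class MappingWindowPlacer {

    /** Returns the window rectangle, centered on P(x0, y0) and translated back on-screen when needed. */
    public Rect place(Point liftPoint, int screenW, int screenH, int winW, int winH) {
        int halfW = winW / 2; // lateral judgment threshold LX (half the window width)
        int halfH = winH / 2; // longitudinal judgment threshold LY (half the window height)

        int centerX = liftPoint.x;
        int centerY = liftPoint.y;

        // LL = x0, LR = W - x0, LT = y0, LB = H - y0: translate when a distance is below its threshold.
        if (liftPoint.x < halfW)           centerX = halfW;           // shift right by >= LX - LL
        if (screenW - liftPoint.x < halfW) centerX = screenW - halfW; // shift left
        if (liftPoint.y < halfH)           centerY = halfH;           // shift down
        if (screenH - liftPoint.y < halfH) centerY = screenH - halfH; // shift up

        return new Rect(centerX - halfW, centerY - halfH, centerX + halfW, centerY + halfH);
    }
}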
In some embodiments, the step of displaying the mapping window based on the finger-lifted position may further include:
s3221: acquiring a preset partition to which the finger lifting position belongs;
s3222: and displaying the mapping window in the preset partition.
In order to display the mapping window, the display screen may be divided into a plurality of preset partitions according to the screen size of the display 275, and one region is designated in each preset partition for displaying the mapping window. Clearly, the area for displaying the mapping window is convenient for the user to perform touch interaction operation, and is required to reduce the shielding of the display screen and the control in the current user interface as much as possible.
For example, the screen of the display 275 may be divided into two preset partitions, a left area and a right area. The display position of the mapping window within the left area is near the upper left corner of the display 275, which is convenient for touch operation when the user is standing close to the left area. Similarly, the display position of the mapping window within the right area is near the upper right corner of the display 275.
According to the partition dividing mode, a preset partition to which the touch instruction belongs can be obtained through the recorded finger lifting position, and a mapping window is displayed in the preset partition to which the touch instruction belongs is determined. For example, if the recorded finger lift position coordinates are located within the left area, a mapping window is displayed in an area near the upper left corner in the left area.
Therefore, the embodiment controls the display position of the mapping window in a preset partition mode, so that the mapping window can be guaranteed to be at the same position every time the mapping window is called. And the influence of the user operation on the display position can be reduced. For example, when the user performs the interactive operation near the left area, the user stretches the arm to input a touch instruction for performing touch assistance near the middle area, and then the mapping window is still displayed at the most suitable position corresponding to the left area, so as to alleviate the influence of the individual operation.
Based on the touch assistance interaction method provided in the above embodiments, a display device 200 is also provided in some embodiments of the present application, which includes a display 275, a touch component 276, and a controller 250. The display 275 is configured to display a user interface, the touch component 276 is configured to detect touch actions input by the user, and the controller 250 is configured to perform the following program steps:
S1: acquiring a touch instruction input by a user and used for executing touch assistance;
s2: responding to the touch instruction, and generating a mapping window according to the current user interface;
s3: and displaying the mapping window in the association area of the touch instruction.
And the mapping window comprises all or part of controls in the current user interface, and the controls in the mapping window have association relations with the controls in the current user interface. It can be seen that the display device 200 provided in this embodiment may be used to implement the touch-assisted interaction method provided in the above embodiment. The touch assembly 276 may be disposed on the display 275 to form a touch screen to detect touch actions of a user in real time and to convert the detected touch actions into touch instructions for the controller 250. The controller 250 then generates a mapping window according to the current user interface in response to the received touch command, thereby controlling the display 275 to display the mapping window in the associated area of the touch command.
After the mapping window is displayed, the user can execute control operation in an inconvenient area by means of the mapping window by inputting touch instructions on the mapping window. Therefore, as shown in fig. 22, in order to implement touch assistance interaction, in some embodiments of the present application, a touch assistance interaction method is further provided, including the following steps:
S4: and acquiring an action instruction input by a user on the mapping window.
Obviously, all or part of the controls in the current user interface are included in the mapping window, and the controls in the mapping window have association relations with the controls in the current user interface. I.e. the mapping window is the one generated by any of the above embodiments. After the mapping window is displayed, the controller 250 may perform a position determination for each touch input by the user, to determine whether the input position is in the mapping window area or the user interface area. If the input touch control action is positioned in the mapping window, determining that an action instruction is input on the mapping window by the user.
The touch operation position may be determined from either the finger press position or the finger lift position. Since the finger press is the starting action of every touch operation and therefore reflects the user's intent, the finger press position is preferably taken as the touch operation position. The action instruction may also take various forms, such as a click action, a long-press action, a multi-click action, or a slide action. Each form of action can realize a different control action depending on the control it acts upon, thereby realizing touch interaction.
S5: and responding to the action instruction, and extracting the touch position of the action instruction in the mapping window.
After acquiring the action instruction, the controller 250 may, in response to it, extract its touch position in the mapping window. Since the user's finger usually makes surface contact with the touch component 276, a touch action typically covers multiple pixels in the mapping window, so the touch position of the action instruction in the mapping window may be a set of multiple pixel points. For a sliding touch action, the area passed over during the slide also belongs to the touch action, so the touch position further includes all pixel points of the traversed area within the duration of the touch action.
S6: and executing the control action corresponding to the touch position in the current user interface according to the association relation.
After extracting the touch position of the action instruction in the mapping window, the controller 250 may also execute the control action corresponding to the mapping window according to the association relationship between the mapping window and the user interface. For example, the user inputs a click action command in the mapping window, and the touch position corresponding to the action command is on a media resource link control graphic located in the lower right corner area of the mapping window. And then according to the association relation between the mapping window and the user interface, associating the clicking action instruction to a media resource link control positioned in the lower right corner area in the user interface so as to play the media resource. In this way, the user can directly input the click touch action on the media link control of the user interface, so that the user can execute the touch operation in the inconvenient operation area without moving the standing position.
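A minimal sketch of relaying an action instruction from the mapping window back to the associated control, reusing the control-to-rectangle association from the MappingWindowBuilder sketch above; invoking performClick() is one possible way to execute the associated control action and is not asserted to be the patented mechanism.

import android.graphics.Point;
import android.graphics.Rect;
import android.view.View;
import java.util.Map;

public class TouchRelay {

    /** Finds the control whose thumbnail rectangle contains the tap and forwards the click. */
    public boolean relayTap(Point tapInWindow, Map<View, Rect> controlBounds) {
        for (Map.Entry<View, Rect> entry : controlBounds.entrySet()) {
            if (entry.getValue().contains(tapInWindow.x, tapInWindow.y)) {
                return entry.getKey().performClick(); // execute the associated control's action
            }
        }
        return false; // the tap did not land on any associated control
    }
}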
Based on the above-described display device 200, further touch interaction operations may be implemented after the mapping window is displayed. For example, the user may manipulate the display device 200 to launch a "mirror" application that may obtain an image of the user through an image acquisition module such as a camera and display in a picture frame. In the user interface of the "look mirror" application, auxiliary controls, such as "change clothes", may also be included for adjusting the displayed image content. That is, after the user clicks any one of the buttons for changing clothes, a clothes pattern corresponding to the control is added on the displayed image.
Because the clothes are of a plurality of types, the number of buttons for changing clothes is also large, namely, a part of buttons cannot be touched by a user, and therefore, the user can call the mapping window through a touch instruction. The mapping window may include at least those "change clothes" buttons that cannot be touched, so that after the mapping window is displayed, the user may click on any button area in the mapping window.
After the user clicks the "change clothes" button area located at the lower right corner on the mapping window, the controller 250 may obtain the action instruction, extract the touch position of the action instruction in the mapping window, and associate the click instruction to the "change clothes" button located at the lower right corner in the application interface. Since the user inputs a click command on the "change clothes" button located in the lower right corner, a clothes pattern corresponding to the button can be added to the frame area.
The user interface displayed on the display 275 will also change as the user performs touch actions on the mapping window. That is, for some touch actions the displayed content needs to jump so as to display a different user interface, while for others the display state of a control is adjusted so that the user can perform further touch operations. Thus, in some embodiments, after performing the control action corresponding to the touch position in the current user interface, the method further comprises the following steps:
S701: detecting the picture content of the mapping window and the current user interface;
S702: refreshing the display content of the mapping window if the picture content of the mapping window is inconsistent with that of the current user interface.
In this embodiment, the controller 250 may detect the picture contents of the mapping window and the current user interface within a preset time after each control action is executed, or detect them at a set frequency, and compare the two after detection is completed. If the picture contents are consistent, that is, the control action has not changed the specific content of the current user interface, the current display content of the mapping window is kept; if they are inconsistent, the display content of the mapping window is refreshed so that the picture in the mapping window is consistent with the user interface.
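A minimal sketch of this consistency check follows, assuming hypothetical captureUi, captureWindow and regenerateMappingWindow callbacks in place of the screenshot and window-generation steps described above; comparing raw screenshot bytes is a deliberate simplification.

```kotlin
import java.util.Timer
import kotlin.concurrent.timer

// Minimal sketch: periodically compare the mapping window with the live user
// interface and refresh the window when they differ.
class MappingWindowRefresher(
    private val captureUi: () -> ByteArray,          // screenshot of the mapped UI region
    private val captureWindow: () -> ByteArray,      // current mapping-window content
    private val regenerateMappingWindow: () -> Unit  // rebuild thumbnail + associations
) {
    private var checker: Timer? = null

    // Start checking at a set frequency after a control action has been executed.
    fun start(periodMs: Long = 500) {
        checker?.cancel()
        checker = timer(period = periodMs) {
            if (!captureUi().contentEquals(captureWindow())) {
                regenerateMappingWindow()   // picture differs: refresh the window
                this.cancel()               // stop this check once the refresh is done
            }
        }
    }

    fun stop() { checker?.cancel() }
}
```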
For example, as shown in fig. 23, when the current user wants to view "application 18" in the application interface, the user performs a long-press touch at the icon position in the lower right corner of the mapping window. The controller 250 judges the user's operation by monitoring the Touch event; if the event is a long-press event, the event is dispatched, that is, the control action corresponding to the touch position is executed in the current user interface, and executing the control action displays the "uninstall" and "details" controls on the application icon of application 18.
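As an illustrative sketch of the event judgment described above (the 500 ms threshold and all names are assumptions, not values from this application):

```kotlin
// Minimal sketch: classify a touch in the mapping window by its duration and
// dispatch the corresponding control action in the user interface.
enum class TouchKind { CLICK, LONG_PRESS }

// The threshold is an assumed value; a real system would read it from configuration.
fun classify(downTimeMs: Long, upTimeMs: Long, longPressMs: Long = 500): TouchKind =
    if (upTimeMs - downTimeMs >= longPressMs) TouchKind.LONG_PRESS else TouchKind.CLICK

fun main() {
    // A long press on the "application 18" icon would reveal "uninstall"/"details".
    when (classify(downTimeMs = 0, upTimeMs = 650)) {
        TouchKind.LONG_PRESS -> println("show uninstall / details for application 18")
        TouchKind.CLICK -> println("open application 18")
    }
}
```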
After the control action is executed, a timer may be used to periodically detect whether the mapping window has changed relative to the user interface; if a change is detected, the mapping window is refreshed and the timing is ended. The top activity can be obtained through a related method of the activity manager, so as to judge whether the foreground page has changed.
Similarly, the controller 250 may continue to detect Touch events in the mapping window in real time. If an event trigger is detected, event judgment is performed; for example, if it is detected that the user clicks the "details" position of "application 18", the details page of application 18 is opened. Then, by detecting the picture contents of the mapping window and the current user interface, if the two are inconsistent, the mapping window is refreshed to display the details page of application 18. The specific refreshing process may be the same as the manner of generating the mapping window described above, and is not repeated here.
Based on the above touch-assisted interaction method, a display device is further provided, comprising a display 275, a touch assembly 276, and a controller 250. The display 275 is configured to display a user interface, the touch assembly 276 is configured to detect touch actions input by the user, and the controller 250 is configured to perform the following program steps:
S4: acquiring an action instruction input by a user on a mapping window, wherein the mapping window comprises all or part of the controls in the current user interface, and the controls in the mapping window have an association relation with the controls in the current user interface;
S5: responding to the action instruction, and extracting the touch position of the action instruction in the mapping window;
S6: executing the control action corresponding to the touch position in the current user interface according to the association relation.
As can be seen from the above technical solutions, with the display device 200 and the touch-assisted interaction method provided in the foregoing embodiments, after the mapping window is displayed the user can input an action instruction on the mapping window; the controller 250 of the display device 200 responds to the action instruction, extracts the corresponding touch position, and executes the control action corresponding to that touch position in the current user interface according to the association relation. In this way, an operation on the mapping window is equivalently implemented in the current user interface through the association relation between the controls in the mapping window and the controls in the current user interface, so that the user can complete operations over the whole user interface within a region that is convenient to operate, improving the user experience.
The foregoing detailed description of the embodiments is merely illustrative of the general principles of the present application and should not be taken as limiting the scope of protection in any way. Any other embodiments developed by those skilled in the art in accordance with the present application without inventive effort fall within the scope of protection of the present application.

Claims (12)

1. A display device, characterized by comprising:
a display configured to display a user interface;
a touch assembly configured to detect a touch action input by a user;
a controller configured to:
acquiring a touch instruction input by a user and used for executing touch assistance;
responding to the touch instruction, and acquiring the interface type of the current user interface;
extracting a custom mapping area according to the interface type;
executing screenshot on a custom mapping area of a current user interface;
scaling the screenshot results to generate a local thumbnail;
traversing the positions of all the controls in the custom mapping area;
establishing an association relation between each control and pixel points in the local thumbnail according to the position of the control in the custom mapping area, and generating a mapping window;
and displaying the mapping window in the association area of the touch instruction.
2. The display device of claim 1, wherein in the step of acquiring a touch instruction input by a user and used for executing touch assistance, the controller is further configured to:
monitoring touch actions input by a user, wherein the touch actions comprise long-press actions, multi-click actions, sliding actions, multi-finger actions and click actions input at preset positions;
and if the touch action is the same as the preset touch action, generating a touch instruction for executing touch assistance.
3. The display device of claim 1, wherein the controller is further configured to:
responding to the touch instruction, and executing screenshot on the whole picture of the current user interface;
scaling the screenshot results to generate a global thumbnail;
traversing the positions of all the controls in the current user interface;
and establishing an association relation between each control and the pixel points in the global thumbnail according to the position of the control in the current user interface, and generating the mapping window.
4. The display device of claim 1, wherein the controller is further configured to:
acquiring the size of a display area of a mapping window;
and executing a display program built in an operating system according to the display area size to generate the mapping window.
5. The display device of claim 1, wherein the controller is further configured to:
responding to the touch instruction, and acquiring a direct operation area corresponding to the touch instruction, wherein the direct operation area is a preset partition to which a finger lifting position in the touch instruction belongs;
executing screenshot on an indirect operation area of a current user interface, wherein the indirect operation area is a preset partition outside the direct operation area;
scaling the screenshot results to generate a local thumbnail;
traversing the positions of the controls in the indirect operation area;
and establishing an association relation between each control and the pixel points in the local thumbnail according to the position of the control in the indirect operation area, and generating the mapping window.
6. The display device of claim 1, wherein in the step of displaying the mapping window in the association area of the touch instruction, the controller is further configured to:
monitoring touch actions input by a user, and recording the lifting position of a finger in the touch instruction;
and displaying the mapping window by taking the finger lifting position as a reference.
7. The display device of claim 6, wherein in the step of displaying the mapping window based on the finger lifting position, the controller is further configured to:
calculating the distance between the finger lifting position and the edge position of the display screen;
if the distance is greater than or equal to a judgment threshold, displaying the mapping window by taking the finger lifting position as a center point;
and if the distance is smaller than the judging threshold value, translating the mapping window by taking the finger lifting position as a starting point so as to completely display the mapping window.
8. The display device of claim 6, wherein in the step of displaying the mapping window based on the finger lifting position, the controller is further configured to:
acquiring a preset partition to which the finger lifting position belongs;
and displaying the mapping window in the preset partition.
9. A display device, characterized by comprising:
a display configured to display a user interface;
a touch assembly configured to detect a touch action input by a user;
a controller configured to:
acquiring the interface type of a current user interface;
extracting a custom mapping area according to the interface type;
executing screenshot on a custom mapping area of a current user interface;
scaling the screenshot results to generate a local thumbnail;
traversing the positions of all the controls in the custom mapping area;
establishing an association relation between each control and pixel points in the local thumbnail according to the position of the control in the custom mapping area, and generating a mapping window;
acquiring an action instruction input by a user on the mapping window;
responding to the action instruction, and extracting the touch position of the action instruction in the mapping window;
and executing the control action corresponding to the touch position in the current user interface according to the association relation.
10. The display device of claim 9, wherein the controller is further configured to:
detecting the picture content of the mapping window and the current user interface;
and refreshing the display content of the mapping window if the picture content of the mapping window is inconsistent with that of the current user interface.
11. A touch-assisted interaction method, characterized by being applied to a display device, wherein the display device comprises a display, a touch assembly and a controller, the method comprising the following steps:
acquiring a touch instruction input by a user and used for executing touch assistance;
responding to the touch instruction, and acquiring the interface type of the current user interface;
extracting a custom mapping area according to the interface type;
executing screenshot on a custom mapping area of a current user interface;
scaling the screenshot results to generate a local thumbnail;
traversing the positions of all the controls in the custom mapping area;
establishing an association relation between each control and pixel points in the local thumbnail according to the position of the control in the custom mapping area, and generating a mapping window;
and displaying the mapping window in the association area of the touch instruction.
12. A touch-assisted interaction method, characterized by being applied to a display device, wherein the display device comprises a display, a touch assembly and a controller, the method comprising the following steps:
acquiring a touch instruction input by a user and used for executing touch assistance;
responding to the touch instruction, and acquiring the interface type of the current user interface;
extracting a custom mapping area according to the interface type;
executing screenshot on a custom mapping area of a current user interface;
scaling the screenshot results to generate a local thumbnail;
traversing the positions of all the controls in the custom mapping area;
establishing an association relation between each control and pixel points in the local thumbnail according to the position of the control in the custom mapping area, and generating a mapping window;
acquiring an action instruction input by a user on the mapping window;
responding to the action instruction, and extracting the touch position of the action instruction in the mapping window;
and executing the control action corresponding to the touch position in the current user interface according to the association relation.
CN202010831749.6A 2020-08-18 2020-08-18 Display equipment and touch control assisting interaction method Active CN114157889B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010831749.6A CN114157889B (en) 2020-08-18 2020-08-18 Display equipment and touch control assisting interaction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010831749.6A CN114157889B (en) 2020-08-18 2020-08-18 Display equipment and touch control assisting interaction method

Publications (2)

Publication Number Publication Date
CN114157889A CN114157889A (en) 2022-03-08
CN114157889B true CN114157889B (en) 2024-04-16

Family

ID=80460423

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010831749.6A Active CN114157889B (en) 2020-08-18 2020-08-18 Display equipment and touch control assisting interaction method

Country Status (1)

Country Link
CN (1) CN114157889B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114579013B (en) * 2022-03-14 2022-12-20 北京华璨电子有限公司 Touch double-screen device based on windows system
CN115834754A (en) * 2022-09-29 2023-03-21 歌尔科技有限公司 Interaction control method and device, head-mounted display equipment and medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102937876A (en) * 2011-11-23 2013-02-20 微软公司 Dynamic scaling of a touch sensor
KR101294201B1 (en) * 2013-02-05 2013-08-16 주식회사 유소프테이션 Portable device and operating method thereof
CN103955339A (en) * 2014-04-25 2014-07-30 华为技术有限公司 Terminal operation method and terminal equipment
CN104484111A (en) * 2014-12-30 2015-04-01 小米科技有限责任公司 Content display method and device for touch screen
CN106527948A (en) * 2016-11-14 2017-03-22 珠海市魅族科技有限公司 Screen touch control method and device
CN106527656A (en) * 2016-10-19 2017-03-22 北京奇虎科技有限公司 Display method, device and terminal equipment
CN108196748A (en) * 2017-12-28 2018-06-22 努比亚技术有限公司 Terminal display control method, terminal and computer readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102937876A (en) * 2011-11-23 2013-02-20 微软公司 Dynamic scaling of a touch sensor
KR101294201B1 (en) * 2013-02-05 2013-08-16 주식회사 유소프테이션 Portable device and operating method thereof
CN103955339A (en) * 2014-04-25 2014-07-30 华为技术有限公司 Terminal operation method and terminal equipment
CN104484111A (en) * 2014-12-30 2015-04-01 小米科技有限责任公司 Content display method and device for touch screen
CN106527656A (en) * 2016-10-19 2017-03-22 北京奇虎科技有限公司 Display method, device and terminal equipment
CN106527948A (en) * 2016-11-14 2017-03-22 珠海市魅族科技有限公司 Screen touch control method and device
CN108196748A (en) * 2017-12-28 2018-06-22 努比亚技术有限公司 Terminal display control method, terminal and computer readable storage medium

Also Published As

Publication number Publication date
CN114157889A (en) 2022-03-08

Similar Documents

Publication Publication Date Title
CN111970549B (en) Menu display method and display device
CN112181207B (en) Display device and geometric figure recognition method
CN111901646A (en) Display device and touch menu display method
CN112087671B (en) Display method and display equipment for control prompt information of input method control
CN114157889B (en) Display equipment and touch control assisting interaction method
CN112165641A (en) Display device
CN113630569B (en) Display apparatus and control method of display apparatus
CN111954059A (en) Screen saver display method and display device
CN114430492B (en) Display device, mobile terminal and picture synchronous scaling method
CN111984167B (en) Quick naming method and display device
CN111913622B (en) Screen interface interactive display method and display equipment
CN111787350B (en) Display device and screenshot method in video call
CN112235621B (en) Display method and display equipment for visual area
CN113810747B (en) Display equipment and signal source setting interface interaction method
CN112473121B (en) Display device and avoidance ball display method based on limb identification
CN111935530B (en) Display equipment
CN111259639B (en) Self-adaptive adjustment method of table and display equipment
CN113485613A (en) Display equipment and method for realizing free-drawing screen edge painting
CN114760513A (en) Display device and cursor positioning method
CN111897463A (en) Screen interface interactive display method and display equipment
CN114417035A (en) Picture browsing method and display device
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment
CN111988649A (en) Control separation amplification method and display device
CN111913621B (en) Screen interface interactive display method and display equipment
CN114079827A (en) Menu display method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant