CN115589501A - Device, system and method for remotely controlling television - Google Patents


Info

Publication number
CN115589501A
Authority
CN
China
Prior art keywords: control interface, virtual control, coordinate system, gesture operation, unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211053431.5A
Other languages
Chinese (zh)
Inventor
Wu Jiaxing (武家兴)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Eswin Computing Technology Co Ltd
Haining Eswin IC Design Co Ltd
Original Assignee
Beijing Eswin Computing Technology Co Ltd
Haining Eswin IC Design Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Eswin Computing Technology Co Ltd, Haining Eswin IC Design Co Ltd filed Critical Beijing Eswin Computing Technology Co Ltd
Priority to CN202211053431.5A
Publication of CN115589501A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program

Landscapes

  • Engineering & Computer Science (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a device, a system, and a method for remotely controlling a television, relating to the technical field of intelligent device control. The device of the present application comprises an acquiring unit, a recognition unit, a generating unit, and a sending unit. The acquiring unit is configured to acquire a virtual control interface corresponding to the actual control interface currently displayed by a television terminal, where the virtual control interface is obtained by holographic projection based on the actual control interface and displays the same content information as the actual control interface. The recognition unit is configured to recognize a gesture operation performed on the virtual control interface. The generating unit is configured to generate, based on the gesture operation, a control instruction for controlling the television terminal. The sending unit is configured to send the control instruction to the television terminal so as to control the television terminal to display specified content information. The device and method are applied to optimizing the operation and control of a television terminal.

Description

Device, system and method for remotely controlling television
Technical Field
The present application relates to the field of intelligent device control technologies, and in particular, to a device, a system, and a method for remotely controlling a television.
Background
With continuing technological innovation, smart televisions have become increasingly powerful and offer ever richer application services. Beyond live programs, a smart television can connect to a network to watch a vast library of videos, or, through installed applications, provide diversified intelligent services such as games, image processing, and document reading.
At present, in scenarios where a conventional remote controller is still used to operate the television, situations such as the following inevitably occur. After the user moves around, the remote controller is often no longer at hand, so the user must first find the remote controller before the television can be controlled. In addition, most television display interfaces adopt a multi-level design: the main display interface is divided into multiple option areas that jump to sub display interfaces. To locate a specified piece of display content, the user must repeatedly press the direction keys of the remote controller to move to the desired option area, enter the sub display interface, and then press the direction keys again to continue the search.
For the user, searching for specified display content through the physical keys of a remote controller is cumbersome and inefficient, which greatly degrades the use experience.
Disclosure of Invention
The application provides a device, a system, and a method for remotely controlling a television. The main purpose is to generate a control instruction from a gesture operation performed on a virtual control interface obtained by holographic projection, so as to control a television terminal to display specified content information. This optimizes the operation and control of the television, requires only simple steps, is efficient, and greatly improves the user experience.
In order to achieve the above purpose, the present application mainly provides the following technical solutions:
The first aspect of the present application provides a device for remotely controlling a television, applied to a reverse-control detection terminal. The device includes:
an acquiring unit, configured to acquire a virtual control interface corresponding to the actual control interface currently displayed by a television terminal, where the virtual control interface is obtained by holographic projection based on the actual control interface and displays the same content information as the actual control interface;
a recognition unit, configured to recognize a gesture operation performed on the virtual control interface;
a generating unit, configured to generate, based on the gesture operation, a control instruction for controlling the television terminal;
and a sending unit, configured to send the control instruction to the television terminal so as to control the television terminal to display specified content information.
In some variations of the first aspect of the present application, the apparatus further comprises:
a construction unit, configured to establish a planar rectangular coordinate system on the projection plane corresponding to the virtual control interface before the gesture operation performed on the virtual control interface is recognized;
the construction unit is further configured to establish, at the projection plane, a Z coordinate system (a Z axis) perpendicular to the planar rectangular coordinate system;
the construction unit is further configured to construct a three-dimensional coordinate system corresponding to the projection plane from the planar rectangular coordinate system and the Z coordinate system.
In some modified embodiments of the first aspect of the present application, the recognition unit includes:
an acquiring module, configured to capture gesture motions occurring in front of the virtual control interface;
a judging module, configured to judge whether the gesture motion occurs within the range in which the Z-coordinate value in the three-dimensional coordinate system is within a first preset threshold;
a determining module, configured to determine the gesture motion as the target gesture operation to be recognized when the gesture motion is judged to occur within that range;
and a recognition module, configured to recognize the target gesture operation on the virtual control interface based on the touch point and the movement direction corresponding to the target gesture operation.
In some variations of the first aspect of the present application, the recognition module is further specifically configured to:
recognize the target gesture operation as a sliding operation on the virtual control interface if a single touch point corresponding to the target gesture operation is detected in the virtual control interface and translates in one direction within the planar rectangular coordinate system; or
recognize the target gesture operation as a click operation on the virtual control interface if a single touch point corresponding to the target gesture operation is detected in the virtual control interface and moves along the Z coordinate system; or
recognize the target gesture operation as a two-finger sliding operation in the same or opposite directions on the virtual control interface if two touch points corresponding to the target gesture operation are detected in the virtual control interface and move in the same or opposite directions within the planar rectangular coordinate system.
In some modified embodiments of the first aspect of the present application, the generating unit includes:
a calculation module, configured to calculate a first relative position coordinate of the touch point in the planar rectangular coordinate system based on the touch point produced in the virtual control interface by the gesture operation;
the calculation module is further configured to calculate the proportional relationship between a first interface size corresponding to the actual control interface and a second interface size corresponding to the virtual control interface;
a conversion module, configured to convert the first relative position coordinate, according to the proportional relationship, into a second relative position coordinate in the actual control interface currently displayed by the television terminal;
and a generating module, configured to generate the control instruction corresponding to the gesture operation according to the gesture operation and the second relative position coordinate of the touch point.
In some variations of the first aspect of the present application, the apparatus further comprises:
a detection unit, configured to detect whether the positional distance between the virtual control interface and the user is greater than a second preset threshold;
and a control unit, configured to control re-execution of the operation of projecting the virtual control interface when the positional distance between the virtual control interface and the user is detected to be greater than the second preset threshold, so as to reduce the positional distance to within the second preset threshold.
In some variations of the first aspect of the present application, the apparatus further comprises:
a monitoring unit, configured to monitor page-switching operations presented on the virtual control interface;
and the sending unit is further configured to send a stop-holographic-projection signal to the holographic projection terminal if an operation of jumping out of the virtual control interface is detected.
A second aspect of the present application provides a system for remotely controlling a television. The system comprises a television terminal, a holographic projection terminal, and a reverse-control detection terminal, where the above device for remotely controlling a television is applied to the reverse-control detection terminal.
A third aspect of the application provides a method of remotely controlling a television, the method comprising:
acquiring a virtual control interface corresponding to the actual control interface currently displayed by a television terminal, where the virtual control interface is obtained by holographic projection based on the actual control interface and displays the same content information as the actual control interface;
recognizing a gesture operation performed on the virtual control interface;
generating, based on the gesture operation, a control instruction for controlling the television terminal;
and sending the control instruction to the television terminal to control the television terminal to display specified content information.
In some variations of the third aspect of the present application, prior to the identifying the gesture operation performed on the virtual control interface, the method further comprises:
establishing a plane rectangular coordinate system on a projection plane corresponding to the virtual control interface;
establishing, at the projection plane, a Z coordinate system perpendicular to the planar rectangular coordinate system;
and constructing a three-dimensional coordinate system corresponding to the projection plane according to the plane rectangular coordinate system and the Z coordinate system.
In some variations of the third aspect of the present application, the recognizing a gesture operation performed on the virtual control interface includes:
acquiring gesture actions occurring in front of the virtual control interface;
judging whether the gesture action occurs within the range in which the Z-coordinate value in the three-dimensional coordinate system is within a first preset threshold;
if yes, determining the gesture action as a target gesture operation to be recognized;
and identifying the target gesture operation on the virtual control interface based on the touch point and the moving direction corresponding to the target gesture operation.
In some modified embodiments of the third aspect of the present application, the recognizing, on the virtual control interface, the target gesture operation based on the touch point and the moving direction corresponding to the target gesture operation includes:
recognizing the target gesture operation as a sliding operation on the virtual control interface if a single touch point corresponding to the target gesture operation is detected in the virtual control interface and translates in one direction within the planar rectangular coordinate system; or
recognizing the target gesture operation as a click operation on the virtual control interface if a single touch point corresponding to the target gesture operation is detected in the virtual control interface and moves along the Z coordinate system; or
recognizing the target gesture operation as a two-finger sliding operation in the same or opposite directions on the virtual control interface if two touch points corresponding to the target gesture operation are detected in the virtual control interface and move in the same or opposite directions within the planar rectangular coordinate system.
In some modified embodiments of the third aspect of the present application, the generating a control instruction for controlling the television terminal based on the gesture operation includes:
calculating a first relative position coordinate of a touch point in the plane rectangular coordinate system based on the touch point generated in the virtual control interface by the gesture operation;
calculating a proportional relation between a first interface size and a second interface size based on the first interface size corresponding to the actual control interface and the second interface size corresponding to the virtual control interface;
converting the first relative position coordinate into a second relative position coordinate in an actual control interface currently displayed by the television terminal according to the proportional relation;
and generating a control instruction for controlling the television terminal according to the gesture operation and the second relative position coordinate corresponding to the touch point.
In some variations of the third aspect of the present application, the method further comprises:
detecting whether the position distance between the virtual control interface and the user is larger than a second preset threshold value or not;
and if so, controlling re-execution of the operation of projecting the virtual control interface so as to reduce the positional distance to within the second preset threshold.
In some variations of the third aspect of the present application, the method further comprises:
monitoring page switching operation presented in the virtual control interface;
and if the operation of jumping out of the virtual control interface is detected, sending a holographic projection stopping signal to the holographic projection terminal.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of remotely controlling a television as described above.
A fifth aspect of the present application provides an electronic device, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of remotely controlling a television as described above when executing the computer program.
By means of the above technical solutions, the technical solution provided by the application has at least the following advantages:
The application provides a device, a system, and a method for remotely controlling a television. The device comprises an acquiring unit, a recognition unit, a generating unit, and a sending unit. The acquiring unit acquires a virtual control interface corresponding to the actual control interface currently displayed by the television terminal, where the virtual control interface is obtained by holographic projection of the actual control interface and displays the same content information. The recognition unit recognizes a gesture operation performed on the virtual control interface, the generating unit generates a control instruction corresponding to the gesture operation, and the sending unit sends the control instruction back to the television terminal to control the television terminal to display specified content information. The application remotely controls the television terminal using control instructions obtained from gesture operations performed on the holographically projected virtual control interface, so no physical remote controller device needs to participate in the control. Compared with the prior art, this solves the technical problem that operating the television terminal through the physical keys of a remote controller is cumbersome and inefficient. The technical solution provided by the application optimizes the control of television operations with simple steps and high efficiency, greatly improving the user experience.
The foregoing description is only an overview of the technical solutions of the present application. To make the technical means of the present application clearer so that it can be implemented according to the content of the specification, and to make the above and other objects, features, and advantages of the present application more readily understandable, detailed embodiments of the present application are described below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a schematic structural diagram of a system for remotely controlling a television according to an embodiment of the present application;
fig. 2 is a block diagram illustrating an apparatus for remotely controlling a television according to an embodiment of the present disclosure;
fig. 3 is a block diagram of another apparatus for remotely controlling a television according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a three-dimensional coordinate system corresponding to an exemplary virtual control interface according to an embodiment of the present disclosure;
fig. 5 is a schematic view illustrating a scenario in which a user remotely controls a television terminal based on a virtual control interface according to an embodiment of the present application;
fig. 6 is a flowchart of a method for remotely controlling a television according to an embodiment of the present application;
fig. 7 is a flowchart of another method for remotely controlling a television according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
To control the television terminal without relying on the physical keys of a remote controller, an embodiment of the present application provides a system for remotely controlling a television. As shown in fig. 1, the system includes: a television terminal 1, a holographic projection terminal 2, and a reverse-control detection terminal 3, where the television terminal 1, the holographic projection terminal 2, and the reverse-control detection terminal 3 may be connected in a wired or wireless manner. The working principle of controlling the television terminal 1 with this system is explained below, taking the device for remotely controlling a television provided on the reverse-control detection terminal 3 as the executing subject.
An embodiment of the present application provides an apparatus for remotely controlling a television. As shown in fig. 2, the apparatus includes: an acquiring unit 31, configured to acquire a virtual control interface corresponding to the actual control interface currently displayed by the television terminal, where the virtual control interface is obtained by holographic projection based on the actual control interface; a recognition unit 32, configured to recognize a gesture operation performed on the virtual control interface; a generating unit 33, configured to generate, based on the gesture operation, a control instruction corresponding to the virtual control interface; and a sending unit 34, configured to send the control instruction to the television terminal to control the television terminal to display specified content information.
In the embodiment of the application, the holographic projection terminal can project the actual control interface currently displayed by the television terminal as a virtual control interface based on the image signal output by the television terminal, so the virtual control interface and the actual control interface display the same content information. The actual control interface of the television terminal includes the main interface of the television terminal and the sub-interfaces entered through index paths, and also includes the related menu interfaces before a specific functional application service is opened. For example, for a video application installed on the television terminal, before a specified video is opened for playback, the interface of the video application can be regarded as an actual control interface, and it can therefore be holographically projected into a corresponding virtual interface.
The virtual control interface is equivalent to an operable "virtual touch screen interface". In the embodiment of the present application, after the acquiring unit 31 acquires this "virtual touch screen interface", the recognition unit 32 is used to recognize gesture operations occurring on it. Further, the generating unit 33 is used to parse the gesture operation into a corresponding control instruction.
For example, if the parsed gesture operation is a single touch click in the "virtual touch screen interface" followed by a slide to the left or right along the horizontal direction of the interface, the corresponding control instruction is "control the virtual control interface to turn pages (i.e., switch the display interface to the left or right) based on the leftward or rightward slide operation".
Further, the sending unit 34 sends the control instruction to the television terminal. Since the virtual control interface is a virtual interface obtained by holographically projecting the actual control interface currently displayed by the television terminal, the control operation performed on the virtual control interface, captured in the control instruction, indicates how the actual control interface currently displayed by the television terminal is to be operated, that is, how the television terminal is to be reversely controlled.
Taking the control instruction exemplified above, after the television terminal receives it, the actual control interface currently displayed by the television terminal is switched to the left or right, so that the specified content information that the user actually wants the television terminal to present is obtained.
As described above, the embodiment of the application provides a device for remotely controlling a television. The television terminal can be remotely controlled based on a control instruction obtained from a gesture operation performed on the holographically projected virtual control interface, and no physical remote controller device needs to participate in the control.
In some modified embodiments, the present application provides another apparatus for remotely controlling a television. As shown in fig. 3, in addition to the acquiring unit 31, the recognition unit 32, the generating unit 33, and the sending unit 34, the apparatus provided in the foregoing embodiment is further refined as follows:
as shown in fig. 3, the apparatus further includes: the construction unit 35 is specifically configured to establish a rectangular plane coordinate system on a projection plane corresponding to the virtual control interface before recognizing the gesture operation performed on the virtual control interface; a Z coordinate system is established on the projection plane perpendicular to the plane rectangular coordinate system; and constructing a three-dimensional coordinate system corresponding to the projection plane according to the plane rectangular coordinate system and the Z coordinate system.
Illustratively, the embodiment of the present application provides a three-dimensional coordinate system as shown in fig. 4, and a planar rectangular coordinate system (X and Y coordinate systems) and a Z coordinate system in the virtual control interface, wherein the central points of the two coordinate systems are coincident, thereby constructing a three-dimensional coordinate system. In addition, the position of the central point in the virtual control interface is not limited, namely the position of the central point can be changed in a user-defined mode according to the actual coordinate calculation requirement, so that the position coordinate of a certain touch-controllable option area in the interface can be conveniently calculated by using a three-dimensional coordinate system. For example, the center point of the three-dimensional coordinate system may be located at the very center of the virtual control interface, or may be located at an interface edge of the virtual control interface.
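A minimal illustrative sketch (not taken from the patent) of how the projection-plane coordinate system described above might be represented; the class name, field names, and the normalization choices are assumptions:

```python
from dataclasses import dataclass

@dataclass
class ProjectionFrame:
    """Three-dimensional coordinate system attached to the projection plane.

    The X/Y axes span the projected (virtual) interface; the Z axis is
    perpendicular to it and points toward the user. The origin may sit at
    the interface center or at a corner; the patent leaves this free.
    """
    origin_u: float   # origin position inside the interface, horizontal
    origin_v: float   # origin position inside the interface, vertical
    width: float      # virtual interface width (e.g. centimetres)
    height: float     # virtual interface height (e.g. centimetres)

    def to_plane_coords(self, u: float, v: float) -> tuple[float, float]:
        """Convert an interface point (u, v) to (x, y) in the planar coordinate system."""
        return u - self.origin_u, v - self.origin_v

# Usage: a frame whose origin is the interface center.
frame = ProjectionFrame(origin_u=40.0, origin_v=22.5, width=80.0, height=45.0)
print(frame.to_plane_coords(60.0, 22.5))  # a point 20 units right of the center -> (20.0, 0.0)
```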
Further, as shown in fig. 3, the recognition unit 32 is subdivided into an acquiring module 321, a judging module 322, a determining module 323, and a recognition module 324, which together perform gesture recognition on the virtual control interface constructed on the basis of the construction unit 35. These modules are explained as follows:
The acquiring module 321 is configured to capture gesture motions occurring in front of the virtual control interface.
In the embodiment of the present application, the virtual control interface may be projected directly in front of the user, so as the user moves about in the space, many user actions may occur in front of the virtual control interface. The embodiment of the present application mainly captures the user's gesture motions, including finger movements, fist-making motions, hand movements in space, and the like.
The judging module 322 is configured to judge whether the gesture motion occurs within the range in which the Z-coordinate value in the three-dimensional coordinate system is within a first preset threshold.
The determining module 323 is configured to determine the gesture motion as the target gesture operation to be recognized when it is judged that the gesture motion occurs within that range.
In the embodiment of the application, a wide variety of user gesture motions can be captured in the space between the virtual control interface and the user directly in front of it, and gesture motions that are not intended to operate the virtual control interface are inevitably mixed in. Therefore, the embodiment of the application locks an effective spatial range on the basis of the Z-coordinate value in the three-dimensional coordinate system being within a first preset threshold, and only the gesture motions captured within this effective spatial range are taken as target gesture operations to be recognized. This effectively pre-screens out the large number of gesture motions that carry no actual operating intention toward the virtual control interface and reduces the workload of subsequent gesture recognition.
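A minimal sketch (assumed, not taken from the patent) of this pre-screening step: only motions whose Z coordinate stays within the first preset threshold of the projection plane are kept as target gesture operations. The 10 cm value echoes the example given later and is otherwise arbitrary:

```python
Z_THRESHOLD_CM = 10.0  # first preset threshold: effective depth range in front of the plane

def screen_target_gestures(motions):
    """Keep only motions whose sampled Z values all lie within the effective range.

    Each motion is a list of (x, y, z) samples in the projection-plane coordinate
    system, with z the signed distance from the projected interface.
    """
    targets = []
    for samples in motions:
        if all(abs(z) <= Z_THRESHOLD_CM for (_x, _y, z) in samples):
            targets.append(samples)
    return targets

# Usage: the second motion happens ~40 cm away from the plane and is discarded.
near = [(1.0, 2.0, 4.0), (1.0, 2.0, 0.5)]
far = [(5.0, 5.0, 40.0), (5.0, 6.0, 41.0)]
print(len(screen_target_gestures([near, far])))  # -> 1
```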
The recognition module 324 is configured to recognize a target gesture operation on the virtual control interface based on the touch point and the moving direction corresponding to the target gesture operation.
In the embodiment of the present application, the touch point refers to the relative position at which the user touches or clicks the virtual control interface. It should be noted that, since the virtual control interface is not an actual touchable physical interface (e.g., a touch display screen), a touch point cannot be determined by the criterion of "physically contacting the interface".
The approach adopted in the embodiment of the present application is therefore: judge, from the forward and reverse displacement of the target gesture operation along the Z coordinate system, whether the actual intention of the user's gesture motion is a touch click; if so, further determine the relative position of the target gesture operation in the planar rectangular coordinate system, which is taken as the touch point corresponding to the target gesture operation.
For example, in the embodiment of the present application, an effective spatial range is locked based on a Z-coordinate span of 10 cm in the three-dimensional coordinate system. Within this effective space, and allowing for detection error, if the target gesture operation is a movement composed of a forward displacement followed by a reverse displacement along the Z coordinate system, it may be determined that the actual intention of the user's gesture motion is to touch and click the virtual control interface, regardless of whether the position Z = 0 is exactly reached.
The movement direction corresponding to the target gesture operation mainly refers, in the embodiment of the present application, to the movement direction of the target gesture operation in the planar rectangular coordinate system and along the Z coordinate system.
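A hypothetical sketch of the click-intent test described above: a forward displacement followed by a reverse displacement along the Z axis, within a tolerance around z = 0, is treated as a touch click, and the planar position at the deepest point is taken as the touch point. The thresholds and names are assumptions:

```python
def detect_click(samples, press_depth_cm=2.0, tolerance_cm=1.0):
    """Return the (x, y) touch point if the Z trajectory looks like press-then-release.

    `samples` is a time-ordered list of (x, y, z); z decreases toward the plane.
    The exact z = 0 plane need not be reached (detection error is tolerated).
    """
    if len(samples) < 3:
        return None
    zs = [z for (_x, _y, z) in samples]
    deepest = min(range(len(zs)), key=lambda i: zs[i])
    pressed = zs[0] - zs[deepest] >= press_depth_cm        # forward displacement
    released = zs[-1] - zs[deepest] >= press_depth_cm      # reverse displacement
    near_plane = zs[deepest] <= tolerance_cm               # close enough to z = 0
    if pressed and released and near_plane:
        x, y, _ = samples[deepest]
        return (x, y)
    return None

# Usage: the finger moves in to ~0.5 cm from the plane and back out again.
trajectory = [(3.0, 4.0, 6.0), (3.0, 4.0, 0.5), (3.0, 4.0, 5.5)]
print(detect_click(trajectory))  # -> (3.0, 4.0)
```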
In the embodiment of the application, the target gesture operation is further recognized according to the touch point and the movement direction. The working principle of the recognition module 324 is explained as follows:
The recognition module 324 recognizes: if a single touch point corresponding to the target gesture operation is detected and it translates in one direction within the planar rectangular coordinate system, the target gesture operation is recognized as a sliding operation on the virtual control interface.
For example, if, in the planar rectangular coordinate system, the target gesture operation moves a single touch point to the left or right, upward or downward, or obliquely in any direction, the target gesture operation is recognized as a sliding operation on the virtual control interface.
Alternatively, the recognition module 324 recognizes: if a single touch point corresponding to the target gesture operation is detected and it moves along the Z coordinate system, the target gesture operation is recognized as a click operation on the virtual control interface.
For example, if, along the Z coordinate system, the target gesture operation displaces a single touch point forward or backward, the target gesture operation is recognized as a click operation on the virtual control interface.
Alternatively, the recognition module 324 recognizes: if two touch points corresponding to the target gesture operation are detected and they move in the same or opposite directions within the planar rectangular coordinate system, the target gesture operation is recognized as a two-finger sliding operation in the same or opposite directions on the virtual control interface.
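A simplified, assumed classification routine combining the three branches above; reducing the direction tests to net displacements and the specific thresholds are choices the patent does not prescribe:

```python
def classify_gesture(tracks, plane_min_cm=1.0, z_min_cm=2.0):
    """Classify a target gesture operation from per-finger (x, y, z) tracks.

    Returns one of: "slide", "click", "two_finger_same", "two_finger_opposite", None.
    """
    def net(track):
        (x0, y0, z0), (x1, y1, z1) = track[0], track[-1]
        return x1 - x0, y1 - y0, z1 - z0

    if len(tracks) == 1:
        dx, dy, dz = net(tracks[0])
        if (dx ** 2 + dy ** 2) ** 0.5 >= plane_min_cm:
            return "slide"                      # one point translating in the XY plane
        if abs(dz) >= z_min_cm:
            return "click"                      # one point moving along the Z axis
    elif len(tracks) == 2:
        (dx1, dy1, _), (dx2, dy2, _) = net(tracks[0]), net(tracks[1])
        dot = dx1 * dx2 + dy1 * dy2
        if dot > 0:
            return "two_finger_same"            # both fingers slide the same way
        if dot < 0:
            return "two_finger_opposite"        # fingers slide toward/away from each other
    return None

# Usage: a single finger sweeping 5 cm to the right is a slide.
print(classify_gesture([[(0, 0, 1.0), (5.0, 0, 1.0)]]))  # -> "slide"
```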
Further, as shown in fig. 3, the generating unit 33 is subdivided into a calculation module 331, a conversion module 332, and a generating module 333. These modules generate the corresponding control instruction based on the gesture operation recognized by the recognition unit 32 in the virtual control interface, as explained below:
The calculation module 331 is configured to calculate a first relative position coordinate of the touch point in the planar rectangular coordinate system based on the touch point produced in the virtual control interface by the gesture operation.
In the embodiment of the present application, a three-dimensional coordinate system (comprising the planar rectangular coordinate system and the Z coordinate system) has already been established for the virtual control interface, so the real intention of the gesture operation can be determined from the Z coordinate system. If the real intention is a click operation, it should be noted that, since the virtual control interface is not a physical touchable screen, the touch point holds only a relative position with respect to the virtual control interface, that is, a relative position within the planar rectangular coordinate system; concretely, this relative position is taken as the position coordinate of the touch point in the planar rectangular coordinate system.
The calculating module 331 is further configured to calculate a proportional relationship between the first interface size and the second interface size based on the first interface size corresponding to the actual control interface and the second interface size corresponding to the virtual control interface.
In the embodiment of the application, after the holographic projection operation is performed, the size of the virtual control interface is not necessarily the same as the size of the actual control interface of the television terminal, but the two have a definite proportional relationship; for example, the virtual control interface may be an equally proportioned enlargement or reduction of the actual control interface.
The converting module 332 is configured to convert the first relative position coordinate into a second relative position coordinate in an actual control interface currently displayed by the television terminal according to the proportional relationship.
The virtual control interface is obtained by holographically projecting the actual control interface, so the two display the same content information, but their sizes are not necessarily identical. Therefore, in the embodiment of the application, based on the proportional relationship between the two sizes, the second relative position coordinate of the touch point in the actual control interface of the television terminal is calculated from the first relative position coordinate of the touch point in the virtual control interface. Illustratively, the following specific implementation may be employed:
For example, a planar rectangular coordinate system may also be constructed for the actual control interface of the television terminal, ensuring that the relative position of its origin within the actual control interface is the same as the relative position of the origin of the virtual control interface's planar rectangular coordinate system within the virtual control interface, which facilitates coordinate conversion between the respective planar rectangular coordinate systems of the two interfaces.
It should be noted that, in the embodiment of the present application, the position of the origin of either planar rectangular coordinate system within its interface is not specifically limited and may be customized according to the actual coordinate-computation requirements, as long as the relative positions of the two origins within their respective interfaces are kept consistent.
Furthermore, according to the proportional relationship between the virtual control interface and the actual control interface of the television terminal, position coordinates in the planar rectangular coordinate system of the virtual control interface can be efficiently converted into position coordinates in the planar rectangular coordinate system of the actual control interface of the television terminal.
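An assumed worked sketch of this proportional coordinate conversion: with both origins at the same relative position (here, the interface center), the first relative coordinate is simply scaled by the width/height ratio of the two interfaces. The names and the center-origin choice are illustrative only:

```python
def virtual_to_actual(first_coord, virtual_size, actual_size):
    """Convert a touch coordinate on the virtual interface to the actual interface.

    first_coord  -- (x, y) in the virtual interface's planar coordinate system
    virtual_size -- (width, height) of the virtual (projected) interface
    actual_size  -- (width, height) of the actual TV interface
    Both coordinate systems are assumed to have their origin at the interface center.
    """
    sx = actual_size[0] / virtual_size[0]   # horizontal proportional relationship
    sy = actual_size[1] / virtual_size[1]   # vertical proportional relationship
    return first_coord[0] * sx, first_coord[1] * sy

# Usage: the virtual interface is half the size of a 1920x1080 TV interface,
# so a touch 100 units right of its center maps to 200 pixels right of the TV center.
print(virtual_to_actual((100.0, 50.0), (960.0, 540.0), (1920.0, 1080.0)))  # -> (200.0, 100.0)
```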
The generating module 333 is configured to generate a control instruction corresponding to the gesture operation according to the gesture operation and the second relative position coordinate corresponding to the touch point.
In the embodiment of the present application, this is exemplified as follows:
For example, if the gesture operation is recognized as a sliding operation performed with a single touch point on the virtual control interface, then, since the virtual control interface is the holographic projection of the actual control interface of the television terminal, this is equivalent to recognizing a sliding operation performed with a single touch point on the actual control interface.
Then, from the relative position coordinate of the touch point in the actual control interface, it is known exactly from which position the touch slide starts. Accordingly, if the gesture operation is, for example, a leftward or rightward slide, the control instruction generated is "control the actual control interface of the television terminal to turn pages by a leftward or rightward slide".
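A hypothetical sketch of assembling the control instruction sent back to the television terminal from the recognized gesture and the converted coordinate; the instruction fields and gesture names are assumptions, not a format defined by the patent:

```python
import json

def build_control_instruction(gesture, second_coord, direction=None):
    """Package the recognized gesture plus the actual-interface coordinate as an instruction.

    gesture      -- e.g. "slide", "click", "two_finger_opposite"
    second_coord -- (x, y) already converted into the actual control interface
    direction    -- optional movement direction for slides, e.g. "left" or "right"
    """
    instruction = {
        "target": "tv_terminal",
        "gesture": gesture,
        "x": second_coord[0],
        "y": second_coord[1],
    }
    if direction is not None:
        instruction["direction"] = direction   # e.g. turn pages to the left/right
    return json.dumps(instruction)

# Usage: a rightward slide starting at (200, 100) on the actual interface.
print(build_control_instruction("slide", (200.0, 100.0), direction="right"))
```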
Further, as shown in fig. 3, the apparatus further includes: a detection unit 36 and a control unit 37.
The detecting unit 36 is configured to detect whether a position distance between the virtual control interface and the user is greater than a second preset threshold.
And the control unit 37 is configured to, when it is detected that the position distance between the virtual control interface and the user is greater than the second preset threshold, control to re-execute the operation of projecting the virtual control interface so as to reduce the position distance to reach the second preset threshold.
In the embodiment of the present application, it is further necessary to detect whether the positional distance between the virtual control interface and the user is within an effective distance that is convenient for the user to perform gesture operations. If the positional distance exceeds this effective distance, the user is spatially too far from the virtual control interface to perform the corresponding control operations on it easily.
For example, fig. 5 provides a schematic view of a scene in which a user remotely controls the television terminal based on the virtual control interface. In fig. 5, a virtual control interface is spatially projected between the user and the television terminal. Since the virtual control interface is obtained by holographically projecting the actual control interface of the television terminal, the two present the same display content information. In this scene, the user operates the virtual control interface with gestures, thereby achieving the corresponding operations on the actual control interface of the television terminal. For the virtual control interface to receive effective gesture operations, the positional distance between the user and the virtual control interface must be kept within the effective distance mentioned above.
If, after the operation of projecting the virtual control interface has been re-executed, the distance between the projection and the user still cannot be reduced to within the second preset threshold, a prompt is triggered to remind the user to move toward the virtual projection interface so as to reduce the distance between them.
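A schematic, assumed control loop for the detection unit 36 and control unit 37 described above: if the user-to-interface distance exceeds the second preset threshold, re-projection is requested; if that still fails, a prompt is raised. The callbacks and the 60 cm value are placeholders:

```python
def keep_interface_reachable(measure_distance_cm, reproject, prompt_user,
                             second_threshold_cm=60.0):
    """Ensure the projected interface stays within gesture range of the user.

    measure_distance_cm -- callable returning the current user-to-interface distance
    reproject           -- callable that re-executes the projection closer to the user
    prompt_user         -- callable that asks the user to move toward the projection
    """
    distance = measure_distance_cm()
    if distance <= second_threshold_cm:
        return True                      # already within the effective distance
    reproject()                          # re-execute the projection operation
    if measure_distance_cm() <= second_threshold_cm:
        return True
    prompt_user()                        # projection alone could not close the gap
    return False

# Usage with stub callbacks: the first re-projection brings the distance down.
readings = iter([90.0, 50.0])
print(keep_interface_reachable(lambda: next(readings), lambda: None, lambda: None))  # -> True
```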
Further, as shown in fig. 3, the apparatus further includes:
and the monitoring unit 38 is used for monitoring the page switching operation presented in the virtual control interface.
The sending unit 34 is further configured to send a stop-holographic-projection signal to the holographic projection terminal if an operation of jumping out of the virtual control interface is detected.
In the embodiment of the application, the holographic projection terminal performs the holographic projection of the actual control interface of the television terminal, so that the television terminal can be controlled through gesture operations performed on the virtual control interface instead of through the physical keys of a remote controller.
Therefore, when a user control operation causes the control interface to be jumped out of, for example when a video starts playing, the holographic projection should be closed: a stop-holographic-projection signal is sent to the holographic projection terminal, and the video continues to play on the television terminal. This prevents the video playback from being disturbed and ensures that the user can enjoy the specific application service provided by the application installed on the television terminal.
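An assumed sketch of the monitoring unit 38 working with the sending unit 34: page-switch events on the virtual control interface are watched, and an event that leaves the control interface (e.g. video playback starts) triggers the stop-projection signal. The event names and signal payload are illustrative:

```python
CONTROL_PAGES = {"home", "menu", "video_list"}   # pages that still count as control interfaces

def monitor_page_switch(event_stream, send_to_projector):
    """Watch page-switch events; stop the hologram once the control interface is left."""
    for page in event_stream:
        if page not in CONTROL_PAGES:            # e.g. "video_playback": jumped out
            send_to_projector("STOP_HOLOGRAPHIC_PROJECTION")
            break

# Usage: navigating home -> video_list -> video_playback stops the projection.
monitor_page_switch(["home", "video_list", "video_playback"],
                    lambda signal: print("sent:", signal))
```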
Further, as an application of the apparatus shown in fig. 2, an embodiment of the present application provides a method for remotely controlling a television. This method embodiment corresponds to the foregoing apparatus embodiment; for ease of reading, details already given in the apparatus embodiment are not repeated here one by one, but it should be clear that the method in this embodiment can correspondingly implement all the contents of the apparatus embodiment. As shown in fig. 6, the embodiment of the present application provides the following specific steps:
401. Acquire a virtual control interface corresponding to the actual control interface currently displayed by the television terminal.
The virtual control interface is obtained by holographic projection based on the actual control interface, and the virtual control interface and the actual control interface display the same content information.
402. Recognize a gesture operation performed on the virtual control interface.
403. Generate, based on the gesture operation, a control instruction corresponding to the virtual control interface.
404. Send the control instruction to the television terminal to control the television terminal to display specified content information.
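A high-level, assumed sketch tying steps 401-404 together on the reverse-control detection terminal; every function it calls stands for a step described above and is a placeholder, not an API defined by the patent:

```python
def remote_control_cycle(acquire_virtual_interface, recognize_gesture,
                         generate_instruction, send_to_tv):
    """One pass of the 401-404 flow: acquire -> recognize -> generate -> send."""
    virtual_interface = acquire_virtual_interface()                  # 401
    gesture = recognize_gesture(virtual_interface)                   # 402
    if gesture is None:
        return None                                                  # nothing to do this cycle
    instruction = generate_instruction(gesture, virtual_interface)   # 403
    send_to_tv(instruction)                                          # 404
    return instruction

# Usage with trivial stand-ins for the four steps.
print(remote_control_cycle(lambda: "virtual_ui",
                           lambda ui: "slide_right",
                           lambda g, ui: {"gesture": g},
                           lambda ins: None))  # -> {'gesture': 'slide_right'}
```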
Further, as an application of the apparatus shown in fig. 3, an embodiment of the present application provides another method for remotely controlling a television. This method embodiment corresponds to the foregoing apparatus embodiment; for ease of reading, details already given in the apparatus embodiment are not repeated here one by one, but it should be clear that the method in this embodiment can correspondingly implement all the contents of the apparatus embodiment. As shown in fig. 7, the embodiment of the present application provides the following specific steps:
501. Acquire a virtual control interface corresponding to the actual control interface currently displayed by the television terminal.
The virtual control interface is obtained by holographic projection based on the actual control interface, and the virtual control interface and the actual control interface display the same content information.
502. Establish a planar rectangular coordinate system on the projection plane corresponding to the virtual control interface.
503. Establish, at the projection plane, a Z coordinate system perpendicular to the planar rectangular coordinate system.
504. Construct, from the planar rectangular coordinate system and the Z coordinate system, a three-dimensional coordinate system corresponding to the projection plane, thereby obtaining the three-dimensional coordinate system corresponding to the virtual control interface.
505. Recognize, within the three-dimensional coordinate system corresponding to the virtual control interface, a gesture operation performed on the virtual control interface.
This step can be explained in detail as follows: capture gesture motions occurring in front of the virtual control interface; judge whether a gesture motion occurs within the range in which the Z-coordinate value in the three-dimensional coordinate system is within the first preset threshold; if so, determine the gesture motion as the target gesture operation to be recognized; and recognize the target gesture operation on the virtual control interface based on the touch point and the movement direction corresponding to the target gesture operation.
Further, the process of recognizing the target gesture operation can be exemplified as follows:
Acquire the touch point and the movement direction corresponding to the target gesture operation in the virtual control interface.
Example 1: if a single touch point corresponding to the target gesture operation is detected and it translates in one direction within the planar rectangular coordinate system, the target gesture operation is recognized as a sliding operation on the virtual control interface.
Example 2: if a single touch point corresponding to the target gesture operation is detected and it moves along the Z coordinate system, the target gesture operation is recognized as a click operation on the virtual control interface.
Example 3: if two touch points corresponding to the target gesture operation are detected and they move in the same or opposite directions within the planar rectangular coordinate system, the target gesture operation is recognized as a two-finger sliding operation in the same or opposite directions on the virtual control interface.
506. Generate, based on the gesture operation, a control instruction for controlling the television terminal.
In the embodiment of the present application, a specific implementation of generating the control instruction for controlling the television terminal can be detailed as follows:
First, calculate a first relative position coordinate of the touch point in the planar rectangular coordinate system based on the touch point produced in the virtual control interface by the gesture operation. Second, calculate the proportional relationship between the first interface size corresponding to the actual control interface and the second interface size corresponding to the virtual control interface. Third, convert the first relative position coordinate, according to the proportional relationship, into a second relative position coordinate in the actual control interface currently displayed by the television terminal. Finally, generate the control instruction for controlling the television terminal according to the gesture operation and the second relative position coordinate of the touch point.
507. Send the control instruction to the television terminal to control the television terminal to display specified content information.
In addition, the method provided by the embodiment of the application further includes the following. During the remote control of the television terminal, detect whether the positional distance between the virtual control interface and the user is greater than the second preset threshold; if so, control re-execution of the operation of projecting the virtual control interface so as to reduce the positional distance to within the second preset threshold. Monitor page-switching operations presented in the virtual control interface; if an operation of jumping out of the virtual control interface is detected, send a stop-holographic-projection signal to the holographic projection terminal.
In summary, the embodiments of the present application provide an apparatus, a system, and a method for remotely controlling a television. The apparatus provided by the embodiments of the present application acquires a virtual control interface corresponding to the actual control interface currently displayed by the television terminal, where the virtual control interface is obtained by holographically projecting the actual control interface and displays the same content information. A three-dimensional coordinate system corresponding to the virtual control interface is constructed so that gesture operations performed within it can be recognized and turned into control instructions for the television terminal, and the control instructions are sent back to the television terminal to control the television terminal to display specified content information. The solution provided by the embodiments of the application requires no physical remote controller device to participate in the control, simplifies the operation steps, and improves the operation efficiency, so that the optimized control operation greatly improves the user experience.
The device for remotely controlling the television comprises a processor and a memory, wherein the acquiring unit, the identifying unit, the generating unit, the sending unit and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to realize corresponding functions.
The processor comprises a kernel, and the kernel retrieves the corresponding program unit from the memory. One or more kernels may be provided. By adjusting the kernel parameters, a control instruction is generated from the gesture operation performed in the virtual control interface obtained by holographic projection, so as to control the television terminal to display the specified content information; the television control operation is thereby optimized, with simple steps and high efficiency, which greatly improves the user experience.
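For illustration only, the following sketch shows one way the four units described above might be organized as program units invoked by a processor. The class and method names are assumptions made for this example and are not defined in the application.

```python
class RemoteControlDevice:
    """Illustrative composition of the acquisition, recognition, generating,
    and sending units described in the embodiment (names assumed)."""

    def __init__(self, acquisition_unit, recognition_unit,
                 generating_unit, sending_unit):
        self.acquisition_unit = acquisition_unit
        self.recognition_unit = recognition_unit
        self.generating_unit = generating_unit
        self.sending_unit = sending_unit

    def handle_frame(self, sensor_frame):
        # Acquire the virtual control interface corresponding to the actual
        # interface currently displayed by the television terminal
        interface = self.acquisition_unit.acquire_virtual_interface()

        # Recognize the gesture operation performed on the virtual interface
        gesture = self.recognition_unit.recognize(sensor_frame, interface)
        if gesture is None:
            return

        # Generate a control instruction and send it to the television terminal
        instruction = self.generating_unit.generate(gesture, interface)
        self.sending_unit.send(instruction)
```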
An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the method for remotely controlling a television described above is implemented.
An embodiment of the present application provides an electronic device, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of remotely controlling a television as described above when executing the computer program.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a device includes one or more processors (CPUs), memory, and a bus. The device may also include input/output interfaces, network interfaces, and the like.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM), and the memory includes at least one memory chip. The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-Change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art to which the present application pertains. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (11)

1. A device for remotely controlling a television, applied to a back control detection terminal, the device comprising:
an acquisition unit, configured to acquire a virtual control interface corresponding to an actual control interface currently displayed by a television terminal, wherein the virtual control interface is obtained by performing holographic projection based on the actual control interface, and the virtual control interface and the actual control interface display the same content information;
a recognition unit, configured to recognize a gesture operation performed on the virtual control interface;
a generating unit, configured to generate, based on the gesture operation, a control instruction for controlling the television terminal; and
a sending unit, configured to send the control instruction to the television terminal so as to control the television terminal to display the specified content information.
2. The apparatus of claim 1, further comprising:
a construction unit, configured to establish a plane rectangular coordinate system on a projection plane corresponding to the virtual control interface before the gesture operation performed on the virtual control interface is recognized;
the construction unit is further configured to establish, on the projection plane, a Z coordinate system perpendicular to the plane rectangular coordinate system;
the construction unit is further configured to construct a three-dimensional coordinate system corresponding to the projection plane from the plane rectangular coordinate system and the Z coordinate system.
3. The apparatus of claim 2, wherein the recognition unit comprises:
an acquisition module, configured to acquire a gesture action occurring in front of the virtual control interface;
a judging module, configured to judge whether the gesture action occurs within a range in which the value on the Z coordinate system in the three-dimensional coordinate system is within a first preset threshold;
a determining module, configured to determine the gesture action as a target gesture operation to be recognized when it is judged that the gesture action occurs within the range in which the value on the Z coordinate system in the three-dimensional coordinate system is within the first preset threshold;
an identification module, configured to recognize the target gesture operation on the virtual control interface based on a touch point and a moving direction corresponding to the target gesture operation.
4. The apparatus of claim 3, wherein the identification module is further specifically configured to:
if one touch point corresponding to the target gesture operation is detected in the virtual control interface and the touch point translates in one direction in the plane rectangular coordinate system, recognize the target gesture operation as a sliding operation on the virtual control interface; or
if one touch point corresponding to the target gesture operation is detected in the virtual control interface and the touch point moves along the Z coordinate system, recognize the target gesture operation as a click operation on the virtual control interface; or
if two touch points corresponding to the target gesture operation are detected in the virtual control interface and the two touch points move in the same direction or in opposite directions in the plane rectangular coordinate system, recognize the target gesture operation as a two-finger same-direction or opposite-direction sliding operation on the virtual control interface.
5. The apparatus of claim 2, wherein the generating unit comprises:
a calculation module, configured to calculate, based on the touch point generated in the virtual control interface by the gesture operation, a first relative position coordinate of the touch point in the plane rectangular coordinate system;
the calculation module is further configured to calculate a proportional relationship between a first interface size corresponding to the actual control interface and a second interface size corresponding to the virtual control interface;
a conversion module, configured to convert, according to the proportional relationship, the first relative position coordinate into a second relative position coordinate in the actual control interface currently displayed by the television terminal;
a generating module, configured to generate a control instruction corresponding to the gesture operation according to the gesture operation and the second relative position coordinate corresponding to the touch point.
6. The apparatus of any one of claims 1 to 5, further comprising:
a detection unit, configured to detect whether the position distance between the virtual control interface and the user is greater than a second preset threshold;
a control unit, configured to, when it is detected that the position distance between the virtual control interface and the user is greater than the second preset threshold, control the operation of projecting the virtual control interface to be executed again, so that the position distance is reduced to within the second preset threshold.
7. The apparatus of any one of claims 1 to 5, further comprising:
a monitoring unit, configured to monitor page switching operations presented on the virtual control interface;
the sending unit is further configured to send a signal for stopping holographic projection to the holographic projection terminal if an operation of jumping out of the virtual control interface is detected.
8. A system for remotely controlling a television, the system comprising: a television terminal, a holographic projection terminal, and a back control detection terminal to which the apparatus for remotely controlling a television according to any one of claims 1 to 7 is applied.
9. A method for remotely controlling a television, the method being applied to a back control detection terminal, the method comprising:
acquiring a virtual control interface corresponding to an actual control interface currently displayed by a television terminal, wherein the virtual control interface is obtained by performing holographic projection based on the actual control interface, and the virtual control interface and the actual control interface display the same content information;
recognizing gesture operation performed on the virtual control interface;
generating a control instruction corresponding to the virtual control interface based on the gesture operation;
and sending the control instruction to the television terminal to control the television terminal to display the specified content information.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the method of remotely controlling a television according to claim 9.
11. An electronic device, comprising: memory, processor and computer program stored on the memory and executable on the processor, which when executed by the processor implements a method of remotely controlling a television according to claim 9.
CN202211053431.5A 2022-08-30 2022-08-30 Device, system and method for remotely controlling television Pending CN115589501A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211053431.5A CN115589501A (en) 2022-08-30 2022-08-30 Device, system and method for remotely controlling television

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211053431.5A CN115589501A (en) 2022-08-30 2022-08-30 Device, system and method for remotely controlling television

Publications (1)

Publication Number Publication Date
CN115589501A true CN115589501A (en) 2023-01-10

Family

ID=84771777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211053431.5A Pending CN115589501A (en) 2022-08-30 2022-08-30 Device, system and method for remotely controlling television

Country Status (1)

Country Link
CN (1) CN115589501A (en)

Similar Documents

Publication Publication Date Title
CN109240576B (en) Image processing method and device in game, electronic device and storage medium
KR102225802B1 (en) Method and program for making reactive video
KR100687737B1 (en) Apparatus and method for a virtual mouse based on two-hands gesture
CN108596092B (en) Gesture recognition method, device, equipment and storage medium
US10198421B2 (en) Method for inserting or deleting cells, rows or columns in spreadsheet and a device therefor
CN106415472B (en) Gesture control method and device, terminal equipment and storage medium
US9785244B2 (en) Image projection apparatus, system, and image projection method
CN110102044B (en) Game control method based on smart band, smart band and storage medium
KR20130101536A (en) Camera-based information input method and terminal
CN108920070B (en) Screen splitting method and device based on special-shaped display screen, storage medium and mobile terminal
CN101869484A (en) Medical diagnosis device having touch screen and control method thereof
CN104063128A (en) Information processing method and electronic equipment
TW201939260A (en) Method, apparatus, and terminal for simulating mouse operation by using gesture
US11989352B2 (en) Method display device and medium with laser emission device and operations that meet rules of common touch
WO2014194148A2 (en) Systems and methods involving gesture based user interaction, user interface and/or other features
CN106371715B (en) Method and device for realizing multi-item switching
CN103777856A (en) Method and system for processing touch event into remote control gesture and remote control terminal
Chua et al. Hand gesture control for human–computer interaction with Deep Learning
CN103729131A (en) Human-computer interaction method and associated equipment and system
US10628031B2 (en) Control instruction identification method and apparatus, and storage medium
CN102968245A (en) Method and device for cooperatively controlling mouse touch and method and system for smart television interaction
KR101433543B1 (en) Gesture-based human-computer interaction method and system, and computer storage media
CN112190930B (en) Game role control method and device
US20150324025A1 (en) User input device and method thereof
US8884904B2 (en) Touch panel apparatus, system and operation method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination