CN113014980B - Remote control method and device and electronic equipment - Google Patents

Remote control method and device and electronic equipment

Info

Publication number
CN113014980B
CN113014980B (application number CN202110207294.5A)
Authority
CN
China
Prior art keywords
item
feature information
face feature
user
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110207294.5A
Other languages
Chinese (zh)
Other versions
CN113014980A (en)
Inventor
秦秋平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202110207294.5A priority Critical patent/CN113014980B/en
Publication of CN113014980A publication Critical patent/CN113014980A/en
Application granted granted Critical
Publication of CN113014980B publication Critical patent/CN113014980B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/482End-user interface for program selection
    • H04N21/4825End-user interface for program selection using a list of items to be played back in a given order, e.g. playlists

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the remote control method, remote control device, and electronic device disclosed in the embodiments of the present disclosure, the manner of item switching may differ when different users perform an item switching gesture. Specifically, when an item switching gesture is detected, first face feature information in a video frame image is identified and matched against pre-stored face feature information. If the pre-stored face feature information includes target face feature information whose matching degree with the first face feature information is greater than a preset threshold, the user indicated by the first face feature information can be identified as a historical user, and item filtering can therefore be performed so that the user can efficiently browse preferred items. In other words, deciding whether to filter items according to face feature information reduces the user's operation steps during remote control, thereby saving computing resources and display resources.

Description

Remote control method and device and electronic equipment
Technical Field
The disclosure relates to the technical field of internet, and in particular relates to a remote control method, a remote control device and electronic equipment.
Background
With advances in science and technology, terminal devices offer increasingly diverse functions, enriching people's lives. For example, people can use a television to watch a large number of digital network programs (such as sports programs, variety shows, cultural programs, and the like), which gives users more choices so that they can always find the programs they want to watch, improving the viewing experience.
Disclosure of Invention
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The embodiments of the present disclosure provide a remote control method, a remote control device, and an electronic device that filter out items a user does not prefer by identifying the user's face feature information. The user can thus quickly locate the items they want to browse; the steps and time otherwise spent skipping unwanted items are saved, the operations needed to reach a desired item are reduced, and computing resources and display resources are saved.
In a first aspect, an embodiment of the present disclosure provides a remote control method, including: in response to detecting a predefined item switching gesture, acquiring a video frame image that includes an image of the item switching gesture; determining, according to the matching degree between first face feature information in the video frame image and pre-stored face feature information, whether the pre-stored face feature information includes target face feature information, where the matching degree between the target face feature information and the first face feature information is greater than a preset threshold; and determining, based on the determination result, whether to perform item filtering.
In a second aspect, an embodiment of the present disclosure provides a remote control apparatus, including: an acquisition unit configured to acquire, in response to detecting a predefined item switching gesture, a video frame image that includes an image of the item switching gesture; a first determining unit configured to determine, according to the matching degree between first face feature information in the video frame image and pre-stored face feature information, whether the pre-stored face feature information includes target face feature information, where the matching degree between the target face feature information and the first face feature information is greater than a preset threshold; and a second determining unit configured to determine, based on the determination result, whether to perform item filtering.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the remote control method as described in the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the steps of the remote control method as described in the first aspect.
According to the remote control method, remote control device, and electronic device provided by the embodiments of the present disclosure, the manner of item switching may differ when different users perform an item switching gesture. Specifically, when an item switching gesture is detected, first face feature information in a video frame image is identified and matched against pre-stored face feature information. If the pre-stored face feature information includes target face feature information whose matching degree with the first face feature information is greater than a preset threshold, the user indicated by the first face feature information can be identified as a historical user, and item filtering can therefore be performed so that the user can efficiently browse preferred items. In other words, deciding whether to filter items according to face feature information reduces the user's operation steps during remote control, thereby saving computing resources and display resources.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a flow chart of one embodiment of a remote control method according to the present disclosure;
FIG. 2 is a schematic diagram of a program play order according to one embodiment of the remote control method of the present disclosure;
FIG. 3 is a schematic structural view of one embodiment of a remote control device according to the present disclosure;
FIG. 4 is an exemplary system architecture in which a remote control method of one embodiment of the present disclosure may be applied;
fig. 5 is a schematic diagram of a basic structure of an electronic device provided according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Definitions of other relevant terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a", "one", and "a plurality" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Referring to fig. 1, a flow of one embodiment of a remote control method according to the present disclosure is shown. The remote control method can be applied to a terminal device. The remote control method as shown in fig. 1 includes the steps of:
Step 101, in response to detecting a predefined item switching gesture, a video frame image including an item switching gesture image is acquired.
In some embodiments, the predefined item switching gesture may be understood as a gesture for human-machine interaction; that is, when the user performs the predefined item switching gesture, it indicates that the user wants to switch the item being viewed at this time.
In some embodiments, an item may be understood as a television program, a game, a movie category, a picture style, etc. As an example, when the item is a television program, the user performing the predefined switching gesture may indicate that the user wants to watch other programs; when the item is a game, it may indicate that the user wants to play other games; when the item is a movie category, it may be understood as the user wanting to change the movie category being browsed (e.g., comedy, horror, art-house, martial arts, etc.); and when the item is a picture style, it may be understood as the user wanting to change the picture style being browsed (e.g., mechanical style, classical style, popular style, European style, etc.).
In other words, after the user performs the predefined item switching gesture, this may indicate that the user wants to change the item currently being presented by the execution subject.
As an example, when the user's finger points at the execution subject (e.g., a television) and performs a left-to-right swiping operation, this may indicate that the next item is to be viewed; correspondingly, when the user's finger points at the television and performs a right-to-left swiping operation, this may indicate that the previous item is to be viewed. Of course, the item switching gesture may be set according to the actual application scenario, and the specific type of the item switching gesture is not limited here.
In some embodiments, the execution subject may capture video of the user in real time through an image acquisition device, and may determine whether the user is performing the item switching gesture by detecting changes in the user's gesture across video frame images. Of course, there are many ways for the execution subject to detect the item switching gesture in specific embodiments; the above is merely illustrative, and the specific detection form of the item switching gesture is not limited.
In some embodiments, when the item switching gesture is detected, a video frame image including an image of that gesture may be acquired. As an example, a video frame image in which the user's facial features are clearer and which includes the item switching gesture image may be acquired. For example, if 10 video frame images each include the item switching gesture image and 3 of them also include a complete and clear face image, any one of those 3 video frame images may be acquired.
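The frame selection described above can be sketched in code as follows. This is illustrative only: the embodiment does not prescribe an implementation, and the frame fields (`has_gesture`, `face_complete`, `face_sharpness`) are assumed stand-ins for the outputs of real gesture and face detectors.

```python
# Illustrative sketch: from the frames that contain the switching gesture,
# pick one whose face image is complete, preferring the sharpest face.
# Field names are assumptions, not from the patent.

def select_gesture_frame(frames):
    candidates = [f for f in frames
                  if f["has_gesture"] and f["face_complete"]]
    if not candidates:
        return None
    # Any complete-face frame would satisfy the embodiment ("any one of the
    # 3 video frame images"); preferring the sharpest is one simple policy.
    return max(candidates, key=lambda f: f["face_sharpness"])
```

Preferring the sharpest candidate is one possible tie-breaking policy; the embodiment only requires that the chosen frame include both the gesture image and a complete, clear face image.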
Step 102, determining whether the pre-stored face feature information includes target face feature information according to the matching degree between first face feature information in the video frame image and the pre-stored face feature information.
Here, the matching degree of the target face feature information and the first face feature information is greater than a preset threshold.
In some embodiments, the face feature information of different users is different, and thus, it may be determined whether the target face feature information exists by matching the first face feature information with the pre-stored face feature information. In other words, the target face feature information indicates the same user as the first face feature information.
In some embodiments, the pre-stored face feature information may include: face feature information of users who have previously performed the item switching gesture. As an example, the pre-stored face feature information may include face feature information of at least some of the users who have performed the item switching gesture toward the execution subject.
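The matching in step 102 can be sketched as follows. The patent does not prescribe a particular matching-degree measure or threshold value; cosine similarity and 0.9 are assumptions used here for illustration.

```python
import math

def matching_degree(a, b):
    # Cosine similarity as an illustrative matching-degree measure.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def find_target_feature(first_feature, stored_features, threshold=0.9):
    """Return a stored feature whose matching degree with first_feature
    exceeds the preset threshold, or None if the user appears to be new."""
    best, best_score = None, threshold
    for feat in stored_features:
        score = matching_degree(first_feature, feat)
        if score > best_score:
            best, best_score = feat, score
    return best
```

A `None` result corresponds to the case where the pre-stored face feature information does not include target face feature information, i.e., the user is likely new.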
Step 103, based on the determination result, it is determined whether to perform item filtering.
In some embodiments, if the pre-stored face feature information includes the target face feature information, this indicates that the user performing the current item switching gesture has also performed the item switching gesture before. In other words, the user can be understood as a historical user of the execution subject (equivalently, the users indicated by the pre-stored face feature information can be understood as historical users). The historical user's unpreferred items can therefore be determined from that user's item viewing records and filtered out, reducing the number of item switching gestures the user must perform. The user can thus quickly locate the desired item; the steps and time spent skipping unpreferred items are saved, the operations needed to reach the desired item are reduced, and computing resources and display resources are saved.
It can be seen that the manner of item switching may differ when different users perform the item switching gesture. That is, when the item switching gesture is detected, the first face feature information in the video frame image is first identified and matched against the pre-stored face feature information. If the pre-stored face feature information includes target face feature information whose matching degree with the first face feature information is greater than the preset threshold, the user indicated by the first face feature information can be identified as a historical user, so item filtering can be performed and the user can efficiently browse preferred items. In other words, deciding whether to filter items according to face feature information reduces the user's operation steps during remote control, thereby saving computing resources and display resources.
In some embodiments, step 103 (determining whether to perform item filtering based on the determination result) may specifically include:
and determining to perform item filtering in response to determining that the pre-stored face feature information includes target face feature information. And after determining to filter the items, determining a target user identifier corresponding to the target face feature information according to a pre-stored user list, determining an item to be filtered corresponding to the target user identifier, and filtering the item to be filtered corresponding to the target user identifier.
Here, the user list may include correspondence of face feature information and user identification.
In some embodiments, when the pre-stored face feature information includes the target face feature information, the user currently performing the item switching gesture may be characterized as a history user, so that item filtering may be performed.
In some embodiments, a user identifier may be used to indicate a user, i.e., one user identifier indicates one user.
In some embodiments, the user list records the correspondence between face feature information and user identifiers, where one user identifier may correspond to a plurality of pieces of face feature information. For example, when performing the item switching gesture, the user may be directly facing the execution subject, or may be performing the gesture at an angle to it. The collected face feature information may therefore differ slightly, so one user identifier may correspond to several pieces of face feature information. Correspondingly, if the matching degree between the first face feature information and any one of the pieces of face feature information corresponding to the target user identifier is greater than the preset threshold, the user performing the current item switching gesture can be identified as the target user (the target user identifier indicates the target user). Of course, one user identifier may also correspond to a single piece of face feature information that records the user's face features captured under different conditions.
In some embodiments, the user list records the user identifiers corresponding to the historical users of the execution subject. That is, after a new user performs the item switching gesture, a new user identifier may be created for that user, and the face feature information in the video frame image may be stored in the user list in correspondence with the identifier. In other words, the user list establishes the correspondence between face feature information and user identifiers, and whether the user currently performing the gesture operation is a historical user can be determined through the user list.
In some embodiments, when the user list includes the target user identifier, the user may be characterized as a history user, so that an item to be filtered corresponding to the user identifier may be determined, and item filtering may be performed according to the determined item to be filtered.
It can be seen that filtering the determined items to be filtered, i.e., the items the user does not prefer, lets the user quickly switch to the items they want to browse. This reduces the operations needed to switch to a desired item, saves the time spent switching, and lets the user quickly reach preferred items.
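The user-list lookup and filtering described above can be sketched as follows. The storage shapes and the `matching_degree` measure are assumptions for illustration; the patent only specifies the correspondence between face feature information and user identifiers.

```python
def matching_degree(a, b):
    # Illustrative stand-in for a real face-feature comparison measure.
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def lookup_target_user(first_feature, user_list, threshold=0.9):
    """user_list maps user_id -> list of stored feature vectors (the patent
    notes one identifier may correspond to several feature records).
    Returns the matching user_id, or None for an unknown user."""
    for user_id, stored in user_list.items():
        if any(matching_degree(first_feature, f) > threshold for f in stored):
            return user_id
    return None

def filter_items(play_order, to_filter_by_user, user_id):
    # Remove the user's items to be filtered; an unknown user gets no filtering.
    to_filter = to_filter_by_user.get(user_id, set())
    return [item for item in play_order if item not in to_filter]
```

When `lookup_target_user` returns `None`, no filtering is applied, matching the embodiment's handling of new users.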
In some embodiments, the item to be filtered corresponding to the target user identification may be determined by:
and acquiring an item playing record corresponding to the target user identifier, and determining the item to be filtered corresponding to the target user identifier according to the times that the single playing time length of each item is smaller than a preset time length threshold value.
Here, the item play record may include the single-play duration of an item, the number of times the item has been played, the time at which the item was played, and the like.
In some embodiments, when the item is a television program, the play duration may be understood as the duration of playing the television program, and when the item is a game, the play duration may be understood as the duration of playing the game by the user.
In some embodiments, the preset duration threshold may be set according to practical situations, and may be 10 seconds when the item is a television program, for example. Of course, the specific value of the preset duration threshold is not limited, and only needs to be reasonably set according to actual conditions.
In some embodiments, if an item's single-play duration is less than the preset duration threshold, this may indicate that after switching to the item, the user stayed only briefly and is likely not interested in it; that is, after switching to the item, the user quickly switched to the next item.
In some embodiments, if the number of times a certain item's single-play duration is less than the preset duration threshold exceeds a preset count threshold (the preset count threshold may be set according to the actual situation and its specific value is not limited here; for example, it may be 3), that is, if the target user has repeatedly switched to the item and then quickly switched away, this may indicate that the target user does not like the item; in other words, the item is an item to be filtered corresponding to the target user identifier.
For example: if the target user has viewed item B 5 times, 4 of which lasted no more than 10 seconds, this may indicate that the target user is not interested in the content played by item B; that is, item B may be determined as an item to be filtered corresponding to the target user identifier.
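The rule described above can be sketched as follows; the 10-second duration threshold and count threshold of 3 are the example values mentioned in the embodiments, and the record shape is an assumption.

```python
def items_to_filter(play_records, duration_threshold=10, count_threshold=3):
    """play_records maps item -> list of single-play durations (seconds).
    An item is marked for filtering once the number of plays shorter than
    the duration threshold exceeds the count threshold."""
    marked = []
    for item, durations in play_records.items():
        short_plays = sum(1 for d in durations if d < duration_threshold)
        if short_plays > count_threshold:
            marked.append(item)
    return marked
```

With the example from the text (item B played 5 times, 4 of them under 10 seconds), 4 short plays exceed the count threshold of 3, so item B is marked for filtering.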
In some embodiments, after the items to be filtered are filtered, the first item may be determined and presented according to the item switch direction indicated by the item switch gesture.
Here, the first item is the first item, proceeding from the current item in the item switching direction, that is not among the items to be filtered.
In some embodiments, the first item may be understood as an item of interest to the user.
In some embodiments, the first item presented may be the item of greater interest to the user, as items not of interest to the user are filtered.
In some embodiments, item playback may follow an item play sequence table. For ease of understanding, refer to FIG. 2, which shows part of an item play sequence table; that is, the normal item switching order is item A, item B, item C, item D, item E, item F. For example, suppose the currently playing item is item C. If the user is a new user and performs the gesture for switching to the next item, item D should be played; correspondingly, if the user performs the gesture for switching to the previous item, item B should be played. If the user is a historical user whose items to be filtered, as determined through the user identifier, include item B, item D, and item E, then after the user performs the gesture for switching to the next item, item F may be played; here, item F may be understood as the first item.
It can be seen that, since the items to be filtered are removed, the new item play sequence consists of items the user is interested in, so the user can quickly locate the items they want to browse, improving the browsing experience.
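The first-item selection against the play sequence of FIG. 2 can be sketched as follows; end-of-list behavior (here, no wrap-around) is an assumption the patent does not specify.

```python
def first_unfiltered_item(play_order, current, direction, to_filter):
    """direction: +1 for 'next', -1 for 'previous'. Returns the first item
    from the current one, in the switching direction, that is not among the
    items to be filtered; None if the end of the sequence is reached."""
    idx = play_order.index(current) + direction
    while 0 <= idx < len(play_order):
        if play_order[idx] not in to_filter:
            return play_order[idx]
        idx += direction
    return None
```

For the historical user in the example (current item C, items B, D, and E to be filtered), switching to the next item yields item F; for a new user with nothing filtered, it yields item D.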
In some embodiments, based on the play duration of the current playback of the first item, it is determined whether to add the first item to the items to be filtered corresponding to the target user identifier.
In some embodiments, in response to determining that the play duration of the current playback of the first item is less than the preset duration threshold, the historical play record corresponding to the target user identifier is acquired, and the number of times the first item's single-play duration has been less than the preset duration threshold is determined from the item play record; in response to determining that this number reaches the preset value, the first item is determined as an item to be filtered corresponding to the target user identifier.
Here, the above-described item play record includes a single play duration of the item.
In some embodiments, the preset value may be set according to the actual situation.
In some embodiments, when the current play duration of the first item is less than the preset duration threshold, the target user's historical play record for the first item may be acquired, and the number of times the first item's play duration was less than the preset duration threshold may be determined from that record. If this number equals the preset value, it may indicate that the target user has repeatedly switched to the first item and then immediately switched to another item, i.e., that the user is not interested in the first item; the first item may therefore be determined as an item to be filtered. As a result, the first item will not be preferentially displayed to the target user the next time the user switches items.
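This incremental update can be sketched as follows; the record shape and threshold values are illustrative assumptions consistent with the earlier example.

```python
def record_play(play_records, to_filter, item, duration,
                duration_threshold=10, preset_value=3):
    """Append the current play duration to the item's record; once the
    number of short plays of the item reaches the preset value, add it to
    the items to be filtered."""
    play_records.setdefault(item, []).append(duration)
    if duration < duration_threshold:
        short = sum(1 for d in play_records[item] if d < duration_threshold)
        if short >= preset_value:
            to_filter.add(item)
```

Only a short current playback triggers the check, matching the embodiment: the historical record is consulted in response to the current play duration falling below the threshold.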
In some embodiments, step 103 (determining whether to perform item filtering based on the determination result) may specifically further include: and in response to determining that the pre-stored face feature information does not comprise the target face feature information, determining that the items are not filtered.
In some embodiments, after determining that the item is not filtered, a second item may also be determined and presented according to the item switch direction indicated by the item switch gesture.
Here, the second item is the item adjacent to the current item in the item switching direction.
In some embodiments, after it is determined that items are not filtered, the user is likely a new user whose item browsing preferences have not yet been determined; therefore, items may be played sequentially in the preset item play order. As shown in FIG. 2, if the currently playing item is item C and the user performs the gesture for switching to the next item, item D may be presented; here, item D may be understood as the second item.
In some embodiments, a corresponding first user identifier may be added for the first face feature information.

In some embodiments, the first user identifier and the current play duration of the second item may be stored in correspondence.
In some embodiments, the item may be a television program.
That is, the user currently performing the predefined item switching gesture may be recorded as a historical user, and the user's browsing records may be kept so that the user's browsing preferences can be analyzed, allowing the user to switch quickly to preferred items in subsequent item switching.
It can be seen that when there is no target face feature information matching the first face feature information, a first user identifier may be added for the first face feature information and stored in correspondence with the item play duration, i.e., an item play record corresponding to the first user identifier is established for later determining the items to be filtered for that identifier. In this way, the user's browsing preferences are determined during normal item switching; in other words, the user does not need to explicitly mark preferred or unpreferred items. Unpreferred items are identified during ordinary switching and automatically filtered the next time the user browses, so the user can quickly locate desired items; the steps and time spent skipping unpreferred items are saved, the operations needed to reach a desired item are reduced, and computing resources and display resources are saved, thereby improving the user's browsing experience.
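The new-user path described above can be sketched as follows; the identifier scheme and storage shapes are assumptions for illustration.

```python
import itertools

_id_counter = itertools.count(1)

def register_new_user(user_list, play_records, first_feature):
    """Create a first user identifier for an unmatched face and store the
    first face feature information under it."""
    user_id = f"user-{next(_id_counter)}"
    user_list[user_id] = [first_feature]
    play_records[user_id] = {}
    return user_id

def record_second_item_play(play_records, user_id, item, duration):
    # Store the current play duration of the second item in correspondence
    # with the first user identifier, starting its item play record.
    play_records[user_id].setdefault(item, []).append(duration)
```

On the user's next item switching gesture, the stored feature matches, the identifier is found, and the accumulated play record can drive the filtering described earlier.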
With further reference to fig. 3, as an implementation of the method shown in the foregoing figures, the present disclosure provides an embodiment of a remote control device, which corresponds to the embodiment of the remote control method shown in fig. 1, and which is particularly applicable to various electronic apparatuses.
As shown in fig. 3, the remote control apparatus of the present embodiment includes: an acquisition unit 301 for acquiring a video frame image including an item switch gesture image in response to detecting a predefined item switch gesture; a first determining unit 302, configured to determine, according to a degree of matching between first face feature information and pre-stored face feature information in a video frame image, whether the pre-stored face feature information includes target face feature information, where the degree of matching between the target face feature information and the first face feature information is greater than a preset threshold; a second determining unit 303, configured to determine whether to perform item filtering based on a determination result.
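The matching step performed by the first determining unit can be illustrated with a hedged sketch. The patent does not specify the matching metric; cosine similarity is one plausible choice, and the threshold value here is arbitrary:

```python
import math

def matching_degree(a, b):
    """Cosine similarity between two face feature vectors (one possible metric)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def find_target_features(first_features, stored_features, threshold=0.9):
    """Return the pre-stored feature vector whose matching degree with the
    first face feature information exceeds the preset threshold, if any."""
    best, best_score = None, threshold
    for features in stored_features:
        score = matching_degree(first_features, features)
        if score > best_score:
            best, best_score = features, score
    return best

stored = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
target = find_target_features([0.98, 0.05, 0.0], stored)  # close to the first entry
```

A `None` result corresponds to the branch where no target face feature information exists and item filtering is not performed.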
In some embodiments, the second determining unit 303 is specifically further configured to: in response to determining that the target face feature information is included in the pre-stored face feature information, determine to perform item filtering.
In addition, the above apparatus further includes a third determining unit 304, specifically configured to: determine a target user identifier corresponding to the target face feature information according to a pre-stored user list, wherein the user list comprises correspondences between face feature information and user identifiers; determine the items to be filtered corresponding to the target user identifier; and filter out the items to be filtered corresponding to the target user identifier.
In some embodiments, the third determining unit 304 is specifically further configured to: acquire an item play record corresponding to the target user identifier, wherein the item play record comprises a single play duration for each item; and determine the items to be filtered corresponding to the target user identifier according to the number of times the single play duration of each item is smaller than a preset duration threshold.
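The rule just described — an item becomes "to be filtered" once its single play duration has fallen below the duration threshold a preset number of times — might be sketched as follows, with both threshold values chosen arbitrarily for illustration:

```python
def items_to_filter(play_record, duration_threshold=10.0, count_threshold=3):
    """From a play record of (item_id, single_play_duration) pairs, return the
    items whose single play duration fell below the duration threshold at
    least count_threshold times."""
    short_play_counts = {}
    for item_id, duration in play_record:
        if duration < duration_threshold:
            short_play_counts[item_id] = short_play_counts.get(item_id, 0) + 1
    return {item for item, n in short_play_counts.items() if n >= count_threshold}

# Item A was abandoned quickly three times; B was watched at length; C only once.
record = [("A", 3.0), ("A", 5.0), ("A", 2.0), ("B", 120.0), ("C", 4.0)]
filtered = items_to_filter(record)
```

Counting short plays rather than reacting to a single one avoids filtering an item the user merely skipped once by accident.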
In some embodiments, the second determining unit 303 is specifically further configured to: determine and display a first item according to the item switching direction indicated by the item switching gesture, wherein the first item is an item that is not among the items to be filtered, starting from the current item along the item switching direction.
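Selecting the first item — the nearest item in the indicated switching direction that is not among the items to be filtered — could look like the sketch below, assuming the items form an ordered list (the patent does not specify the ordering structure):

```python
def next_unfiltered_item(playlist, current_index, direction, filtered):
    """Walk from the current item in the switching direction (+1 or -1) and
    return the index of the first item not in the to-be-filtered set."""
    i = current_index + direction
    while 0 <= i < len(playlist):
        if playlist[i] not in filtered:
            return i
        i += direction
    return None  # no unfiltered item remains in that direction

playlist = ["A", "B", "C", "D", "E"]
idx = next_unfiltered_item(playlist, 2, +1, filtered={"D"})  # from C, skips D
```

Here a forward gesture from item C skips the filtered item D and lands on E directly, which is the behavior fig. 2 of the patent illustrates.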
In some embodiments, the second determining unit 303 is specifically further configured to: and determining whether to determine the first item as an item to be filtered corresponding to the target user identifier based on the playing time length of the first item.
In some embodiments, the second determining unit 303 is specifically further configured to: in response to determining that the playing duration of the first item in the current play is smaller than a preset duration threshold, acquire a historical play record corresponding to the target user identifier, wherein the play record comprises a single play duration for each item; determine, according to the play record, the number of times the single play duration of the first item has been smaller than the preset duration threshold; and in response to determining that this number of times reaches a preset value, determine the first item as an item to be filtered corresponding to the target user identifier.
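This update step — after a short play of the first item, consulting the historical record and promoting the item to "to be filtered" once the short-play count reaches the preset value — might be sketched as follows (threshold values hypothetical):

```python
def update_filter_after_play(history, filtered, item_id, play_duration,
                             duration_threshold=10.0, count_threshold=3):
    """Append this play to the history; if the single play duration was below
    the threshold, count the item's short plays and add it to the filter set
    once the count reaches the preset value."""
    history.append((item_id, play_duration))
    if play_duration >= duration_threshold:
        return  # a long play never marks the item as disliked
    short_plays = sum(1 for i, d in history
                      if i == item_id and d < duration_threshold)
    if short_plays >= count_threshold:
        filtered.add(item_id)

history, filtered = [("D", 2.0), ("D", 4.0)], set()
update_filter_after_play(history, filtered, "D", 3.0)  # third short play of D
```

After the third short play, item D joins the to-be-filtered set and will be skipped by subsequent switch gestures from the same user.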
In some embodiments, the second determining unit 303 is specifically further configured to: determine not to perform item filtering in response to determining that the pre-stored face feature information does not include the target face feature information; and determine and display a second item according to the item switching direction indicated by the item switching gesture, wherein the second item is the next item from the current item along the item switching direction.
In some embodiments, the second determining unit 303 is specifically further configured to: and adding a first user identification corresponding to the first face characteristic information.
In some embodiments, the second determining unit 303 is specifically further configured to: and correspondingly storing the first user identifier and the current playing time length of the second item.
In some embodiments, the items include television programs.
Referring to fig. 4, fig. 4 illustrates an exemplary system architecture in which a remote control method of an embodiment of the present disclosure may be applied.
As shown in fig. 4, the system architecture may include terminal devices 401, 402, 403, a network 404, and a server 405. The network 404 may be used as a medium to provide communication links between the terminal devices 401, 402, 403 and the server 405. The network 404 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The terminal devices 401, 402, 403 may interact with the server 405 through the network 404 to receive or send messages and the like. Various client applications, such as web browser applications, search applications, and news and information applications, may be installed on the terminal devices 401, 402, 403. A client application in the terminal devices 401, 402, 403 may receive a user's instruction and perform the corresponding function, for example, adding corresponding content to displayed information according to the user's instruction.
The terminal devices 401, 402, 403 may be hardware or software. When the terminal devices 401, 402, 403 are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, desktop computers, and the like. When the terminal devices 401, 402, 403 are software, they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., software or software modules for providing distributed services) or as a single piece of software or software module. No specific limitation is imposed herein.
The server 405 may be a server that provides various services, for example, one that receives information acquisition requests sent by the terminal devices 401, 402, 403, obtains the presentation information corresponding to each request in various ways, and sends the data related to the presentation information to the terminal devices 401, 402, 403.
It should be noted that the remote control method provided by the embodiments of the present disclosure may be performed by a terminal device, in which case the remote control apparatus may be provided in the terminal devices 401, 402, 403. Alternatively, the remote control method may be executed by the server 405, in which case the remote control apparatus may be provided in the server 405.
It should be understood that the number of terminal devices, networks and servers in fig. 4 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to fig. 5, a schematic diagram of a configuration of an electronic device (e.g., a terminal device or server in fig. 4) suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 5 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 5, the electronic device may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage means 508 into a random access memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic device 500 are also stored. The processing means 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 shows an electronic device having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, a computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with computer readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including but not limited to electromagnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium other than a computer readable storage medium that can send, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to detecting the predefined item switch gesture, obtaining a video frame image comprising an item switch gesture image; determining whether the pre-stored face feature information comprises target face feature information or not according to the matching degree of the first face feature information and the pre-stored face feature information in the video frame image, wherein the matching degree of the target face feature information and the first face feature information is larger than a preset threshold value; based on the determination result, it is determined whether to perform item filtering.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or combinations thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by software or by hardware. In some cases, the name of a unit does not limit the unit itself; for example, the acquisition unit 301 may also be described as "a unit that acquires a video frame image including an item switching gesture image".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is merely of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. Persons skilled in the art will appreciate that the scope of the disclosure involved herein is not limited to technical solutions formed by the specific combinations of the above technical features, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by substituting the above features with technical features having similar functions disclosed in the present disclosure (but not limited thereto).
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (13)

1. A remote control method, comprising:
in response to detecting the predefined item switch gesture, obtaining a video frame image comprising an item switch gesture image;
determining whether the pre-stored face feature information comprises target face feature information according to the degree of matching between first face feature information in the video frame image and the pre-stored face feature information, wherein the degree of matching between the target face feature information and the first face feature information is greater than a preset threshold value, and the user indicated by the pre-stored face feature information is a user for whom information on disliked programs has been recorded;
determining whether to perform item filtering based on the determination result;
wherein determining whether to perform item filtering based on the determination result comprises: determining to perform item filtering in response to determining that the target face feature information is included in the pre-stored face feature information; or
determining not to perform item filtering in response to determining that the target face feature information is not included in the pre-stored face feature information.
2. The method of claim 1, wherein if item filtering is determined, the method further comprises:
determining a target user identifier corresponding to the target face feature information according to a pre-stored user list, wherein the user list comprises the corresponding relation between the face feature information and the user identifier;
determining items to be filtered corresponding to the target user identification;
and filtering the items to be filtered corresponding to the target user identification.
3. The method of claim 2, wherein the item to be filtered corresponding to the target user identity is determined by:
acquiring a project play record corresponding to the target user identifier, wherein the project play record comprises a project single play duration;
and determining the items to be filtered corresponding to the target user identifier according to the number of times the single play duration of each item is smaller than a preset duration threshold.
4. A method according to claim 3, characterized in that the method further comprises:
determining and displaying a first item according to the item switching direction indicated by the item switching gesture, wherein the first item is an item that is not among the items to be filtered, starting from the current item along the item switching direction.
5. The method according to claim 4, wherein the method further comprises:
and determining whether to determine the first item as an item to be filtered corresponding to the target user identifier based on the playing duration of the first item in the current play.
6. The method of claim 5, wherein determining whether to determine the first item as the item to be filtered corresponding to the target user identifier based on a playing duration of the first item played this time, comprises:
in response to determining that the playing duration of the first item in the current play is smaller than a preset duration threshold, acquiring a historical play record corresponding to the target user identifier, wherein the play record comprises a single play duration for each item;
determining, according to the play record, the number of times the single play duration of the first item has been smaller than the preset duration threshold;
and in response to determining that the number of times the single play duration of the first item has been smaller than the preset duration threshold reaches a preset value, determining the first item as an item to be filtered corresponding to the target user identifier.
7. The method of claim 1, wherein if it is determined that the item is not filtered, the method further comprises:
and determining and displaying a second item according to the item switching direction indicated by the item switching gesture, wherein the second item is the next item from the current item along the item switching direction.
8. The method of claim 7, wherein the method further comprises: and adding a first user identification corresponding to the first face characteristic information.
9. The method of claim 8, wherein the method further comprises:
and correspondingly storing the first user identifier and the current playing time length of the second item.
10. The method of claim 1, wherein the items comprise television programs.
11. A remote control apparatus, comprising:
an acquisition unit configured to acquire a video frame image including an item switch gesture image in response to detection of a predefined item switch gesture;
a first determining unit, configured to determine whether the pre-stored face feature information comprises target face feature information according to the degree of matching between first face feature information in the video frame image and the pre-stored face feature information, wherein the degree of matching between the target face feature information and the first face feature information is greater than a preset threshold value, and the user indicated by the pre-stored face feature information is a user for whom information on disliked programs has been recorded;
A second determination unit configured to determine whether to perform item filtering based on a determination result;
the second determining unit is specifically configured to: determine to perform item filtering in response to determining that the target face feature information is included in the pre-stored face feature information; or
determine not to perform item filtering in response to determining that the target face feature information is not included in the pre-stored face feature information.
12. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs,
when executed by the one or more processors, causes the one or more processors to implement the method of any of claims 1-10.
13. A computer readable medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any one of claims 1-10.
CN202110207294.5A 2021-02-23 2021-02-23 Remote control method and device and electronic equipment Active CN113014980B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110207294.5A CN113014980B (en) 2021-02-23 2021-02-23 Remote control method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN113014980A CN113014980A (en) 2021-06-22
CN113014980B true CN113014980B (en) 2023-07-18

Family

ID=76385763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110207294.5A Active CN113014980B (en) 2021-02-23 2021-02-23 Remote control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113014980B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108108649A (en) * 2016-11-24 2018-06-01 腾讯科技(深圳)有限公司 Auth method and device
CN109214301A (en) * 2018-08-10 2019-01-15 百度在线网络技术(北京)有限公司 Control method and device based on recognition of face and gesture identification
CN109829360A (en) * 2018-12-15 2019-05-31 深圳壹账通智能科技有限公司 Electrical equipment control method, device, electrical equipment and storage medium
CN110389662A (en) * 2019-06-19 2019-10-29 深圳壹账通智能科技有限公司 Content displaying method, device, storage medium and the computer equipment of application program
CN110611734A (en) * 2019-08-08 2019-12-24 深圳传音控股股份有限公司 Interaction method and terminal
WO2021004186A1 (en) * 2019-07-11 2021-01-14 成都市喜爱科技有限公司 Face collection method, apparatus, system, device, and medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant