Disclosure of Invention
The present application provides a method, a device, and a system for querying video metadata that can be matched against a proxy server cache, which helps to improve the query efficiency of video metadata.
In a first aspect, a method for querying video metadata is provided, including:
the client sends a video metadata query request to the proxy server, where the video metadata query request contains a filtering factor generated according to a filtering condition for the client; and the client receives the video metadata returned by the proxy server, where the video metadata is obtained from the cache by the proxy server according to the video metadata query request.
Because the video metadata query request contains a filtering factor generated according to the filtering condition for the client, the query request can be matched against the proxy server cache.
It should be noted that the filtering condition for the client includes not only the filtering condition obtained by the client itself, but also the filtering condition set by the video system for the client.
Based on the first aspect, in one possible design, the client acquires a filtering factor from the video system, where the filtering factor is generated by the video system according to a filtering condition for the client; the filtering conditions for the client comprise filtering conditions set by the video system for the client and filtering conditions reported by the client to the video system.
Because the filtering factor is generated by the video system, the program code on the client side does not need to be modified when the filtering conditions for the client change, which improves the extensibility of the client.
Based on the first aspect, in one possible design, the client receives the filtering factor delivered by the video system through a first interface between the client and the video system.
Based on the first aspect, in one possible design, the client reports a new filtering condition to the video system through a second interface between the client and the video system, and receives a new filtering factor returned by the video system through the first interface between the client and the video system, where the new filtering factor is generated from the new filtering condition.
Based on the first aspect, in one possible design, the first interface is a heartbeat interface, or the first interface is a login interface.
In addition, it should be noted that the first interface may also be a preset new interface between the client and the video system.
In a second aspect, a method for querying video metadata is provided, including:
a proxy server receives a video metadata query request sent by a client, where the video metadata query request contains a filtering factor generated according to a filtering condition for the client; the proxy server queries the cache for video metadata matching the video metadata query request, and returns the queried video metadata to the client.
Because the video metadata query request includes a filtering factor generated from the filtering condition for the client, a proxy server can be added to the existing video metadata query system on the basis of such filtering-factor-based requests, which improves the query efficiency of video metadata.
Based on the second aspect, in one possible design, if it is determined that no video metadata matching the video metadata query request is stored in the cache, the proxy server forwards the video metadata query request to the video system, receives the video metadata returned by the video system, and returns it to the client.
In a possible design according to the second aspect, the proxy server stores the video metadata obtained from the video system in a cache in correspondence with the video metadata query request.
Because the proxy server stores the video metadata acquired from the video system in the cache in correspondence with the video metadata query request, when the proxy server receives the same query request again it can obtain the video metadata directly from its own cache without querying the video system. The video system thus offloads part of the metadata query workload, which improves its processing efficiency.
In a third aspect, a method for generating a filter factor is provided, including:
the video system receives the filter condition reported by the client;
and the video system generates a filter factor according to the filter condition reported by the client and the filter condition set by the video system for the client, and sends the filter factor to the client.
On the basis of the third aspect, in one possible design, the video system issues the filtering factor to the client through a first interface with the client.
It should be noted that the first interface may be a heartbeat interface, a login interface, or a preset newly added interface between the video system and the client.
On the basis of the third aspect, in a possible design, after receiving a new filtering condition reported by a client, the video system generates a new filtering factor according to the new filtering condition reported by the client, and sends the new filtering factor to the client.
In a fourth aspect, a video metadata query device is provided, including a sending module and a receiving module, where the sending module is configured to send a video metadata query request to a proxy server, the video metadata query request containing a filtering factor generated according to a filtering condition for the client; and the receiving module is configured to receive the video metadata returned by the proxy server, where the video metadata is obtained from the cache by the proxy server according to the video metadata query request.
Based on the fourth aspect, in one possible design, the receiving module obtains a filtering factor from the video system before the sending module sends the video metadata query request to the proxy server, where the filtering factor is generated by the video system according to a filtering condition for the client; the filtering conditions for the client comprise filtering conditions set by the video system for the client and filtering conditions reported by the client to the video system.
Based on the fourth aspect, in one possible design, the receiving module receives the filtering factor issued by the video system from a first interface with the video system.
Based on the fourth aspect, in one possible design, the sending module reports a new filtering condition to the video system through a second interface with the video system; the receiving module receives, through the first interface, a new filtering factor returned by the video system, where the new filtering factor is generated from the new filtering condition.
In a possible design based on the fourth aspect, the first interface is a heartbeat interface or the first interface is a login interface.
In a fifth aspect, a video metadata query device is provided, including a receiving module, a processing module, and a sending module, where the receiving module is configured to receive a video metadata query request sent by a client, the request containing a filtering factor generated according to a filtering condition for the client; the processing module is configured to query the cache for video metadata matching the request according to the filtering factor; and the sending module is configured to return the video metadata queried by the processing module to the client.
Based on the fifth aspect, in a possible design, when the processing module determines that video metadata matching the video metadata query request is not stored in the cache, the sending module forwards the video metadata query request to the video system; the receiving module is also used for receiving video metadata returned by the video system; the sending module is also used for returning the video metadata returned by the video system received by the receiving module to the client.
In a possible design based on the fifth aspect, the processing module is further configured to store the video metadata acquired by the receiving module from the video system in a cache in correspondence with the video metadata query request.
In a sixth aspect, a video metadata query system is provided, comprising a device capable of implementing the fourth aspect or any one of its possible designs, and a device capable of implementing the fifth aspect or any one of its possible designs.
In a seventh aspect, an embodiment of the present application further provides a video metadata query device, including a processor, a memory, and a communication interface, where the communication interface is used to receive and send information, the memory is used to store a software program and received or sent data information, and the processor is used to read the software program and data stored in the memory and control the communication interface to implement the method provided in the first aspect or any implementation manner of the first aspect.
In an eighth aspect, an embodiment of the present application further provides a video metadata query device, which includes a processor, a memory, and a communication interface, where the communication interface is used to receive and send information, the memory is used to store a software program and received or sent data information, and the processor is used to read the software program and data stored in the memory and implement the method provided in the second aspect or any implementation manner of the second aspect.
In a ninth aspect, embodiments of the present application further provide a computer storage medium, where the storage medium may be nonvolatile, that is, the content is not lost after power is turned off. The storage medium stores therein a software program which, when read and executed by one or more processors, may implement the method provided by the first aspect or any one of the implementations of the first aspect described above.
In a tenth aspect, embodiments of the present application further provide a computer storage medium, where the storage medium may be nonvolatile, that is, the content is not lost after power is turned off. The storage medium stores therein a software program which, when read and executed by one or more processors, may implement the method provided by the second aspect or any one of the implementations of the second aspect described above.
In an eleventh aspect, the present application further provides a computer program product containing instructions, which when run on a computer, causes the computer to perform the method of the above aspects.
Detailed Description
Embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
As shown in fig. 2, the video metadata query system used in the query method of this embodiment includes a client, a proxy server, and a video system. The client may be a terminal device installed with a video application or an application capable of browsing videos, such as a smart phone, tablet computer, notebook computer, desktop computer, set-top box, or smart television. The proxy server may be a Content Delivery Network (CDN) node or another server having caching and query functions.
The following describes the embodiments of the present application in detail with reference to fig. 2 as an example.
As shown in fig. 3, the method for querying video metadata according to the embodiment of the present application includes:
Step 300: a client sends a video metadata query request to a proxy server, where the video metadata query request contains a filtering factor generated according to a filtering condition for the client, and the proxy server receives the video metadata query request sent by the client.
It should be noted that the filtering condition for the client includes not only the filtering condition obtained by the client according to its own configuration or other external factors (such as geographic location), but also the filtering condition set by the video system for the client. Generally, the filtering conditions set by the video system for the client are filtering conditions on terminal attributes (such as content encoding format) and on user attributes (such as user group).
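As an illustration of the two sources of filtering conditions described above, the sketch below models them as simple key-value maps. All condition names and values here are invented for the example; the application does not prescribe a concrete representation.

```python
# Illustrative only: the concrete condition names below are assumptions.
client_side_conditions = {          # obtained by the client itself
    "geo_location": "EU-west",      # external factor
    "device_model": "stb-4k",       # client configuration
}
system_side_conditions = {          # set by the video system for this client
    "content_encoding": "h265",     # terminal-attribute condition
    "user_group": "premium",        # user-attribute condition
}
# The full filtering condition set for the client is the union of both sources.
filter_conditions = {**client_side_conditions, **system_side_conditions}
```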
In step 310, the proxy server queries the cache for video metadata matching the filtering factor according to the video metadata query request.
The cache stores the video metadata query request and the corresponding video metadata, and the storage space where the cache is located may be an internal storage space of the proxy server, or may be an external virtual storage space (e.g., a cloud storage space) or an external entity storage space (e.g., a hard disk) managed by the proxy server.
In step 320, the proxy server returns the queried video metadata to the client, and the client receives the video metadata returned by the proxy server.
In this embodiment, because the filtering factor included in the video metadata query request is generated from the filtering condition for the client, a proxy server can be introduced into the existing video metadata query system, which helps to improve the query efficiency of video metadata. Moreover, because the query request carries a filtering factor rather than the full filtering conditions carried in the prior art, the message body is smaller, data traffic is saved, and the client implementation is simplified.
In addition, if the proxy server determines that no video metadata matching the video metadata query request is stored in the cache, it forwards the query request to the video system; after receiving the request, the video system queries the matching video metadata and returns it to the proxy server, and the proxy server returns that video metadata to the client.
To avoid querying the video system repeatedly when the client sends the same request again, the proxy server stores the video metadata acquired from the video system in the cache in correspondence with the video metadata query request.
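The proxy-side behavior described above — serve from the cache on a hit, and on a miss forward to the video system and cache the result — can be sketched as follows. The class and method names, and the stub backend, are assumptions made for the example, not part of the application.

```python
class StubVideoSystem:
    """Stand-in for the backend video system (hypothetical API)."""
    def __init__(self):
        self.query_count = 0  # counts how often the backend is actually queried

    def query(self, query_request):
        self.query_count += 1
        return {"request": query_request, "titles": ["movie-a", "movie-b"]}


class MetadataProxy:
    """Sketch of the proxy logic: cache hit -> answer directly;
    cache miss -> forward to the video system, then store the result
    in correspondence with the query request."""
    def __init__(self, video_system):
        self.video_system = video_system
        self.cache = {}  # query request -> video metadata

    def handle_query(self, query_request):
        if query_request in self.cache:              # cache hit
            return self.cache[query_request]
        metadata = self.video_system.query(query_request)  # cache miss: forward
        self.cache[query_request] = metadata         # store for identical requests
        return metadata


backend = StubVideoSystem()
proxy = MetadataProxy(backend)
first = proxy.handle_query("factor=abc123")
second = proxy.handle_query("factor=abc123")  # served from the cache
```

Sending the same request twice reaches the video system only once, which is the offloading effect the embodiment describes.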
Because the video metadata query request contains the filtering factor, the query request corresponds to the video metadata, which ensures the accuracy of the video metadata query.
It should be understood that, in this embodiment, the filtering factor may be generated by the client, or generated by the video system and then sent to the client. When the filtering factor is generated by the client, the video system needs to send all the filtering conditions it has set for the client to the client in advance, and the client needs to send the generated filtering factor, together with the client-side filtering conditions used to generate it (obtained from its own configuration or other external factors), to the video system. The video system can then locate the video metadata matching the filtering factor when it receives a video metadata request that includes the factor.
When the filtering factor is generated by the video system, the client first reports the filtering conditions obtained from its own configuration or other external factors to the video system; the video system generates the filtering factor from the reported conditions together with the conditions it has set for the client, and then issues the factor to the client. Specifically, the video system issues the filtering factor through a first interface between itself and the client, and the client receives the factor through that interface. The first interface may be a heartbeat interface, a login interface, or a newly added interface between the client and the video system.
It should be noted that the login interface is usually the interface through which the user, when logging in at the client, sends authentication information such as a user name and password or an authentication code to the video system; the video system verifies through this information whether the client is authorized and returns the authorization result through the login interface, and the client then determines, based on the result, whether it can log in to the video system.
In addition, the heartbeat interface is generally the interface through which the client periodically sends a heartbeat request to the video system (a period is usually on the order of minutes, for example 5 minutes) to notify the video system that the client is online. Optionally, in this embodiment, when the filtering conditions for video metadata set at the client change (a new condition is added, or a condition previously reported to the video system is modified), the changed conditions may be carried in the heartbeat request; the video system may then generate a new filtering factor based on the changed conditions and issue it to the client in the heartbeat response.
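The heartbeat mechanism above — a periodic keep-alive message that optionally piggybacks changed filtering conditions — might be shaped as follows. The message fields are assumptions for illustration; the application does not define a wire format.

```python
def build_heartbeat_request(client_id, changed_conditions=None):
    """Build a periodic heartbeat message; when the client's filtering
    conditions have changed, they ride along on the request so the video
    system can return a new filtering factor in the heartbeat response."""
    request = {"type": "heartbeat", "client_id": client_id}
    if changed_conditions:
        request["changed_filter_conditions"] = changed_conditions
    return request

plain = build_heartbeat_request("stb-001")                      # nothing changed
with_change = build_heartbeat_request("stb-001", {"language": "fr"})
```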
In this embodiment, the client may also report a new filtering condition to the video system through a second interface between the client and the video system; the video system receives the new condition, generates a new filtering factor from it together with the conditions it has set for the client, and returns the new factor to the client through the first interface. The second interface may be a login interface, a heartbeat interface, or a preconfigured new interface. The first interface and the second interface may be the same interface or different interfaces.
In this embodiment, the filtering factor may be generated with a message digest algorithm: the input parameter is a character string composed of the client's filtering conditions, and the output is a hash value. The algorithm may be, for example, Message-Digest Algorithm 5 (MD5), a Secure Hash Algorithm (SHA), or the RACE Integrity Primitives Evaluation Message Digest (RIPEMD) algorithm.
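A minimal sketch of such a digest-based filtering factor, here using MD5 from Python's standard `hashlib`. Serializing the conditions in a canonical (key-sorted) order is an added assumption: it guarantees that the same set of conditions always yields the same factor, which the cache matching relies on.

```python
import hashlib

def generate_filter_factor(conditions):
    """Hash a canonical (key-sorted) serialization of the filtering
    conditions so that identical condition sets always produce the
    same filtering factor."""
    canonical = "&".join(f"{k}={conditions[k]}" for k in sorted(conditions))
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()

factor = generate_filter_factor({"content_encoding": "h264", "user_group": "gold"})
```

The same approach works with `hashlib.sha256` or another digest; only the factor length changes.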
It should be noted that, in this embodiment, the filtering condition for the client falls into two cases:
In case one, the filtering conditions for the client are the terminal-attribute filtering conditions (such as terminal type, video definition, content frame rate, content encoding format, and geographic location of the terminal) and the user-attribute filtering conditions (such as subnet operator, user group, region, parental control level, and language). These conditions are common to every query interface between the client and the video system and generally take the same values across most query interfaces, so to simplify the video metadata query request, the filtering factor is generated from these common conditions. When the user enters personalized filtering conditions at the client based on his own needs, such as a date or the type of the queried video metadata (e.g., movie), those conditions are carried directly in the video metadata query request; that is, the request contains the filtering factor and at least one filtering condition. This avoids carrying the conditions common to the query interfaces every time video metadata is queried. The common conditions include the conditions acquired by the client from its own configuration or other external factors and the conditions set by the video system for the client, so when a query request of this embodiment is stored in correspondence with video metadata, the proxy server can uniquely determine the video metadata matching the request.
In this case, when the user does not enter any filtering condition at the client, the video metadata query request sent by the client contains only the filtering factor.
In case two, the filtering conditions for the client are all the conditions set for the client, including not only the conditions common to the query interfaces but also the personalized conditions the user enters based on his own needs, such as a date or the type of the queried video metadata (e.g., movie); in this case the query request contains only the filtering factor. Because the filtering factor is generated from all the filtering conditions for the client, and the personalized conditions are entered by the user, the client must send the user-entered conditions to the video system in advance and have them stored there when they are set for the first time; when the user later enters the same conditions at the client, they need not be sent to the video system again.
In case one, the filtering factor may be generated by the client or by the video system; to improve the extensibility of the terminal, it is normally generated by the video system. In case two, the factor may likewise be generated by either side, but because the user-entered filtering conditions are needed to generate it, it is more convenient for the client to do so. When the filtering factor is generated for the first time, so that the video system can locate the video metadata matching it, the terminal carries the user-set filtering conditions in the video query request; when the terminal later sends the same video metadata query request, those conditions need not be carried again. The proxy server may store the filtering factors and the corresponding video metadata.
Based on the same concept, the embodiments of the present application also provide the video metadata query devices shown in fig. 4a and fig. 5a. Because the method corresponding to these devices is the video metadata query method shown in fig. 3, their implementation can refer to the implementation of that method, and repeated details are not described again.
As shown in fig. 4a, a video metadata query device according to an embodiment of the present application includes a sending module 400a and a receiving module 410a, where the sending module 400a is configured to send a video metadata query request to a proxy server, the video metadata query request containing a filtering factor generated according to a filtering condition for the client; and the receiving module 410a is configured to receive the video metadata returned by the proxy server, where the video metadata is obtained from the cache by the proxy server according to the video metadata query request.
In one possible design, the receiving module 410a obtains a filtering factor from the video system before the sending module 400a sends the video metadata query request to the proxy server, the filtering factor being generated by the video system according to the filtering condition for the client; the filtering conditions for the client comprise filtering conditions set by the video system for the client and filtering conditions reported by the client to the video system.
In one possible design, the receiving module 410a receives the filtering factor issued by the video system from a first interface with the video system.
In one possible design, the sending module 400a reports the new filtering condition to the video system through a second interface with the video system; the receiving module receives a new filtering factor returned by the video system through the first interface, and the new filtering factor is generated by the new filtering condition.
In one possible design, the first interface is a heartbeat interface or the first interface is a login interface.
It should be understood that the specific division of the modules described above is by way of example only and is not limiting of the present application.
When the video metadata query device shown in fig. 4a exists as a single physical device, its hardware structure may be as shown in fig. 4b. The sending module 400a and the receiving module 410a of fig. 4a may be implemented by a communication interface 420b. In addition, the device of fig. 4b may further include a processor 410b and a memory 430b, where the memory 430b is configured to store a software program and the data received and sent by the communication interface 420b, and the processor 410b is configured to read the software program and data stored in the memory 430b and control the communication interface to receive and send data, so as to implement the method shown in fig. 3.
The processor 410b may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to perform the related operations that implement the technical solution provided in this embodiment.
It should be noted that although the apparatus shown in fig. 4b only shows the processor 410b, the communication interface 420b and the memory 430b, in a specific implementation, a person skilled in the art will understand that the apparatus also contains other components necessary for normal operation. Also, it will be apparent to those skilled in the art that the apparatus may also contain hardware components to perform other additional functions, according to particular needs. Furthermore, it should be clear to a person skilled in the art that the apparatus may also comprise only the devices or modules necessary for implementing the embodiments of the application, and not necessarily all of the devices shown in fig. 4 b.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
As shown in fig. 5a, a video metadata query device according to an embodiment of the present application includes a receiving module 500a, a sending module 510a, and a processing module 520a, where the receiving module 500a is configured to receive a video metadata query request sent by a client, the video metadata query request containing a filtering factor generated according to a filtering condition for the client; the processing module 520a is configured to query the cache for video metadata matching the filtering factor according to the video metadata query request; and the sending module 510a is configured to return the video metadata queried by the processing module to the client.
In one possible design, when the processing module 520a determines that video metadata matching the video metadata query request is not stored in the cache, the sending module 510a forwards the video metadata query request to the video system; the receiving module 500a is further configured to receive video metadata returned by the video system; the sending module 510a is further configured to return the video metadata returned by the video system received by the receiving module 500a to the client.
In one possible design, the processing module 520a is further configured to store the video metadata acquired by the receiving module 500a from the video system in the cache in correspondence with the video metadata query request.
It should be understood that the specific division of the modules described above is by way of example only and is not limiting of the present application.
When the video metadata query device shown in fig. 5a exists as a single physical device, its hardware structure may be as shown in fig. 5b. The sending module 510a and the receiving module 500a of fig. 5a may be implemented by a communication interface 520b, and the processing module 520a may be implemented by a processor 510b. In addition, the device of fig. 5b may further include a memory 530b, where the memory 530b is configured to store a software program and the data received and sent by the communication interface 520b, and the processor 510b is configured to read the software program and data stored in the memory 530b and execute the method shown in fig. 3 of this embodiment.
The processor 510b may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to perform the related operations that implement the technical solution provided in this embodiment.
It should be noted that although the apparatus shown in fig. 5b only shows the processor 510b, the communication interface 520b and the memory 530b, in a specific implementation, a person skilled in the art will understand that the apparatus also comprises other components necessary for normal operation. Also, it will be apparent to those skilled in the art that the apparatus may also contain hardware components to perform other additional functions, according to particular needs. Furthermore, it should be clear to a person skilled in the art that the apparatus may also comprise only the devices or modules necessary for implementing the embodiments of the application, and not necessarily all of the devices shown in fig. 5 b.
The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wired means (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, or microwave).
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.