CN107169019B - Video metadata query method, device and system - Google Patents

Video metadata query method, device and system

Info

Publication number
CN107169019B
CN107169019B CN201710221659.3A
Authority
CN
China
Prior art keywords
video
client
video metadata
filtering
video system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710221659.3A
Other languages
Chinese (zh)
Other versions
CN107169019A (en)
Inventor
Yu Yong (余勇)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201710221659.3A
Publication of CN107169019A
Application granted
Publication of CN107169019B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70 Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/73 Querying
    • G06F16/735 Filtering based on additional data, e.g. user or group profiles

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A video metadata query method, device, and system relate to the technical field of data processing. In the method, a client sends a video metadata query request containing a filtering factor to a proxy server; the proxy server queries, from a cache according to the filtering factor, video metadata matching the request and returns the queried video metadata to the client; and the client receives the video metadata returned by the proxy server. Because the video metadata query request contains the filtering factor, the request can be matched against the proxy server's cache, which, compared with the prior-art way of querying video metadata, improves query efficiency to a certain extent.

Description

Video metadata query method, device and system
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method, a device, and a system for querying video metadata.
Background
Video metadata is information that describes a video and helps a user understand its content, such as the name of the video, the category of the video, a synopsis of the video's content, and the like.
Currently, a video metadata query system, as shown in fig. 1, includes a client and a video system. In the prior art, the client sends a video metadata query request containing at least one filtering condition to the video system; after receiving the request, the video system acquires the video metadata matching the at least one filtering condition contained in the request and feeds the acquired video metadata back to the client.
Since the video system holds a large amount of video metadata, when many clients send video metadata query requests to the video system, the prior-art query method may result in low query efficiency.
Disclosure of Invention
The application provides a video metadata query method, device, and system in which query requests can be matched against a proxy server cache, which helps improve the query efficiency of video metadata.
In a first aspect, a method for querying video metadata is provided, including:
the client sends a video metadata query request to the proxy server, where the request contains a filtering factor generated according to filtering conditions for the client; and the client receives the video metadata returned by the proxy server, where the video metadata is acquired from a cache by the proxy server according to the video metadata query request.
Because the video metadata query request contains the filtering factor, and the filtering factor is generated according to the filtering conditions of the client, a query request built on the filtering factor can be matched against the proxy server's cache.
It should be noted that the filter condition for the client includes not only the filter condition obtained by the client itself, but also the filter condition set by the video system for the client.
Based on the first aspect, in one possible design, the client acquires a filtering factor from the video system, where the filtering factor is generated by the video system according to a filtering condition for the client; the filtering conditions for the client comprise filtering conditions set by the video system for the client and filtering conditions reported by the client to the video system.
Because the filtering factor is generated by the video system, the client-side program code does not need to be modified when the filtering conditions of the client change, which improves the extensibility of the client.
In a possible design based on the first aspect, the client receives the filtering factor delivered by the video system from a first interface with the video system.
Based on the first aspect, in one possible design, the client reports a new filtering condition to the video system through a second interface between the client and the video system, and receives a new filtering factor returned by the video system through a first interface between the client and the video system, where the new filtering factor is generated from the new filtering condition.
Based on the first aspect, in one possible design, the first interface is a heartbeat interface, or the first interface is a login interface.
In addition, it should be noted that the first interface may also be a preset, newly added interface between the client and the video system.
In a second aspect, a method for querying video metadata is provided, including:
a proxy server receives a video metadata query request sent by a client, where the request contains a filtering factor generated according to filtering conditions for the client; and the proxy server queries, from a cache according to the video metadata query request, the video metadata matching the request and returns the queried video metadata to the client.
Because the video metadata query request contains a filtering factor generated from the filtering conditions of the client, a proxy server can be added to the existing video metadata query system and serve such requests, which improves the query efficiency of video metadata.
Based on the second aspect, in one possible design, if the proxy server determines that video metadata matching the video metadata query request is not stored in the cache, it forwards the request to the video system, receives the video metadata returned by the video system, and returns that video metadata to the client.
In a possible design according to the second aspect, the proxy server stores the video metadata obtained from the video system in a cache in correspondence with the video metadata query request.
Because the proxy server stores the video metadata acquired from the video system in the cache in correspondence with the video metadata query request, when it receives the same query request again it can obtain the video metadata directly from its own cache without querying the video system. The proxy server thus takes over part of the video metadata query work, which improves the processing efficiency of the video system.
In a third aspect, a method for generating a filter factor is provided, including:
the video system receives the filter condition reported by the client;
and the video system generates a filter factor according to the filter condition reported by the client and the filter condition set by the video system for the client, and sends the filter factor to the client.
On the basis of the third aspect, in one possible design, the video system issues the filtering factor to the client through a first interface with the client.
It should be noted that the first interface may be a heartbeat interface, a login interface, or a preset newly added interface between the video system and the client.
On the basis of the third aspect, in a possible design, after receiving a new filtering condition reported by a client, the video system generates a new filtering factor according to the new filtering condition reported by the client, and sends the new filtering factor to the client.
In a fourth aspect, a video metadata query device is provided, including a sending module and a receiving module, where the sending module is configured to send a video metadata query request to a proxy server, the request contains a filtering factor, and the filtering factor is generated according to filtering conditions for the client; and the receiving module is configured to receive the video metadata returned by the proxy server, where the video metadata is acquired from the cache by the proxy server according to the video metadata query request.
Based on the fourth aspect, in one possible design, the receiving module obtains a filtering factor from the video system before the sending module sends the video metadata query request to the proxy server, where the filtering factor is generated by the video system according to a filtering condition for the client; the filtering conditions for the client comprise filtering conditions set by the video system for the client and filtering conditions reported by the client to the video system.
Based on the fourth aspect, in one possible design, the receiving module receives the filtering factor issued by the video system from a first interface with the video system.
Based on the fourth aspect, in a possible design, the sending module reports the new filtering condition to the video system through a second interface between the sending module and the video system; the receiving module receives a new filtering factor returned by the video system through the first interface, and the new filtering factor is generated by the new filtering condition.
In a possible design based on the fourth aspect, the first interface is a heartbeat interface or the first interface is a login interface.
In a fifth aspect, a video metadata query device is provided, including a receiving module, a processing module, and a sending module, where the receiving module is configured to receive a video metadata query request sent by a client, the request contains a filtering factor, and the filtering factor is generated according to filtering conditions for the client; the processing module is configured to query, from the cache according to the filtering factor, the video metadata matching the video metadata query request; and the sending module is configured to return the video metadata queried by the processing module to the client.
Based on the fifth aspect, in a possible design, when the processing module determines that video metadata matching the video metadata query request is not stored in the cache, the sending module forwards the video metadata query request to the video system; the receiving module is also used for receiving video metadata returned by the video system; the sending module is also used for returning the video metadata returned by the video system received by the receiving module to the client.
In a possible design based on the fifth aspect, the processing module is further configured to store the video metadata acquired by the receiving module from the video system in a cache in correspondence with the video metadata query request.
In a sixth aspect, there is provided a query system for video metadata, comprising a device capable of implementing any one of the possible designs provided in the fourth aspect or the fourth aspect, and a device capable of implementing any one of the possible designs provided in the fifth aspect or the fifth aspect.
In a seventh aspect, an embodiment of the present application further provides a video metadata query device, including a processor, a memory, and a communication interface, where the communication interface is used to receive and send information, the memory is used to store a software program and received or sent data information, and the processor is used to read the software program and data stored in the memory and control the communication interface to implement the method provided in the first aspect or any implementation manner of the first aspect.
In an eighth aspect, an embodiment of the present application further provides a video metadata query device, which includes a processor, a memory, and a communication interface, where the communication interface is used to receive and send information, the memory is used to store a software program and received or sent data information, and the processor is used to read the software program and data stored in the memory and implement the method provided in the second aspect or any implementation manner of the second aspect.
In a ninth aspect, embodiments of the present application further provide a computer storage medium, where the storage medium may be nonvolatile, that is, the content is not lost after power is turned off. The storage medium stores therein a software program which, when read and executed by one or more processors, may implement the method provided by the first aspect or any one of the implementations of the first aspect described above.
In a tenth aspect, embodiments of the present application further provide a computer storage medium, where the storage medium may be nonvolatile, that is, the content is not lost after power is turned off. The storage medium stores therein a software program which, when read and executed by one or more processors, may implement the method provided by the second aspect or any one of the implementations of the second aspect described above.
In an eleventh aspect, the present application further provides a computer program product containing instructions, which when run on a computer, causes the computer to perform the method of the above aspects.
Drawings
FIG. 1 is a schematic diagram of a prior art query system for video metadata;
FIG. 2 is a schematic diagram of a system for querying video metadata according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for querying video metadata according to an embodiment of the present disclosure;
fig. 4a and fig. 4b are schematic diagrams of a query device for video metadata according to an embodiment of the present application, respectively;
fig. 5a and fig. 5b are schematic diagrams of a querying device for video metadata according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
As shown in fig. 2, the video metadata query system to which the query method of the embodiment of the present application applies includes a client, a proxy server, and a video system. The client may be a terminal device on which a video application, or an application capable of browsing videos, is installed, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a set-top box, or a smart television. The proxy server may be a Content Delivery Network (CDN) or another server that has caching and query functions.
The following describes the embodiments of the present application in detail with reference to fig. 2 as an example.
As shown in fig. 3, the method for querying video metadata according to the embodiment of the present application includes:
In step 300, a client sends a video metadata query request to a proxy server, where the request contains a filtering factor generated according to filtering conditions for the client, and the proxy server receives the video metadata query request sent by the client.
It should be noted that the filtering conditions for the client include not only the filtering conditions obtained by the client from its own configuration or from other external factors (such as its geographic location), but also the filtering conditions set by the video system for the client. In general, the filtering conditions set by the video system for the client are filtering conditions on terminal attributes (such as the content encoding format) and filtering conditions on user attributes (such as the user group).
In step 310, the proxy server queries the video metadata matched with the filtering factor from the cache according to the video metadata query request.
The cache stores video metadata query requests and the corresponding video metadata. The storage space where the cache is located may be an internal storage space of the proxy server, or an external virtual storage space (e.g., a cloud storage space) or an external physical storage space (e.g., a hard disk) managed by the proxy server.
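The following minimal sketch illustrates one way such a cache could be abstracted so that an in-memory store or an external store managed by the proxy server can back it interchangeably; the class and method names are illustrative assumptions, not part of the embodiment.

```python
# Hypothetical sketch of the cache described above: the lookup key is derived
# from the video metadata query request, and the backing store may be the
# proxy server's own memory or an external storage space it manages.
import json
import os
from abc import ABC, abstractmethod
from typing import Optional


class MetadataCache(ABC):
    @abstractmethod
    def get(self, request_key: str) -> Optional[dict]: ...

    @abstractmethod
    def put(self, request_key: str, metadata: dict) -> None: ...


class InMemoryCache(MetadataCache):
    """Cache held in the proxy server's internal storage space."""

    def __init__(self) -> None:
        self._store: dict = {}

    def get(self, request_key: str) -> Optional[dict]:
        return self._store.get(request_key)

    def put(self, request_key: str, metadata: dict) -> None:
        self._store[request_key] = metadata


class DiskCache(MetadataCache):
    """Cache backed by an external physical storage space (here: local files)."""

    def __init__(self, directory: str) -> None:
        self._dir = directory
        os.makedirs(directory, exist_ok=True)

    def _path(self, request_key: str) -> str:
        # Keys are assumed to be filesystem-safe, e.g. hex digest strings.
        return os.path.join(self._dir, request_key + ".json")

    def get(self, request_key: str) -> Optional[dict]:
        try:
            with open(self._path(request_key), encoding="utf-8") as f:
                return json.load(f)
        except FileNotFoundError:
            return None

    def put(self, request_key: str, metadata: dict) -> None:
        with open(self._path(request_key), "w", encoding="utf-8") as f:
            json.dump(metadata, f)
```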
In step 320, the proxy server returns the queried video metadata to the client, and the client receives the video metadata returned by the proxy server.
In this embodiment of the application, since the filtering factor contained in the video metadata query request is generated from the filtering conditions of the client, a proxy server can be introduced into the existing video metadata query system and serve such requests, which helps improve the query efficiency of video metadata. In addition, because the request carries the filtering factor rather than the filtering conditions themselves, as in the prior art, the size of the message body is reduced, data traffic is saved, and the client implementation is simplified.
In addition, if the proxy server determines that no video metadata matching the video metadata query request is stored in the cache, it forwards the request to the video system; after receiving the request, the video system queries the video metadata to be returned to the client and returns the queried video metadata to the proxy server, and the proxy server returns that video metadata to the client.
To avoid querying the video system again when a client sends the same request, the proxy server stores the video metadata acquired from the video system in the cache in correspondence with the video metadata query request.
Because the video metadata query request contains the filtering factor, the request corresponds uniquely to the video metadata, which ensures the accuracy of the video metadata query.
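Putting steps 300 to 320 and the cache-miss path together, the sketch below shows one possible proxy-side flow; it assumes a cache object like the MetadataCache sketched above and a hypothetical query_video_system callable standing in for the interface to the video system, so it is an illustration rather than the embodiment itself.

```python
# Hypothetical proxy-side handling of a video metadata query request.
# The request field names and the video-system call are assumptions.
from typing import Callable


def handle_query(request: dict,
                 cache: "MetadataCache",
                 query_video_system: Callable[[dict], dict]) -> dict:
    # Step 300: the request arrives carrying the filtering factor and,
    # possibly, extra user-entered filtering conditions (see "case one" below).
    cache_key = request["filter_factor"]
    extra = request.get("filter_conditions")
    if extra:
        # Fold the personalised conditions into the key so the cached entry
        # corresponds uniquely to this video metadata query request.
        cache_key += "|" + "&".join(f"{k}={v}" for k, v in sorted(extra.items()))

    # Step 310: try to answer the request from the cache.
    metadata = cache.get(cache_key)
    if metadata is None:
        # Cache miss: forward the request to the video system, then store the
        # result so the same request can be served locally next time.
        metadata = query_video_system(request)
        cache.put(cache_key, metadata)

    # Step 320: return the queried (and now cached) video metadata to the client.
    return metadata
```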
It should be understood that, in this embodiment of the application, the filtering factor may be generated by the client, or may be generated by the video system and then delivered to the client. When the filtering factor is generated by the client, the video system needs to send, in advance, all the filtering conditions it has set for the client to the client, and the client needs to send to the video system both the generated filtering factor and the filtering conditions, obtained from its own configuration or from other external factors, that were used to generate it, so that the video system can query the video metadata matching the filtering factor when it receives a video metadata query request containing that factor.
When the filtering factor is generated by the video system, the client needs to report, in advance, the filtering conditions obtained from its own configuration or from other external factors to the video system; the video system generates the filtering factor according to the filtering conditions reported by the client and the filtering conditions it has set for the client, and then issues the filtering factor to the client. Specifically, the video system issues the filtering factor through a first interface between itself and the client, and the client receives the filtering factor through that first interface. The first interface may be a heartbeat interface, a login interface, or an interface newly added between the client and the video system.
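As a rough sketch of the video-system side of this exchange (all names, and the choice of MD5, are illustrative assumptions), the video system merges the conditions reported by the client with the conditions it set for that client, derives the filtering factor with a message digest, remembers the factor-to-conditions mapping so that later queries carrying only the factor can still be resolved, and returns the factor for delivery over the first interface:

```python
# Hypothetical video-system-side generation of the filtering factor.
import hashlib


class FilterFactorService:
    def __init__(self) -> None:
        self._system_conditions: dict = {}   # client id -> conditions set by the video system
        self._factor_registry: dict = {}     # filtering factor -> full set of conditions

    def set_system_conditions(self, client_id: str, conditions: dict) -> None:
        self._system_conditions[client_id] = conditions

    def on_conditions_reported(self, client_id: str, reported: dict) -> str:
        # Merge the conditions set by the video system for this client with
        # the conditions the client reported about itself.
        merged = {**self._system_conditions.get(client_id, {}), **reported}
        canonical = "&".join(f"{k}={merged[k]}" for k in sorted(merged))
        factor = hashlib.md5(canonical.encode("utf-8")).hexdigest()
        # Remember which conditions the factor stands for, so that a later
        # query carrying only the factor can still be matched to metadata.
        self._factor_registry[factor] = merged
        return factor   # issued to the client over the first interface
```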
It should be noted that the login interface is usually the interface through which a user, when logging in at the client, sends authentication information such as a user name and password or an authentication code to the video system; the video system verifies through this information whether the client is authorized and returns the authorization result through the login interface, and the client then determines, based on that result, whether it can log in to the video system.
In addition, the heartbeat interface is generally the interface through which the client periodically sends a heartbeat request to the video system (the period is usually on the order of minutes, for example 5 minutes) to notify the video system that the client is online. Optionally, in this embodiment of the application, when the filtering conditions of the video metadata set on the client change (a new filtering condition is added, or a filtering condition previously reported to the video system is modified), the changed filtering conditions may also be carried in the heartbeat request; the video system may then generate a new filtering factor based on the changed conditions and carry the new filtering factor in the heartbeat response issued to the client.
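To make the optional heartbeat behaviour concrete, the sketch below shows a hypothetical client-side heartbeat loop that piggybacks changed filtering conditions on the periodic heartbeat request and picks up a new filtering factor from the heartbeat response; the message field names and the send_heartbeat transport call are assumptions, not defined by the embodiment.

```python
# Hypothetical client-side heartbeat loop carrying changed filtering conditions.
import time
from typing import Callable, Optional


def heartbeat_loop(send_heartbeat: Callable[[dict], dict],
                   get_changed_conditions: Callable[[], Optional[dict]],
                   on_new_filter_factor: Callable[[str], None],
                   period_seconds: int = 300) -> None:
    while True:
        request = {"type": "heartbeat"}           # tells the video system the client is online
        changed = get_changed_conditions()
        if changed:
            # Carry any added or modified filtering conditions in the heartbeat request.
            request["filter_conditions"] = changed

        response = send_heartbeat(request)
        new_factor = response.get("filter_factor")
        if new_factor:
            # The video system regenerated the filtering factor from the changed
            # conditions and carried it in the heartbeat response.
            on_new_filter_factor(new_factor)

        time.sleep(period_seconds)                # e.g. one period of 5 minutes
```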
In this embodiment, the client may also report a new filtering condition to the video system through a second interface between the client and the video system, and the video system receives the new filtering condition, generates a new filtering factor according to the new filtering condition and the filtering condition set by the video system for the client, and returns the new filtering factor to the client through the first interface. The second interface may be a login interface, a heartbeat interface, or a new interface configured in advance. The first interface and the second interface may be the same interface or different interfaces.
In this embodiment, the filtering factor may be generated with a message digest algorithm: the input is a character string formed from the filtering conditions for the client, and the output is a hash value. The message digest algorithm may be, for example, Message-Digest Algorithm 5 (MD5), a Secure Hash Algorithm (SHA), or the RACE Integrity Primitives Evaluation Message Digest (RIPEMD) algorithm.
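A minimal sketch of this generation step follows; the canonicalisation rule (sorting and joining the conditions into one character string) is an assumption, since the embodiment only requires that a message digest such as MD5, SHA, or RIPEMD be applied to the filtering conditions for the client.

```python
# Hypothetical filtering-factor generation: canonicalise the filtering
# conditions into one character string and hash it with a message digest.
import hashlib


def generate_filter_factor(filter_conditions: dict, algorithm: str = "md5") -> str:
    # Sort the keys so the same set of conditions always yields the same string.
    canonical = "&".join(f"{k}={filter_conditions[k]}" for k in sorted(filter_conditions))
    digest = hashlib.new(algorithm)        # e.g. "md5", "sha256"; "ripemd160" if OpenSSL provides it
    digest.update(canonical.encode("utf-8"))
    return digest.hexdigest()              # the hash value used as the filtering factor


# Example: terminal-attribute and user-attribute conditions for one client.
conditions = {
    "terminal_type": "set_top_box",
    "content_encoding": "h264",
    "definition": "1080p",
    "user_group": "vip",
    "parental_control": "12",
    "language": "zh",
}
print(generate_filter_factor(conditions))  # a 32-character hex string when MD5 is used
```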
It should be noted that, in the embodiment of the present application, the filtering condition for the client is divided into two cases:
In case one, the filtering conditions for the client are the filtering conditions on terminal attributes (such as the terminal type, video definition, content frame rate, content encoding format, and the geographic location of the terminal) and the filtering conditions on user attributes (such as the subnet operator, user group, area, parental-control level, and language). These conditions are common to every query interface between the client and the video system and generally take the same values on most query interfaces, so, to simplify the video metadata query request, a filtering factor is generated from them. When the user enters personalized filtering conditions at the client based on his or her own needs, such as a date or the type of the queried video metadata (for example, movie), those conditions can be carried directly in the video metadata query request; that is, the request contains the filtering factor and at least one filtering condition. This avoids carrying the filtering conditions common to the query interfaces every time video metadata is queried. Because the common conditions include both the filtering conditions the client obtains from its own configuration or other external factors and the filtering conditions the video system sets for the client, when a video metadata query request of this embodiment is stored in correspondence with video metadata, the proxy server can uniquely determine the video metadata matching that request. In this case, when the user enters no filtering condition at the client, the video metadata query request sent by the client contains only the filtering factor.
In case two, the filtering conditions for the client are all the filtering conditions set for the client, by the client and by the video system, including not only the conditions common to the query interfaces but also the personalized conditions the user sets based on his or her own needs, such as a date or the type of the queried video metadata (for example, movie); in this case the video metadata query request contains only the filtering factor. Because in case two the filtering factor is generated from all the filtering conditions for the client, and the conditions entered by the user generally reflect the user's own needs, when the client sets the filtering conditions for the first time the user-entered conditions need to be sent in advance to, and stored in, the video system; when the user later enters the same conditions at the client, they do not need to be sent to the video system again.
In case one, the filtering factor may be generated by the client or by the video system; to improve the extensibility of the terminal, it is normally generated by the video system. In case two, the filtering factor may likewise be generated by the client or by the video system, but because the conditions entered by the user are needed to generate it, it is more convenient for the client to generate it. When the filtering factor is generated for the first time, so that the video system can find the video metadata matching it, the terminal carries the filtering conditions the user set on the client in the video metadata query request; when the terminal later sends the same query request, it does not need to carry those user-set conditions again. The proxy server may store the filtering factors and the corresponding video metadata.
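As a rough illustration of the two cases, the sketch below builds the query request on the client side: in case one the request carries the precomputed filtering factor plus any personalised conditions the user has just entered, and in case two it carries the filtering factor alone; all field names are hypothetical.

```python
# Hypothetical construction of the video metadata query request on the client.
from typing import Optional


def build_query_request(filter_factor: str,
                        user_conditions: Optional[dict] = None) -> dict:
    """Case one: the common conditions are folded into filter_factor, and the
    personalised conditions (e.g. date, metadata type) travel alongside it.
    Case two: every condition is folded into filter_factor, so user_conditions
    is None and the request carries only the factor."""
    request = {"filter_factor": filter_factor}
    if user_conditions:
        request["filter_conditions"] = user_conditions
    return request


# Case one example: the user asked for movies released on a given date.
request = build_query_request("example-filter-factor",
                              {"type": "movie", "date": "2017-04-06"})
```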
Based on the same concept, the embodiments of the present application also provide the video metadata query devices shown in fig. 4a and fig. 5a. Since the method corresponding to the devices shown in fig. 4a and fig. 5a is the video metadata query method shown in fig. 3, the implementation of these devices may refer to the implementation of that method, and repeated parts are not described again.
As shown in fig. 4a, a video metadata query device according to an embodiment of the present application includes a sending module 400a and a receiving module 410a, where the sending module 400a is configured to send a video metadata query request to a proxy server, the request contains a filtering factor, and the filtering factor is generated according to filtering conditions for the client; the receiving module 410a is configured to receive the video metadata returned by the proxy server, where the video metadata is acquired from the cache by the proxy server according to the video metadata query request.
In one possible design, the receiving module 410a obtains a filtering factor from the video system before the sending module 400a sends the video metadata query request to the proxy server, the filtering factor being generated by the video system according to the filtering condition for the client; the filtering conditions for the client comprise filtering conditions set by the video system for the client and filtering conditions reported by the client to the video system.
In one possible design, the receiving module 410a receives the filtering factor issued by the video system from a first interface with the video system.
In one possible design, the sending module 400a reports the new filtering condition to the video system through a second interface with the video system; the receiving module receives a new filtering factor returned by the video system through the first interface, and the new filtering factor is generated by the new filtering condition.
In one possible design, the first interface is a heartbeat interface or the first interface is a login interface.
It should be understood that the specific division of the modules described above is by way of example only and is not limiting of the present application.
When the query device for video metadata shown in fig. 4a exists as a single entity device, the hardware structure of the query device may be as shown in fig. 4b, where the sending module 400a and the receiving module 410a shown in fig. 4a may be implemented by a communication interface 420b, and in addition, the device shown in fig. 4b may further include a processor 410b and a memory 430b, where the memory 430b is used to store a software program and data information and the like received and sent by the communication interface 420b, and the processor 410b is used to read the software program and data stored in the memory 430b and control the communication interface to receive and send data, so as to implement the method shown in fig. 3 in the embodiment of the present application.
The processor 410b may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to perform the related operations that implement the technical solution provided in the embodiments of the present application.
It should be noted that although the apparatus shown in fig. 4b only shows the processor 410b, the communication interface 420b and the memory 430b, in a specific implementation, a person skilled in the art will understand that the apparatus also contains other components necessary for normal operation. Also, it will be apparent to those skilled in the art that the apparatus may also contain hardware components to perform other additional functions, according to particular needs. Furthermore, it should be clear to a person skilled in the art that the apparatus may also comprise only the devices or modules necessary for implementing the embodiments of the application, and not necessarily all of the devices shown in fig. 4 b.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
As shown in fig. 5a, a video metadata query device according to an embodiment of the present application includes a receiving module 500a, a sending module 510a, and a processing module 520a, where the receiving module 500a is configured to receive a video metadata query request sent by a client, the request contains a filtering factor, and the filtering factor is generated according to filtering conditions for the client; the processing module 520a is configured to query, from the cache according to the video metadata query request, the video metadata matching the filtering factor; and the sending module 510a is configured to return the video metadata queried by the processing module to the client.
In one possible design, when the processing module 520a determines that video metadata matching the video metadata query request is not stored in the cache, the sending module 510a forwards the video metadata query request to the video system; the receiving module 500a is further configured to receive video metadata returned by the video system; the sending module 510a is further configured to return the video metadata returned by the video system received by the receiving module 500a to the client.
In one possible design, the processing module 520a is further configured to store the video metadata acquired by the receiving module 500a from the video system in the cache in correspondence with the video metadata query request.
It should be understood that the specific division of the modules described above is by way of example only and is not limiting of the present application.
When the query device for video metadata shown in fig. 5a exists as a single entity device, the hardware structure of the query device may be as shown in fig. 5b, where the sending module 510a and the receiving module 500a shown in fig. 5a may be implemented by the communication interface 520b, the processing module 520a may be implemented by the processor 510b, and in addition, the device shown in fig. 5b may further include a memory 530b, where the memory 530b is used for storing software programs and data information and the like sent and received by the communication interface 520b, and the processor 510b is used for reading the software programs and data stored in the memory 530b and executing the method shown in fig. 3 in this embodiment.
The processor 510b may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to perform the related operations that implement the technical solution provided in the embodiments of the present application.
It should be noted that although the apparatus shown in fig. 5b only shows the processor 510b, the communication interface 520b and the memory 530b, in a specific implementation, a person skilled in the art will understand that the apparatus also comprises other components necessary for normal operation. Also, it will be apparent to those skilled in the art that the apparatus may also contain hardware components to perform other additional functions, according to particular needs. Furthermore, it should be clear to a person skilled in the art that the apparatus may also comprise only the devices or modules necessary for implementing the embodiments of the application, and not necessarily all of the devices shown in fig. 5 b.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (e.g., infrared, radio, or microwave) manner.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (17)

1. A method for querying video metadata, comprising:
a client sends a video metadata query request to a proxy server, wherein the video metadata query request contains a filtering factor, and the filtering factor is a hash value generated according to a filtering condition aiming at the client;
and the client receives the video metadata returned by the proxy server, and the video metadata is acquired from a cache by the proxy server according to the video metadata query request.
2. The method of claim 1, wherein before the client sends the video metadata query request to the proxy server, further comprising:
the client acquires the filtering factor from a video system, wherein the filtering factor is generated by the video system according to the filtering condition aiming at the client; the filtering conditions for the client comprise filtering conditions set by the video system for the client and filtering conditions reported by the client to the video system.
3. The method of claim 2, wherein the client obtaining the filter factor from the video system comprises:
and the client receives the filtering factor issued by the video system from a first interface between the client and the video system.
4. The method of claim 2, wherein the method further comprises:
the client reports a new filtering condition to the video system through a second interface between the client and the video system;
and the client receives a new filtering factor returned by the video system through a first interface between the client and the video system, wherein the new filtering factor is generated by the new filtering condition.
5. The method of claim 3 or 4, wherein the first interface is a heartbeat interface or the first interface is a login interface.
6. A method for querying video metadata, comprising:
the method comprises the steps that a proxy server receives a video metadata query request sent by a client, wherein the video metadata query request contains a filtering factor, and the filtering factor is a hash value generated according to a filtering condition aiming at the client;
and the proxy server inquires the video metadata matched with the video metadata inquiry request from a cache according to the video metadata inquiry request and returns the inquired video metadata to the client.
7. The method of claim 6, wherein the method further comprises:
if the proxy server determines that the video metadata matched with the video metadata query request are not stored in the cache, forwarding the video metadata query request to a video system;
and the proxy server receives the video metadata returned by the video system and returns the video metadata returned by the video system to the client.
8. The method of claim 7, wherein the method further comprises:
and the proxy server correspondingly stores the video metadata acquired from the video system and the video metadata query request into the cache.
9. An apparatus for querying video metadata, comprising:
a sending module, configured to send a video metadata query request to a proxy server, where the video metadata query request includes a filtering factor, and the filtering factor is a hash value generated according to a filtering condition for the querying device;
and the receiving module is used for receiving the video metadata returned by the proxy server, and the video metadata is acquired from a cache by the proxy server according to the video metadata query request.
10. The device of claim 9, wherein the receiving module is further configured to:
before the sending module sends the video metadata query request to the proxy server, obtaining the filtering factor from a video system, wherein the filtering factor is generated by the video system according to the filtering condition aiming at the query equipment; the filter conditions for the query device include filter conditions set by the video system for the query device and filter conditions reported by the query device to the video system.
11. The apparatus of claim 10, wherein the receiving module obtains the filter factor from the video system, in particular comprising:
and receiving the filtering factor issued by the video system from a first interface between the video system and the video system.
12. The device of claim 10, wherein the sending module is further configured to:
reporting a new filtering condition to the video system through a second interface between the querying device and the video system;
the receiving module is further configured to receive a new filtering factor returned by the video system through a first interface with the video system, where the new filtering factor is generated by the new filtering condition.
13. The device of claim 11 or 12, wherein the first interface is a heartbeat interface or the first interface is a login interface.
14. An apparatus for querying video metadata, comprising:
a receiving module, configured to receive a video metadata query request sent by a client, where the video metadata query request includes a filtering factor, and the filtering factor is a hash value generated according to a filtering condition for the client;
the processing module is used for inquiring the video metadata matched with the video metadata inquiry request from a cache according to the video metadata inquiry request;
and the sending module is used for returning the video metadata inquired by the processing module to the client.
15. The device of claim 14, wherein the sending module is further configured to:
if the processing module determines that the video metadata matched with the filtering factor is not stored in the cache, forwarding the video metadata query request to a video system;
the receiving module is further used for receiving the video metadata returned by the video system;
the sending module is further configured to return the video metadata returned by the video system and received by the receiving module to the client.
16. The device of claim 15, wherein the processing module is further to:
correspondingly storing the video metadata acquired by the receiving module from the video system and the video metadata query request into the cache.
17. A system for querying video metadata, comprising a device according to any one of claims 9 to 13 and a device according to any one of claims 14 to 16.
CN201710221659.3A 2017-04-06 2017-04-06 Video metadata query method, device and system Active CN107169019B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710221659.3A CN107169019B (en) 2017-04-06 2017-04-06 Video metadata query method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710221659.3A CN107169019B (en) 2017-04-06 2017-04-06 Video metadata query method, device and system

Publications (2)

Publication Number Publication Date
CN107169019A CN107169019A (en) 2017-09-15
CN107169019B true CN107169019B (en) 2020-07-24

Family

ID=59849090

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710221659.3A Active CN107169019B (en) 2017-04-06 2017-04-06 Video metadata query method, device and system

Country Status (1)

Country Link
CN (1) CN107169019B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109996125A (en) * 2019-05-07 2019-07-09 北京字节跳动网络技术有限公司 Generate method, apparatus, electronic equipment and the storage medium of video list

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7904473B2 (en) * 2005-04-04 2011-03-08 Aol Inc. Community-based parental controls
US20080270220A1 (en) * 2005-11-05 2008-10-30 Jorey Ramer Embedding a nonsponsored mobile content within a sponsored mobile content
CN100397401C (en) * 2006-09-14 2008-06-25 浙江大学 Method for multiple resources pools integral parallel search in open websites
US8566855B2 (en) * 2008-12-02 2013-10-22 Sony Corporation Audiovisual user interface based on learned user preferences
US9226037B2 (en) * 2010-12-30 2015-12-29 Pelco, Inc. Inference engine for video analytics metadata-based event detection and forensic search

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101616007A (en) * 2008-06-24 2009-12-30 华为技术有限公司 A kind of implementation method of MAP server, system and equipment
CN102550021A (en) * 2010-12-24 2012-07-04 华为技术有限公司 Method and system for providing preview video, media server and playing terminal
CN103548017A (en) * 2011-12-26 2014-01-29 华为技术有限公司 Video search method and video search system
CN102929958A (en) * 2012-10-10 2013-02-13 无锡江南计算技术研究所 Metadata processing method, agenting and forwarding equipment, server and computing system
CN104216957A (en) * 2014-08-20 2014-12-17 北京奇艺世纪科技有限公司 Query system and query method for video metadata
CN105897850A (en) * 2015-12-22 2016-08-24 乐视云计算有限公司 Response processing method and system and scheduling proxy server for CDN platform
CN106131175A (en) * 2016-07-01 2016-11-16 微梦创科网络科技(中国)有限公司 A kind of acquisition of information, information-pushing method and equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"基于代理缓存策略的分布式VOD系统研究";张剑;《中国优秀硕士学位伦恩全文数据库》;20090430;第10页 *

Also Published As

Publication number Publication date
CN107169019A (en) 2017-09-15

Similar Documents

Publication Publication Date Title
US11232080B2 (en) Systems and methods for providing access to a data file stored at a data storage system
US11036408B2 (en) Rule-based modifications in a data storage appliance monitor
US11082355B2 (en) Controllng distribution of resources in a network
CN111970315A (en) Method, device and system for pushing message
WO2017092351A1 (en) Cache data update method and device
JP5698327B2 (en) Content categorization method and system
CN104283933A (en) Data downloading method, client-side and system
CN104468807A (en) Processing method, cloud end device, local devices and system for webpage cache
CN104216957A (en) Query system and query method for video metadata
WO2019201040A1 (en) File update management method and system and terminal apparatus
CN108206776B (en) Group history message query method and device
US8375124B1 (en) Resumable upload for hosted storage systems
US20150263977A1 (en) Profile-based cache management
CN111177776A (en) Multi-tenant data isolation method and system
US10430441B1 (en) Tagging resources of a remote computing service based on locality
CN112134908B (en) Application adaptation method, server, medium and vehicle-mounted multimedia system
US9704169B2 (en) Digital publication monitoring by geo-location
CN107169019B (en) Video metadata query method, device and system
CN114168847A (en) Data query method and device, electronic equipment and storage medium
US11947553B2 (en) Distributed data processing
CN107992489B (en) Data processing method and server
JP2011510572A (en) Method, apparatus and system for realizing fingerprint technology
US10313469B2 (en) Method, apparatus and system for processing user generated content
CN112995723A (en) EPG data management method, server and readable storage medium
CN113076380B (en) Data synchronization method, device, system, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20180815

Address after: 210012 HUAWEI Nanjing base, 101 software Avenue, Yuhuatai District, Nanjing, Jiangsu.

Applicant after: Huawei Technologies Co.,Ltd.

Address before: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Applicant before: Huawei Technologies Co., Ltd.

TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200201

Address after: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Applicant after: HUAWEI TECHNOLOGIES Co.,Ltd.

Address before: 210012 HUAWEI Nanjing base, 101 software Avenue, Yuhuatai District, Jiangsu, Nanjing

Applicant before: Huawei Technologies Co.,Ltd.

GR01 Patent grant
GR01 Patent grant