CN111343146B - Data auditing method, system, computer readable medium and electronic equipment - Google Patents
- Publication number
- CN111343146B (application CN202010079917.0A)
- Authority
- CN
- China
- Prior art keywords
- data
- characteristic value
- audited
- resource
- preset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/14—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
- H04L63/1408—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
- H04L63/1416—Event detection, e.g. attack signature detection
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/958—Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L69/00—Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
- H04L69/22—Parsing or analysis of headers
Landscapes
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Data Mining & Analysis (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Computing Systems (AREA)
- Storage Device Security (AREA)
Abstract
The disclosure relates to a data auditing method, a data auditing system, a computer-readable medium, and an electronic device. The method is applied to a data auditing system located between a client and a server, and comprises the following steps: after a data auditing request is acquired, determining whether the feature value of the data to be audited matches a preset feature value of an illegal resource; and if the feature value of the data to be audited is determined to match the preset feature value of the illegal resource, blocking the resource corresponding to the data to be audited. The feature value is used to determine whether the resource corresponding to the data to be audited is an illegal resource, and the resource is blocked when it is determined to be illegal, so that the illegal resource cannot propagate. In addition, because blocking happens between the client and the server rather than only at the source, the defect in the related art that a user can still obtain the illegal resource through other paths is avoided; the spread of illegal resources in the computer network is further prevented, and the security of the computer network is effectively ensured.
Description
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a data auditing method, system, computer-readable medium, and electronic device.
Background
Security problems have accompanied information technology since its appearance, and the scope of the network security industry keeps extending as network security requirements grow. If data security is not taken seriously enough, serious potential safety hazards arise. For example, in the absence of data verification between the client and the server, illegal content (e.g., pornographic or terrorist material) may be embedded in data uploaded by a user or downloaded from the server, turning the data into an illegal resource; the user may then unwittingly upload the illegal resource, or download and use it. The security of the computer network therefore cannot be guaranteed.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, the present disclosure provides a data auditing method applied to a data auditing system, where the data auditing system is located between a client and a server, and the method includes:
after a data auditing request is acquired, determining whether a feature value of the data to be audited matches a preset feature value of an illegal resource, wherein the data to be audited is sent by the client or the server;
and if the feature value of the data to be audited is determined to match the preset feature value of the illegal resource, blocking the resource corresponding to the data to be audited.
In a second aspect, the present disclosure provides a data auditing system, including:
a first determining module, configured to determine, after a data auditing request is obtained, whether a feature value of the data to be audited matches a preset feature value of an illegal resource, wherein the data to be audited is sent by the client or the server;
and a blocking module, configured to block the resource corresponding to the data to be audited if the feature value of the data to be audited is determined to match the preset feature value of the illegal resource.
In a third aspect, the present disclosure provides a computer readable medium having stored thereon a computer program which, when executed by a processing apparatus, performs the steps of the method of the first aspect of the present disclosure.
In a fourth aspect, the present disclosure provides an electronic device comprising:
a storage device having a computer program stored thereon;
processing means for executing the computer program in the storage means to implement the steps of the method of the first aspect of the present disclosure.
According to the above technical scheme, the data auditing system is arranged between the client and the server, determines through the feature value whether the resource corresponding to the data to be audited is an illegal resource, and blocks the resource when it is determined to be illegal, so that the illegal resource cannot propagate. In addition, since illegal resources are blocked by a data auditing system located between the client and the server, the defect in the related art that resources are blocked only at the source, so that a user can still obtain the illegal resources through other paths, is avoided; the spread of illegal resources in the computer network is further prevented, and the security of the computer network is effectively ensured.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale. In the drawings:
fig. 1 is a schematic diagram of an application scenario provided in accordance with one embodiment of the present disclosure.
FIG. 2 is a flow diagram of a data auditing method provided in accordance with one embodiment of the present disclosure.
FIG. 3 is a flow diagram of a data auditing method provided in accordance with another embodiment of the present disclosure.
FIG. 4 is a block diagram of a data auditing system provided according to one embodiment of the present disclosure.
FIG. 5 illustrates a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
First, an application scenario of the data auditing method provided by the present disclosure is explained. Fig. 1 is a schematic diagram of an application scenario provided in accordance with one embodiment of the present disclosure. As shown in fig. 1, the application scenario may include a client 11, a server 12, and a data auditing system 13. The data auditing system 13 is located between the client 11 and the server 12 and is configured to audit data uploaded by the client 11 or issued by the server 12, to detect whether the resource corresponding to the data is an illegal resource, and to block the resource when it is. The data auditing system 13 may be integrated in a standalone electronic device, or in the client 11 or the server 12 of fig. 1. In the present disclosure, the case in which the data auditing system 13 is integrated in an electronic device is taken as an example.
In an embodiment, a user may upload data through the client 11, and in order to ensure that a resource corresponding to the uploaded data is a legal resource, before the server 12 issues the resource corresponding to the data, the data auditing system 13 may audit the data to determine whether the resource corresponding to the data is a legal resource.
In another embodiment, the user may send a data access request through the client 11 to request data from the server 12, and the server 12 issues the data to the client 11 in response to the request. In this way, the data auditing system 13 located between the client 11 and the server 12 can audit the data before the client 11 receives it, to determine whether the corresponding resource is legal.
It should be noted that, in this embodiment, the data auditing system 13 may also audit data access requests. For example, a management and control module in the data auditing system 13 may determine, based on the user account information, the target network speed for the user, whether the user has the right to access the data, and so on, and, when the user is determined to have that right, forward the data access request to a proxy module in the data auditing system 13 at the target network speed. The proxy module stores the target address corresponding to the data access request, so that after receiving the request it can forward the request to the target address, allowing the client 11 to access the data. The present disclosure does not specifically limit the management and control module or the proxy module in the data auditing system 13.
FIG. 2 is a flow diagram of a data auditing method provided in accordance with one embodiment of the present disclosure. The method can be applied to the data auditing system in fig. 1, and as shown in fig. 2, the method can include the following steps.
In step 21, after the data auditing request is acquired, it is determined whether the feature value of the data to be audited matches with a preset feature value of an illegal resource. Wherein, the data to be audited is sent by the client or the server.
When a user publishes data through the client, the data to be audited is the data sent by the client; when a user accesses data on the server through the client, the data to be audited is the data sent by the server. The data auditing request is used to request an audit of the data, and may be generated based on a user operation or generated automatically by the data auditing system. The manner in which the data auditing request is automatically generated is described below and is not detailed here.
In the present disclosure, the feature values of illegal resources may be stored in advance in the data auditing system, or in a feature-value management module within it, so that after the feature value of the data to be audited is obtained, it can be determined whether that value matches a prestored feature value of an illegal resource. Specifically, it may be determined whether the feature value of the data to be audited belongs to the set of prestored illegal-resource feature values; if so, it is determined that the feature value of the data to be audited matches the preset feature value of the illegal resource.
It should be noted that the feature values of the data to be audited and of the illegal resources can be determined by, but are not limited to, the following algorithms: the MD5 message-digest algorithm, SHA-1 (Secure Hash Algorithm 1), or SHA-2 (Secure Hash Algorithm 2). The present disclosure does not specifically limit the manner in which the feature value of the data to be audited is determined.
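As a minimal sketch (not taken from the patent), the matching described above can be pictured as hashing the raw bytes and testing membership in a pre-stored set. The choice of MD5 as the default, the set contents, and all names below are assumptions for illustration.

```python
import hashlib

# Pre-stored feature values of illegal resources (placeholder values; in a
# real system these would come from the feature-value management module).
ILLEGAL_FEATURE_VALUES = {
    "5d41402abc4b2a76b9719d911017c592",  # MD5 of b"hello", used as a stand-in
}

def feature_value(data: bytes, algorithm: str = "md5") -> str:
    """Digest the raw bytes with MD5, SHA-1, or SHA-256."""
    h = hashlib.new(algorithm)
    h.update(data)
    return h.hexdigest()

def matches_illegal(data: bytes) -> bool:
    """True when the data's feature value belongs to the pre-stored set."""
    return feature_value(data) in ILLEGAL_FEATURE_VALUES

print(matches_illegal(b"hello"))   # True: its MD5 is the placeholder above
print(matches_illegal(b"benign"))  # False
```

Membership in a hash set makes the audit an O(1) lookup per item, which is why the patent can afford to run it inline between client and server.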
In step 22, if it is determined that the feature value of the data to be audited matches the preset feature value of the illegal resource, the resource corresponding to the data to be audited is blocked.
When the feature value of the data to be audited is determined, in the above manner, to match the preset feature value of the illegal resource, that is, when it belongs to the set of prestored illegal-resource feature values, the resource corresponding to the data to be audited can be considered illegal. At this time, in order to ensure the security of the computer network, the data auditing system may block that resource and prohibit the illegal resource from propagating in the computer network.
According to the above technical scheme, the data auditing system is arranged between the client and the server, determines through the feature value whether the resource corresponding to the data to be audited is an illegal resource, and blocks the resource when it is determined to be illegal, preventing the illegal resource from spreading. In addition, since illegal resources are blocked by a data auditing system located between the client and the server, the defect in the related art that resources are blocked only at the source, so that a user can still obtain the illegal resources through other paths, is avoided; the spread of illegal resources in the computer network is further prevented, and the security of the computer network is effectively ensured.
In addition, the amount of data differs across resource types. For example, a video resource is usually large, and so is its data volume; picture and text resources are relatively small, with correspondingly small data volumes. When the data volume is large, it may be that only a certain segment of the data corresponds to an illegal resource; therefore, in the present disclosure, when the data to be audited is video data, it may be determined segment by segment whether the corresponding resource is illegal. The type of the resource corresponding to the data to be audited can be determined from the Content-Type field of the HTTP message header. For example, if the Content-Type field indicates mp4, mp3, rmvb, flv, or the like, the data to be audited is video data; if it indicates jpg, gif, png, bmp, pcd, or the like, the data to be audited is picture data; and if it indicates txt, doc, docx, or the like, the data to be audited is text data.
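The Content-Type dispatch just described can be sketched as a simple lookup. The field values follow the extensions named in the text (real HTTP traffic would carry MIME types such as `video/mp4`); the function and set names are assumptions.

```python
# Map Content-Type indicators to the three resource classes named above.
VIDEO_TYPES = {"mp4", "mp3", "rmvb", "flv"}
IMAGE_TYPES = {"jpg", "gif", "png", "bmp", "pcd"}
TEXT_TYPES = {"txt", "doc", "docx"}

def classify_resource(content_type: str) -> str:
    """Classify the data to be audited by its Content-Type field."""
    t = content_type.strip().lower()
    if t in VIDEO_TYPES:
        return "video"
    if t in IMAGE_TYPES:
        return "picture"
    if t in TEXT_TYPES:
        return "text"
    return "unknown"

print(classify_resource("MP4"))  # video
```

The class chosen here decides whether the feature value is computed over the whole payload (picture, text) or per key frame (video), as the following paragraphs explain.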
Further, considering that if the resource corresponding to the data to be audited is a video, some frame images in the video may carry illegal content, in the present disclosure, when the data to be audited is determined to be video data from the Content-Type field of the HTTP message header, the feature value of the data to be audited may include a feature value corresponding to at least one key frame image in the video data. For example, it may include a feature value for every key frame image of the corresponding video, or for a preset number of key frame images; the present disclosure is not specifically limited in this respect.
When the data to be audited is video data and its feature value includes a feature value corresponding to at least one key frame image in the video data, determining whether the feature value of the data to be audited matches a preset feature value of an illegal resource may include the following steps:
matching the feature value corresponding to each key frame image against the preset feature value of the illegal resource;
and if there is a key frame image whose feature value matches the preset feature value of the illegal resource, determining that the feature value of the data to be audited matches the preset feature value of the illegal resource.
In an embodiment, if every key frame feature value of the data to be audited matches a preset feature value of an illegal resource, it is determined that the feature value of the data to be audited matches the preset feature value of the illegal resource.
In another embodiment, if there are a preset number (the preset number is an integer greater than 1) of key frame images matching the feature value of the preset illegal resource, it is determined that the feature value of the data to be checked matches the feature value of the preset illegal resource.
In yet another embodiment, to further avoid the propagation of illegal resources, it is determined that the feature values of the data to be reviewed match the feature values of the preset illegal resources as long as there is a key frame image matching the feature values of the preset illegal resources.
By adopting this technical scheme, when the data to be audited is video data, whether the video is legal can be determined based on the feature values of its key frame images, which improves the accuracy of the judgment.
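The three key-frame matching embodiments above can be folded into one hedged sketch; the mode names and the threshold parameter are assumptions introduced here for illustration, not terms from the patent.

```python
def video_matches_illegal(keyframe_values, illegal_values,
                          mode="any", threshold=2):
    """Match a video's key-frame feature values against the preset illegal set.

    mode="all":   every key frame must match (first embodiment)
    mode="count": at least `threshold` key frames must match (second embodiment)
    mode="any":   a single matching key frame suffices (third embodiment)
    """
    frames = list(keyframe_values)
    hits = sum(1 for v in frames if v in illegal_values)
    if mode == "all":
        return hits > 0 and hits == len(frames)
    if mode == "count":
        return hits >= threshold
    return hits >= 1

illegal = {"k1", "k2"}
print(video_matches_illegal(["k1", "x"], illegal, mode="any"))  # True
print(video_matches_illegal(["k1", "x"], illegal, mode="all"))  # False
```

The "any" mode is the strictest toward propagation (one illegal frame blocks the whole video), which is why the text presents it as the way to "further avoid" spreading illegal resources.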
It should be noted that if the data to be audited is picture data or text data, the data volume is small, so the feature value can be calculated over the whole data; there is no need to calculate feature values segment by segment.
When the resource corresponding to the data to be audited is determined to be an illegal resource, in order that the user can obtain the audit result, after determining that the feature value of the data to be audited matches the preset feature value of the illegal resource, the method provided by the present disclosure may further include the following steps:
outputting prompt information indicating that the data to be audited is data of an illegal resource; and/or
prohibiting the client from accessing or uploading the data to be audited.
In one embodiment, after determining that the feature value of the data to be audited matches the preset feature value of the illegal resource, the data auditing system outputs prompt information indicating that the data to be audited is data of an illegal resource. For example, if the data auditing system is integrated in an electronic device with a display function, the prompt information can be shown on the display screen of the electronic device; alternatively, the data auditing system can broadcast the prompt information in voice form, and so on. The present disclosure does not specifically limit this.
In another embodiment, after the feature value of the data to be audited is determined to match the preset feature value of the illegal resource, the client is prohibited from accessing or uploading the data to be audited. For example, if the data to be audited is data sent by the server in response to a request from the client, the data auditing system may return an empty resource to the client, preventing the user from accessing the illegal resource. For another example, if the data to be audited is data uploaded by the client, the user may be prohibited from uploading it, so as to avoid propagating the illegal resource.
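A minimal sketch of the two prohibition paths just described: an empty resource for a blocked download, a rejection for a blocked upload. The response shape, status codes, and names are assumptions, not taken from the patent.

```python
def block_illegal_resource(direction: str) -> dict:
    """Blocking action once the data to be audited is judged illegal."""
    if direction == "download":
        # Return an empty resource so the client never receives the content.
        return {"status": 200, "body": b"", "hint": "empty resource returned"}
    # Refuse the upload so the illegal resource is never published.
    return {"status": 403, "body": b"", "hint": "upload prohibited"}

print(block_illegal_resource("download")["hint"])  # empty resource returned
```

Returning an empty 200 for downloads (rather than an error) matches the text's "return an empty resource to the client"; the 403 for uploads is one plausible way to express the refusal.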
In yet another embodiment, after determining that the feature value of the data to be audited matches the preset feature value of the illegal resource, the data auditing system may both output prompt information indicating that the data to be audited is data of an illegal resource and prohibit the client from accessing or uploading the data to be audited.

Fig. 3 is a flowchart of a data auditing method provided according to another embodiment of the present disclosure. As shown in fig. 3, before determining whether the feature value of the data to be audited matches the preset feature value of the illegal resource, the method may further include the following steps.
In step 23, in response to receiving a hijack identification request, the hijack identification condition corresponding to the data to be audited, sent by the client or the server, is obtained.
It should be understood by those skilled in the art that with the popularization of networks, network hijacking occurs more and more often: in mild cases the data a user accesses or uploads is tampered with, and in severe cases data may be leaked. Therefore, in the present disclosure, before the data to be audited is audited, hijack identification can be performed on it. Specifically, the user may manually input a hijack identification request, or the data auditing system may automatically generate one when detecting that the client uploads or accesses data; the hijack identification request is used to request identification of whether the data to be audited has been hijacked.
In the present disclosure, a user may store, in advance, the hijack identification conditions corresponding to the data to be audited in the data auditing system. According to user requirements, the hijack identification conditions for different data to be audited may be the same or different; the disclosure is described taking different conditions for different data as an example. Therefore, on receiving a hijack identification request, the data auditing system first acquires the hijack identification condition corresponding to the data to be audited.
For example, the hijack identification management module in the data auditing system stores the hijack identification condition, so that when the data auditing system receives the hijack identification request, the hijack identification condition corresponding to the data to be audited can be determined from the hijack identification management module.
The hijacking identification condition may include a hijacking identification condition on a TCP layer, for example, a TCP packet hijacking identification condition, and/or a hijacking identification condition on an HTTP layer, for example, an HTTP packet hijacking identification condition.
In a possible case, the hijacking identification condition includes a TCP packet hijacking identification condition, and accordingly, the data auditing method provided by the present disclosure may further include the following steps:
performing TCP protocol analysis on the data to be audited to obtain a TCP message of the data to be audited;
and if the TCP message has the preset mark, determining that the data to be audited does not meet the TCP message hijack identification condition.
Specifically, the TCP message hijack identification condition may be that a pre-agreed preset mark is absent from the TCP message. If the data to be audited has not been hijacked, the preset mark will be present in the TCP message after TCP protocol parsing; conversely, if it has been hijacked, the mark will be absent. Therefore, whether the data to be audited meets the TCP message hijack identification condition can be judged by checking whether the preset mark exists in the TCP message.
In an example, after TCP protocol parsing of the data to be audited yields its TCP message, if the preset mark exists in the message, it is determined that the data to be audited does not meet the TCP message hijack identification condition, that is, the data to be audited has not been hijacked.
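The TCP-layer check above can be sketched as a substring test on the parsed message bytes; the magic value of the preset mark, and the assumption that it appears literally in the message bytes, are illustrative choices, not details from the patent.

```python
# Pre-agreed preset mark (an assumed magic byte sequence for illustration).
PRESET_MARK = b"\xca\xfe\xba\xbe"

def meets_tcp_hijack_condition(tcp_message: bytes) -> bool:
    """True when the preset mark is absent, i.e. the TCP message hijack
    identification condition is met and hijacking is suspected."""
    return PRESET_MARK not in tcp_message

print(meets_tcp_hijack_condition(b"payload" + PRESET_MARK))  # False: mark present
print(meets_tcp_hijack_condition(b"tampered payload"))       # True: suspected
```

The mark acts as a lightweight provenance token: a hijacker who rewrites the stream without knowing the pre-agreed value cannot reproduce it.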
In another possible case, the hijack identification condition includes an HTTP message hijack identification condition, and accordingly, the data auditing method provided by the present disclosure may further include the following steps:
performing HTTP (hyper text transport protocol) analysis on the data to be audited to obtain an HTTP message of the data to be audited;
and if the header information of the HTTP message and/or the parameters of the HTTP message satisfy a preset algorithm correspondence, and/or the feature value of the data to be audited is consistent with the feature value carried in the header and/or parameters of the HTTP message, determining that the data to be audited does not meet the HTTP message hijack identification condition.
Specifically, the HTTP message hijack identification condition may be that the preset algorithm correspondence is not satisfied among the HTTP message header fields and/or among the HTTP message parameters, and/or that the feature value of the data to be audited is inconsistent with the feature value carried in the header and/or parameters of the HTTP message. The preset algorithm correspondence may be that the result of applying a preset algorithm to certain fields of the HTTP message header equals other header fields, or that the result of applying it to certain HTTP message parameters equals other parameters. By way of example, assume the hash of parameters a and b equals parameter c, and so on. In addition, at the stage of identifying whether the data to be audited has been hijacked, its feature value can also be determined by, but is not limited to, the MD5 message-digest algorithm, SHA-1 (Secure Hash Algorithm 1), or SHA-2 (Secure Hash Algorithm 2). It should be noted that if the feature value of the data to be audited has already been determined at the hijack identification stage, it can be reused directly in step 21 of fig. 2.
If the data to be audited has not been hijacked, then after HTTP protocol parsing, the preset algorithm correspondence is satisfied among the HTTP message header fields and/or among the HTTP message parameters, and/or the feature value of the data to be audited is consistent with the feature value carried in the header and/or parameters of the HTTP message. Conversely, if the data has been hijacked, the correspondence is not satisfied and/or the feature values are inconsistent. Therefore, whether the data to be audited meets the HTTP message hijack identification condition can be judged by checking these relations.
For example, if the header information of the HTTP message and/or the parameters of the HTTP message satisfy the preset algorithm correspondence, and/or the characteristic value of the data to be audited is consistent with the characteristic value carried in the header and/or the parameters of the HTTP message, it is determined that the data to be audited does not meet the HTTP message hijacking identification condition, that is, the data to be audited has not been hijacked.
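As a hedged illustration of these two checks, the following Python sketch verifies a hash correspondence between message parameters and compares the characteristic value computed over the payload with the digest carried in the message. The parameter names (`a`, `b`, `c`, `digest`) and the choice of MD5 are illustrative assumptions, not details fixed by this disclosure.

```python
import hashlib

def is_hijacked(params: dict, body: bytes) -> bool:
    """Return True if the message fails either check, i.e. appears hijacked."""
    # Check 1: preset algorithm correspondence between parameters,
    # e.g. the hash of parameters a and b must equal parameter c.
    expected = hashlib.md5((params["a"] + params["b"]).encode()).hexdigest()
    if expected != params["c"]:
        return True
    # Check 2: the characteristic value computed over the payload must
    # equal the characteristic value carried in the message itself.
    if hashlib.md5(body).hexdigest() != params["digest"]:
        return True
    return False

body = b"payload-to-audit"
params = {
    "a": "foo",
    "b": "bar",
    "c": hashlib.md5(b"foobar").hexdigest(),
    "digest": hashlib.md5(body).hexdigest(),
}
print(is_hijacked(params, body))  # False: intact message passes both checks
params["b"] = "tampered"
print(is_hijacked(params, body))  # True: the correspondence is broken
```

Either failing check is sufficient to flag the message, mirroring the "and/or" structure of the condition above.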
In yet another embodiment, the hijacking identification condition comprises both a TCP message hijacking identification condition and an HTTP message hijacking identification condition. Whether the data to be audited meets the TCP message hijacking identification condition and the HTTP message hijacking identification condition is judged respectively in the manners described above; if the data to be audited does not meet the TCP message hijacking identification condition and/or the HTTP message hijacking identification condition, it is determined that the data to be audited does not meet the hijacking identification condition, that is, the data to be audited has not been hijacked.
In step 24, if the data to be audited does not satisfy the hijacking identification condition, a data auditing request is automatically generated.
When it is determined in the above manner that the data to be audited does not meet the hijacking identification condition, the data auditing system can automatically generate a data auditing request so as to further audit the data to be audited that has not been hijacked. The specific auditing method is as described above and is not repeated here.
In addition, if it is determined that the data to be audited meets the hijacking identification condition, that is, the data to be audited has been hijacked, no further auditing is needed; it is sufficient to output prompt information indicating that the data to be audited has been hijacked, where the prompt information may include, for example, the characteristic value of the data to be audited.
With this technical solution, hijacking identification is performed on the data to be audited before it is audited, so as to determine whether it has been hijacked, and the data is further audited only if it has not been hijacked. This reduces the auditing workload while, to a certain extent, ensuring the security of the data uploaded or accessed by the client, and thus the security of the computer network.
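The overall flow just summarized — hijacking identification first, auditing only data that was not hijacked — can be sketched as below. All function names are hypothetical; the hijack check and the characteristic-value algorithm are passed in as callables, standing in for the protocol-level checks described above.

```python
import hashlib

def audit_pipeline(data: bytes, hijack_check, characteristic_value, blocklist):
    """Identify hijacking first; audit only data that was not hijacked."""
    if hijack_check(data):
        # Hijacked: no further auditing, only output a prompt
        # containing the characteristic value.
        return "hijacked: " + characteristic_value(data)
    # Not hijacked: a data auditing request is generated automatically, and
    # the characteristic value is matched against preset illegal resources.
    if characteristic_value(data) in blocklist:
        return "blocked"
    return "allowed"

cv = lambda d: hashlib.md5(d).hexdigest()
blocklist = {cv(b"known-illegal")}
print(audit_pipeline(b"known-illegal", lambda d: False, cv, blocklist))  # blocked
print(audit_pipeline(b"benign", lambda d: False, cv, blocklist))         # allowed
print(audit_pipeline(b"benign", lambda d: True, cv, blocklist))          # hijacked: ...
```

Keeping the hijack check ahead of the blocklist lookup is what yields the workload reduction: hijacked traffic never reaches the (comparatively expensive) auditing stage.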
Based on the same inventive concept, the present disclosure also provides a data auditing system. FIG. 4 is a block diagram of a data auditing system provided according to one embodiment of the present disclosure. As shown in fig. 4, the system 40 may include:
a first determining module 41, configured to determine, after acquiring a data auditing request, whether the characteristic value of the data to be audited matches the characteristic value of a preset illegal resource, where the data to be audited is sent by the client or the server;
and a blocking module 42, configured to perform blocking processing on the resource corresponding to the data to be audited if it is determined that the characteristic value of the data to be audited matches the characteristic value of a preset illegal resource.
Optionally, the data to be audited is video data; the characteristic value of the data to be audited comprises a characteristic value corresponding to at least one key frame image in the video data;
the first determining module 41 may include:
the matching sub-module is used for matching the characteristic value corresponding to the key frame image with the characteristic value of a preset illegal resource;
and the determining sub-module is used for determining that the characteristic value of the data to be audited is matched with the preset characteristic value of the illegal resource if the key frame image matched with the preset characteristic value of the illegal resource exists.
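A minimal sketch of the matching and determining sub-modules, under the assumption that a characteristic value is a SHA-1 digest of the key frame bytes (the disclosure leaves the exact algorithm open): the video is treated as matched as soon as any one of its key frames hits a preset illegal-resource characteristic value.

```python
import hashlib

def frame_characteristic(frame_bytes: bytes) -> str:
    # One possible characteristic value: a SHA-1 digest of the frame bytes.
    return hashlib.sha1(frame_bytes).hexdigest()

def video_matches(key_frames, illegal_values: set) -> bool:
    # Determining sub-module logic: a match on ANY key frame image
    # means the whole video matches a preset illegal resource.
    return any(frame_characteristic(f) in illegal_values for f in key_frames)

illegal = {frame_characteristic(b"frame-2")}
print(video_matches([b"frame-1", b"frame-2"], illegal))  # True
print(video_matches([b"frame-1", b"frame-3"], illegal))  # False
```

In practice the key frames would come from a video decoder; the raw byte strings above merely stand in for decoded frames.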
Optionally, the system may further include:
the output module, configured to output prompt information indicating that the data to be audited is illegal resource data; and/or
the forbidding module, configured to forbid the client from accessing or uploading the data to be audited.
Optionally, the system may further include:
the acquisition module is used for responding to the received hijack identification request and acquiring the hijack identification condition corresponding to the data to be audited sent by the client or the server;
and the generating module is used for automatically generating the data auditing request if the data to be audited does not meet the hijacking identification condition.
Optionally, the hijacking identification condition includes a TCP packet hijacking identification condition; the system may further include:
the first analysis module is used for carrying out TCP protocol analysis on the data to be audited so as to obtain a TCP message of the data to be audited;
and the second determining module is used for determining that the data to be audited does not meet the TCP message hijack identification condition if the TCP message has the preset mark.
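A hedged sketch of the second determining module's logic: after TCP parsing, the presence of a preset mark in the message means the data was not hijacked, so the TCP message hijacking identification condition is not met. The mark bytes and the flat-payload layout are illustrative assumptions, not values specified by this disclosure.

```python
PRESET_MARK = b"\x5a\xa5AUDIT"  # hypothetical marker agreed out of band

def meets_tcp_hijack_condition(tcp_payload: bytes) -> bool:
    """True when the preset mark is absent (the message may be hijacked);
    False when the mark is present (the condition is not met)."""
    return PRESET_MARK not in tcp_payload

print(meets_tcp_hijack_condition(PRESET_MARK + b"data"))  # False: not hijacked
print(meets_tcp_hijack_condition(b"data"))                # True
```

A real deployment would place such a mark in a defined position (e.g. a fixed offset or a TCP option) rather than scanning the whole payload.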
Optionally, the hijacking identification condition includes an HTTP message hijacking identification condition; the system may further include:
the second analysis module is used for carrying out HTTP protocol analysis on the data to be audited so as to obtain an HTTP message of the data to be audited;
and the third determining module, configured to determine that the data to be audited does not meet the HTTP message hijacking identification condition if the header information of the HTTP message and/or the parameters of the HTTP message satisfy the preset algorithm correspondence, and/or the characteristic value of the data to be audited is consistent with the characteristic value carried in the header and/or the parameters of the HTTP message.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Referring now to FIG. 5, a block diagram of an electronic device 500 suitable for use in implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. The RAM 503 also stores various programs and data necessary for the operation of the electronic device 500. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 508 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device 500 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 501.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the client, the data auditing system, and the server may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: after acquiring a data auditing request, determine whether the characteristic value of the data to be audited matches the characteristic value of a preset illegal resource, where the data to be audited is sent by the client or the server; and if it is determined that the characteristic value of the data to be audited matches the characteristic value of the preset illegal resource, perform blocking processing on the resource corresponding to the data to be audited.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented by software or hardware. For example, the first determining module may be further described as a "module that determines whether the feature value of the data to be checked matches the feature value of the preset illegal resource after the data audit request is acquired".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, a data auditing method is provided, which is applied to a data auditing system, where the data auditing system is located between a client and a server, and the method includes:
after a data auditing request is acquired, determining whether a characteristic value of the to-be-audited data is matched with a preset characteristic value of an illegal resource, wherein the to-be-audited data is sent by the client or the server;
and if the characteristic value of the data to be audited matches the characteristic value of the preset illegal resource, performing blocking processing on the resource corresponding to the data to be audited.
According to one or more embodiments of the present disclosure, a data auditing method is provided, in which the data to be audited is video data; the characteristic value of the data to be audited comprises a characteristic value corresponding to at least one key frame image in the video data;
the determining whether the characteristic value of the data to be checked is matched with the preset characteristic value of the illegal resource includes:
matching the characteristic value corresponding to the key frame image with the characteristic value of a preset illegal resource;
and if the key frame image matched with the preset characteristic value of the illegal resource exists, determining that the characteristic value of the data to be audited is matched with the preset characteristic value of the illegal resource.
According to one or more embodiments of the present disclosure, a data auditing method is provided, where if it is determined that a feature value of the data to be audited matches a feature value of a preset illegal resource, the method further includes:
outputting prompt information for indicating that the data to be audited is illegal resource data; and/or
forbidding the client from accessing or uploading the data to be audited.
According to one or more embodiments of the present disclosure, a data auditing method is provided, wherein the method further includes:
responding to the received hijacking identification request, and acquiring hijacking identification conditions corresponding to the data to be audited sent by the client or the server;
and if the data to be audited does not meet the hijack identification condition, automatically generating the data audit request.
According to one or more embodiments of the present disclosure, a data auditing method is provided, wherein the hijacking identification condition includes a TCP packet hijacking identification condition; the method further comprises the following steps:
analyzing the TCP protocol of the data to be audited to obtain a TCP message of the data to be audited;
and if the TCP message has a preset mark, determining that the data to be audited does not meet the TCP message hijack identification condition.
According to one or more embodiments of the present disclosure, a data auditing method is provided, wherein the hijacking identification condition includes an HTTP message hijacking identification condition; the method further comprises the following steps:
performing HTTP protocol analysis on the to-be-audited data to obtain an HTTP message of the to-be-audited data;
and if the header information of the HTTP message and/or the parameters of the HTTP message meet the corresponding relation of a preset algorithm, and/or the characteristic value of the data to be audited is consistent with the characteristic value carried in the header and/or the parameters of the HTTP message, determining that the data to be audited does not meet the hijacking identification condition of the HTTP message.
According to one or more embodiments of the present disclosure, there is provided a data auditing system including:
the first determining module is used for determining whether a characteristic value of the to-be-checked data is matched with a preset characteristic value of an illegal resource after the data auditing request is obtained, wherein the to-be-checked data is sent by the client or the server;
and the blocking module, configured to perform blocking processing on the resource corresponding to the data to be audited if it is determined that the characteristic value of the data to be audited matches the characteristic value of a preset illegal resource.
According to one or more embodiments of the present disclosure, a data auditing system is provided, in which the data to be audited is video data; the characteristic value of the data to be audited comprises a characteristic value corresponding to at least one key frame image in the video data;
the first determining module includes:
the matching sub-module is used for matching the characteristic value corresponding to the key frame image with the characteristic value of a preset illegal resource;
and the determining sub-module is used for determining that the characteristic value of the data to be audited is matched with the preset characteristic value of the illegal resource if the key frame image matched with the preset characteristic value of the illegal resource exists.
According to one or more embodiments of the present disclosure, a data auditing system is provided, where the system further includes:
the output module, configured to output prompt information indicating that the data to be audited is illegal resource data; and/or
the forbidding module, configured to forbid the client from accessing or uploading the data to be audited.
According to one or more embodiments of the present disclosure, a data auditing system is provided, where the system further includes:
the acquisition module is used for responding to the received hijack identification request and acquiring the hijack identification condition corresponding to the data to be audited sent by the client or the server;
and the generating module is used for automatically generating the data auditing request if the data to be audited does not meet the hijacking identification condition.
According to one or more embodiments of the present disclosure, a data auditing system is provided, where the hijacking identification condition includes a TCP packet hijacking identification condition; the system further comprises:
the first analysis module is used for carrying out TCP protocol analysis on the data to be audited so as to obtain a TCP message of the data to be audited;
and the second determining module is used for determining that the data to be audited does not meet the TCP message hijack identification condition if the TCP message has the preset mark.
According to one or more embodiments of the present disclosure, a data auditing system is provided, where the hijacking identification condition includes an HTTP message hijacking identification condition; the system further comprises:
the second analysis module is used for carrying out HTTP protocol analysis on the data to be audited so as to obtain an HTTP message of the data to be audited;
and the third determining module, configured to determine that the data to be audited does not meet the HTTP message hijacking identification condition if the header information of the HTTP message and/or the parameters of the HTTP message satisfy the preset algorithm correspondence, and/or the characteristic value of the data to be audited is consistent with the characteristic value carried in the header and/or the parameters of the HTTP message.
The foregoing description is merely an explanation of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combinations of features described above, but also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features disclosed in this disclosure having similar functions.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (9)
1. A data auditing method is characterized by being applied to a data auditing system, wherein the data auditing system is positioned between a client and a server, and the method comprises the following steps:
in response to receiving a hijacking identification request, acquiring a hijacking identification condition corresponding to-be-audited data sent by the client or the server;
if the data to be audited does not meet the hijack identification condition, automatically generating a data audit request;
after the data auditing request is acquired: if the data is data uploaded by the client, determining, before the server issues the resource corresponding to the data, whether the characteristic value of the data to be audited matches the characteristic value of a preset illegal resource; or, if the data is data issued by the server, determining, before the client accesses the data, whether the characteristic value of the data to be audited matches the characteristic value of the preset illegal resource;
and if it is determined that the characteristic value of the data to be audited matches the characteristic value of the preset illegal resource, performing blocking processing on the resource corresponding to the data to be audited.
2. The method of claim 1, wherein the data to be audited is video data; the characteristic value of the data to be audited comprises a characteristic value corresponding to at least one key frame image in the video data;
the determining whether the characteristic value of the data to be checked is matched with the preset characteristic value of the illegal resource includes:
matching the characteristic value corresponding to the key frame image with the characteristic value of a preset illegal resource;
and if the key frame image matched with the preset characteristic value of the illegal resource exists, determining that the characteristic value of the data to be audited is matched with the preset characteristic value of the illegal resource.
3. The method according to claim 1, wherein if it is determined that the characteristic value of the data to be audited matches the characteristic value of a preset illegal resource, the method further comprises:
outputting prompt information for indicating that the data to be audited is illegal resource data; and/or
forbidding the client from accessing or uploading the data to be audited.
4. The method according to claim 1, wherein the hijacking identification condition comprises a TCP message hijacking identification condition; the method further comprises the following steps:
analyzing the TCP protocol of the data to be audited to obtain a TCP message of the data to be audited;
and if the TCP message has a preset mark, determining that the data to be audited does not meet the TCP message hijack identification condition.
5. The method according to claim 1, wherein the hijacking identification condition comprises an HTTP message hijacking identification condition; the method further comprises the following steps:
performing HTTP protocol analysis on the to-be-audited data to obtain an HTTP message of the to-be-audited data;
and if the header information of the HTTP message and/or the parameters of the HTTP message meet the corresponding relation of a preset algorithm, and/or the characteristic value of the data to be audited is consistent with the characteristic value carried in the header and/or the parameters of the HTTP message, determining that the data to be audited does not meet the hijacking identification condition of the HTTP message.
6. A data auditing system, characterized in that the data auditing system is located between a client and a server, comprising:
the acquisition module is used for responding to the received hijack identification request and acquiring the hijack identification condition corresponding to the data to be audited sent by the client or the server;
the generation module, configured to automatically generate a data auditing request if the data to be audited does not meet the hijacking identification condition;
the first determining module, configured to, after a data auditing request is acquired: if the data is data uploaded by the client, determine, before the server issues the resource corresponding to the data, whether the characteristic value of the data to be audited matches the characteristic value of a preset illegal resource; or, if the data is data issued by the server, determine, before the client accesses the data, whether the characteristic value of the data to be audited matches the characteristic value of the preset illegal resource;
and the blocking module, configured to perform blocking processing on the resource corresponding to the data to be audited if it is determined that the characteristic value of the data to be audited matches the characteristic value of the preset illegal resource.
7. The system of claim 6, wherein the data to be audited is video data; the characteristic value of the data to be audited comprises a characteristic value corresponding to at least one key frame image in the video data;
the first determining module includes:
the matching sub-module is used for matching the characteristic value corresponding to the key frame image with the characteristic value of a preset illegal resource;
and the determining sub-module is used for determining that the characteristic value of the data to be audited is matched with the preset characteristic value of the illegal resource if the key frame image matched with the preset characteristic value of the illegal resource exists.
8. A computer-readable medium, on which a computer program is stored which, when being executed by a processing means, carries out the steps of the method according to any one of claims 1 to 5.
9. An electronic device, comprising:
a storage apparatus having a computer program stored thereon;
and a processing apparatus configured to execute the computer program in the storage apparatus to carry out the steps of the method according to any one of claims 1 to 5.
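The matching logic of claims 6 and 7 can be illustrated with a minimal sketch: compute a characteristic value per key frame image and ban the resource if any value matches a preset illegal-resource value. This is not the patent's implementation; the function names are illustrative, and a SHA-256 digest stands in for whatever feature extraction (e.g. a perceptual hash) a real system would use:

```python
import hashlib

# Preset characteristic values of known illegal resources (illustrative
# entries; a real system would populate these from an audit database).
ILLEGAL_FEATURE_VALUES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def characteristic_value(key_frame: bytes) -> str:
    """Compute a characteristic value for one key frame image.

    A SHA-256 digest is used here only as a stand-in for real
    feature extraction such as perceptual hashing.
    """
    return hashlib.sha256(key_frame).hexdigest()

def audit_video(key_frames: list[bytes]) -> bool:
    """Return True if the video matches a preset illegal resource.

    Per claim 7: a match on ANY key frame counts as a match for
    the whole video.
    """
    return any(
        characteristic_value(frame) in ILLEGAL_FEATURE_VALUES
        for frame in key_frames
    )

def handle_audit_request(key_frames: list[bytes]) -> str:
    # Per claim 6: ban the corresponding resource before it is
    # issued by the server or accessed by the client.
    return "banned" if audit_video(key_frames) else "released"
```

Because the check runs before the resource is issued or accessed, a matched resource is never exposed to the other side; the hijack-identification condition in claim 6 simply gates whether this check is triggered at all.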
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010079917.0A CN111343146B (en) | 2020-02-04 | 2020-02-04 | Data auditing method, system, computer readable medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111343146A (en) | 2020-06-26 |
CN111343146B (en) | 2022-08-09 |
Family
ID=71181473
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010079917.0A Active CN111343146B (en) | 2020-02-04 | 2020-02-04 | Data auditing method, system, computer readable medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111343146B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11888889B2 (en) | 2022-03-03 | 2024-01-30 | Uab 360 It | Securing against network vulnerabilities |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112015870B (en) * | 2020-09-14 | 2024-06-07 | 支付宝(杭州)信息技术有限公司 | Data uploading method and device |
CN112214475B (en) * | 2020-11-04 | 2023-07-07 | 成都中科大旗软件股份有限公司 | Method, system, storage medium and terminal for configuring multiple data sources |
CN113342849B (en) * | 2021-05-28 | 2024-06-07 | 百果园技术(新加坡)有限公司 | Data auditing method and device, electronic equipment and storage medium |
CN114205645B (en) * | 2021-12-10 | 2024-06-14 | 北京凯视达信息技术有限公司 | Distributed video content auditing method and device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104320677A (en) * | 2014-10-17 | 2015-01-28 | 深圳市同洲电子股份有限公司 | Audit server, a master control server and a video detection system |
CN104954386A (en) * | 2015-06-30 | 2015-09-30 | 百度在线网络技术(北京)有限公司 | Network anti-hijacking methods and device |
CN106686395A (en) * | 2016-12-29 | 2017-05-17 | 北京奇艺世纪科技有限公司 | Illegal-video live-broadcast detection method and system |
CN110012302A (en) * | 2018-01-05 | 2019-07-12 | 阿里巴巴集团控股有限公司 | A kind of network direct broadcasting monitoring method and device, data processing method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104410872A (en) * | 2014-11-04 | 2015-03-11 | 深圳市同洲电子股份有限公司 | Method and device for checking video sources |
Similar Documents
Publication | Title |
---|---|
CN111343146B (en) | Data auditing method, system, computer readable medium and electronic equipment |
CN110275723A (en) | Obtain method, apparatus, electronic equipment and the readable medium of resource |
CN112269959B (en) | Control method and device for display content, readable medium and electronic equipment |
CN111163324B (en) | Information processing method and device and electronic equipment |
CN116361121A (en) | Abnormal interface alarm method, device, electronic equipment and computer readable medium |
CN109635558B (en) | Access control method, device and system |
CN113032345A (en) | File processing method, device, terminal and non-transitory storage medium |
US9288189B2 (en) | Retrieving both sensitive and non-sensitive content in a secure manner |
CN110719499B (en) | Video downloading method, system, medium and electronic device |
CN116664849B (en) | Data processing method, device, electronic equipment and computer readable medium |
CN112612919A (en) | Video resource association method, device, equipment and medium |
CN116644249A (en) | Webpage authentication method, webpage authentication device, webpage authentication medium and electronic equipment |
CN111310145A (en) | User right verification method and device and electronic equipment |
CN113031950B (en) | Picture generation method, device, equipment and medium |
CN111885006B (en) | Page access and authorized access method and device |
CN114741686A (en) | Method and device for detecting program white list and related equipment |
CN113987471A (en) | Executable file execution method and device, electronic equipment and computer readable medium |
CN115348472A (en) | Video identification method and device, readable medium and electronic equipment |
CN113760724A (en) | Automatic testing method and device, electronic equipment and computer readable medium |
CN112149019A (en) | Method, apparatus, electronic device, and computer-readable medium for displaying information |
CN110633566A (en) | Intrusion detection method, device, terminal equipment and medium |
CN114356788B (en) | Application program detection method, device, equipment and medium based on user information |
CN111953680B (en) | Anti-hijacking method, device, medium and electronic equipment for content distribution network |
CN111371745B (en) | Method and apparatus for determining SSRF vulnerability |
CN112261659B (en) | Control method and device for terminal and server, terminal and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||