GB2539900A - A method, an apparatus and a computer program product for machine learning - Google Patents

A method, an apparatus and a computer program product for machine learning

Info

Publication number
GB2539900A
GB2539900A
Authority
GB
United Kingdom
Prior art keywords
data
signature
dataset
acquisition unit
data stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1511399.6A
Other versions
GB201511399D0 (en)
Inventor
Fan Lixin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Technologies Oy
Original Assignee
Nokia Technologies Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy filed Critical Nokia Technologies Oy
Priority to GB1511399.6A priority Critical patent/GB2539900A/en
Publication of GB201511399D0 publication Critical patent/GB201511399D0/en
Publication of GB2539900A publication Critical patent/GB2539900A/en
Withdrawn legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/554Detecting local intrusion or implementing counter-measures involving event detection and direct action
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/86Secure or tamper-resistant housings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00Computing arrangements based on specific mathematical models
    • G06N7/01Probabilistic graphical models, e.g. probabilistic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B29/00Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
    • G08B29/02Monitoring continuously signalling or alarm systems
    • G08B29/04Monitoring of the detection circuits
    • G08B29/046Monitoring of the detection circuits prevention of tampering with detection circuits

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computing Systems (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Signal Processing (AREA)
  • Algebra (AREA)
  • Probability & Statistics with Applications (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A method, apparatus and computer program comprising: receiving a real-time data stream from a data acquisition unit 410, 420; applying a set of classifiers to data items from the data stream to obtain a signature 432 of a dataset; comparing the signature to pre-stored signatures 435; and, if they are different, determining a tampering of the data acquisition unit. The method may also determine a Bayes classification risk and use that risk as the signature. The data stream may be received via a data transfer network and may comprise audio and/or video data, and the data acquisition unit may be a surveillance system.

Description

A METHOD, AN APPARATUS AND A COMPUTER PROGRAM PRODUCT FOR MACHINE LEARNING
TECHNICAL FIELD
[0001] The present embodiments generally relate to machine learning. In particular, the present embodiments relate to decision trees and generalization error estimation.
BACKGROUND
[0002] This section is intended to provide a background or context to the invention that is recited in the claims. The description herein may include concepts that could be pursued, but are not necessarily ones that have been previously conceived or pursued. Therefore, unless otherwise indicated herein, what is described in this section is not prior art to the description and claims in this application and is not admitted to be prior art by inclusion in this section.
[0003] Tampering of surveillance cameras is a problem. Therefore, methods for detecting such tampering have been designed. According to a known method, incoming video frames from the surveillance camera are stored in short-term and long-term pools. The frames are compared with each other to detect tampering. According to another known method, global changes in an image are detected with respect to a reference image. If there are no global changes, an object in the image is identified and an event analysis on the object is performed.
[0004] There is a need for an improved method for detecting a tampering of a data acquisition system.
SUMMARY
[0005] Various aspects of examples of the invention are provided in the detailed description.
[0006] According to a first aspect, there is provided a computer-implemented method comprising: receiving a real-time data stream from a data acquisition unit, the real-time data stream comprising data items; applying a set of classifiers to the data items to obtain a signature of a dataset; comparing the signature of the dataset to prestored signatures; and if the signature of the dataset is different from the prestored signatures, determining a tampering of the data acquisition unit.
[0007] According to a second aspect, there is provided an apparatus comprising at least one processor and memory including computer program code, the memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: receive a real-time data stream from a data acquisition unit, the real-time data stream comprising data items; apply a set of classifiers to the data items to obtain a signature of a dataset; compare the signature of the dataset to prestored signatures; and if the signature of the dataset is different from the prestored signatures, determine a tampering of the data acquisition unit.
[0008] According to a third aspect, there is provided an apparatus comprising at least processing means and memory means, and further comprising: means for receiving a real-time data stream from a data acquisition unit, the real-time data stream comprising data items; means for applying a set of classifiers to the data items to obtain a signature of a dataset; means for comparing the signature of the dataset to prestored signatures; and means for determining a tampering of the data acquisition unit if the signature of the dataset is different from the prestored signatures.
[0009] According to a fourth aspect, there is provided a computer program product embodied on a non-transitory computer readable medium, comprising computer program code configured to, when executed on at least one processor, cause an apparatus or a system to: receive a real-time data stream from a data acquisition unit, the real-time data stream comprising data items; apply a set of classifiers to the data items to obtain a signature of a dataset; compare the signature of the dataset to prestored signatures; and if the signature of the dataset is different from the prestored signatures, determine a tampering of the data acquisition unit.
[0010] According to an embodiment, a real-time data stream is received via a data transfer network.
[0011] According to an embodiment, a Bayes Classification Risk is determined, and the Bayes Classification Risk is used as the signature.
[0012] According to an embodiment, the data stream comprises one or more of the following: audio data, video data.
[0013] According to an embodiment, the data acquisition unit is a surveillance system.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] For a more complete understanding of example embodiments of the present invention, reference is now made to the following descriptions taken in connection with the accompanying drawings, in which:
[0015] Figure 1 shows a block chart of an apparatus according to an embodiment;
[0016] Figure 2 shows examples of tampering of surveillance cameras;
[0017] Figure 3 shows a comparison of untampered and tampered data streams being input to a set of selected classifiers;
[0018] Figure 4 shows a tampering detection unit according to an embodiment; and
[0019] Figure 5 is a flowchart illustrating a method according to an embodiment.
DETAILED DESCRIPTION OF SOME EXAMPLE EMBODIMENTS
[0020] In the following, several embodiments of the invention will be described in the context of machine learning, in particular decision trees and generalization error estimation. The embodiments of the invention can be utilized e.g. with surveillance microphones and cameras to detect a possible tampering. The embodiments can also be utilized in any dynamic data capturing, where tampering may exist.
[0021] Dynamic data capturing is known from e.g. surveillance microphones and cameras. Such dynamic data capturing may suffer from malicious tampering. Examples of tampering of surveillance cameras are shown in Figure 2. For example, a camera may be blocked 210, the camera may be redirected 220 or defocused 230 or the camera lens may be spray-painted 240. Such tampering should be detected fast enough so that the surveillance camera can be restored to the normal condition. When tampering is detected, e.g. by automatic detection, an alarm may be generated to inform a security agency of the tampering.
[0022] Known methods for detecting tampering rely on comparing new input data with stored older data. In such methods the storage for older data quickly becomes relatively large, and the comparison therefore takes more processing time. This diminishes the usefulness of tampering detection for real-time surveillance systems. In addition, approaches based on pre-recorded old data may not work well in situations where the definition of normal/abnormal events (e.g. the presence of a certain person) varies significantly and has to be defined dynamically.
[0023] The present embodiments aim to improve the known technology for detecting tampering. In the present embodiments, the detection is based on the estimation of the decision tree generalization error. The decision tree is considered to be one of the most widely used classifiers/regressors for many statistical inference applications. A decision tree is used to build a classification or regression model in a tree structure. The purpose of the decision tree is to break down a dataset into smaller subsets. The decision tree is then presented as decision nodes and leaf nodes. A decision node has two or more branches, whereas a leaf node represents a classification or decision. The root node is the topmost decision node in a tree. The decision tree is built top-down from the root node, and involves partitioning the complete data into subsets that contain elements with similar values. In a classification tree, a variable may be categorized into different classes, whereas in a regression tree the variable does not have a class, and the decision tree is based on one or more predictors.
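As an editorial illustration of the tree structure just described (not part of the original specification), the following minimal Python sketch fits a small classification tree and inspects the per-leaf sample counts on which the later error estimates are built. Scikit-learn and the synthetic data are assumptions chosen only to make the sketch runnable:

```python
# Illustrative sketch only; scikit-learn is an assumed example library.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))              # 200 samples, 4 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # two classes derived from the features

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

# Every training sample lands in exactly one leaf node; the per-leaf
# class counts are the statistics the surrounding text refers to.
leaf_ids = tree.apply(X)
for leaf in np.unique(leaf_ids):
    mask = leaf_ids == leaf
    print(f"leaf {leaf}: n={mask.sum()}, class counts={np.bincount(y[mask], minlength=2)}")
```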
[0024] Estimating generalization errors of decision tree classifiers is important. The generalization error of a decision tree classifier can be estimated by monitoring the statistics of labelled samples put into the leaf nodes of the decision tree classifier in question.
[0025] With respect to terminology, “training error” refers to the errors that a classifier makes over the labelled training data. The training error can be measured directly by comparing the classifier's labels with the ground-truth labels of the training data. The term “generalization error”, on the other hand, refers to the expected future error rate that the classifier makes over independent and identically distributed (i.i.d.) draws of data samples. Since these data samples are often unlabelled and unknown beforehand in practice, the generalization error cannot be measured directly. Nevertheless, the testing data are assumed to be sampled from the same probability distribution as the training data, so one is able to estimate the generalization error based on the measured training error, provided that the estimation is unbiased and consistent.
[0026] Alternatively, one can split the labelled training data into two subsets, i.e. the training set and the validation set. The classifier is then trained using the data in the training set, and the validation error is measured directly using the labelled data in the validation set. The validation error is an unbiased and consistent estimate of the generalization error, in the sense that it converges to the true generalization error as the number of validation samples goes to infinity.
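A minimal sketch of this validation-split estimate, again assuming scikit-learn and synthetic stand-in data:

```python
# Sketch: estimate the generalization error from a held-out validation set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 4))
y = (X[:, 0] - X[:, 2] > 0).astype(int)

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=1)
tree = DecisionTreeClassifier().fit(X_train, y_train)

train_error = 1.0 - tree.score(X_train, y_train)  # typically near zero: a biased estimate
val_error = 1.0 - tree.score(X_val, y_val)        # unbiased estimate of the generalization error
print(f"training error: {train_error:.3f}, validation error: {val_error:.3f}")
```

The gap between the two printed numbers illustrates the bias discussed in the next paragraph.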
[0027] In machine learning, a decision tree is constructed by recursively splitting the training data into a number of coherent subsets, where each subset of samples exhibits higher and higher coherence or purity. Often the decision tree branches are repeatedly expanded until all samples thrown into the leaf nodes are of the same class label. While the decision tree learning approach is intuitive and demonstrates successful performance for many machine learning problems, the training error of decision trees is highly biased and underestimates the generalization error because of the small-sized samples thrown into the leaf nodes.
[0028] According to an example, to make an accurate estimate of the generalization error that takes the small sample size into account, the posterior class probability can be modeled with a beta distribution whose mean and variance depend on the number of samples put into a leaf node:

$$\mu_i = \frac{n_i + n_s}{N + 2 n_s}, \qquad \sigma_i^2 = \frac{(n_i + n_s)(N - n_i + n_s)}{(N + 2 n_s)^2 (N + 2 n_s + 1)}$$

[0029] in which $n_i$ is the number of samples thrown into the leaf node that belong to class $C_i$, $N = \sum_i n_i$ is the total sample size, and $n_s$ is the prior number of pseudo-samples, which equals 0, 1 or ½.
[0030] The generalization error of a leaf node is then estimated as the summation of two terms, one derived from the posterior mean $\mu_i$ and one from the posterior variance $\sigma_i^2$ above.
[0031] The total generalization error is taken as the weighted average of the generalization errors of the individual leaf nodes,

$$\varepsilon = \sum_{l} w_l \, \varepsilon_l, \qquad w_l = \frac{N_l}{\sum_{l'} N_{l'}},$$

in which the weight $w_l$ is proportional to the number of training data samples put into leaf node $l$, and $\varepsilon_l$ is the generalization error estimated for that leaf node.
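A sketch of the leaf-wise estimate above for a two-class tree. The exact form of the two-term sum is not recoverable from this text, so using the posterior mean plus one posterior standard deviation below is an assumption made purely for illustration; the pseudo-count $n_s$ and the per-leaf weighting follow the definitions above:

```python
import numpy as np

def leaf_error_estimate(n_wrong, n_total, n_s=0.5):
    """Beta-posterior estimate of a leaf's error rate with n_s pseudo-samples.

    Returns the posterior mean plus one posterior standard deviation;
    this particular two-term sum is an assumed illustrative form.
    """
    a = n_wrong + n_s              # pseudo-count, error class
    b = n_total - n_wrong + n_s    # pseudo-count, majority class
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean + np.sqrt(var)

def tree_error_estimate(leaf_stats, n_s=0.5):
    """Weighted average over leaves; weights proportional to samples per leaf."""
    total = sum(n for _, n in leaf_stats)
    return sum((n / total) * leaf_error_estimate(w, n, n_s)
               for w, n in leaf_stats)

# leaf_stats: (misclassified, total) pairs, one per leaf node
print(tree_error_estimate([(0, 3), (1, 40), (2, 157)]))
```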
[0032] Bayesian theory is a known way of formulating and dealing with statistical decision problems. A common way of evaluating a decision rule is by computing a Bayes risk. The Bayes risk function yields a real number for each decision rule, and the Bayes risk and the expected loss are equivalent, i.e. they lead to the same decision rule.
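For reference, the textbook definition being invoked here (a standard formula supplied editorially, not reproduced from the specification): the Bayes risk of a decision rule $\delta$ is the loss $L$ expected under the joint distribution of the unknown state $\theta$ and the observed data $X$,

$$R(\delta) = \mathbb{E}_{\theta, X}\big[\,L(\theta, \delta(X))\,\big].$$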
[0033] The present embodiments for tampering detection are based on the Estimated Bayes Risk and aim to improve the existing technology by representing and storing Bayes Classification Risk signatures in more compact formats than the original input data formats (e.g. video frames, audio), whereby the processing time is also reduced significantly, fulfilling the real-time surveillance requirements. In addition, even if the definition of normal/abnormal events (e.g. the presence of certain people) has to be defined dynamically, these classifier signatures can be used to detect potential data tampering.
[0034] The tampering detection can be implemented by an apparatus, an embodiment of which is shown in Figure 1. The apparatus 50 is an electronic device for computing. The apparatus 50 may comprise a housing for incorporating and protecting the device. The apparatus 50 further may comprise a display 32, for example, a liquid crystal display or any other display technology capable of displaying images and/or videos.
[0035] The apparatus 50 may further comprise a keypad 34. According to another embodiment, any suitable data or user interface mechanism may be employed. For example, the user interface may be implemented as a virtual keyboard or data entry system as part of a touch-sensitive display. The apparatus may comprise a microphone 36 or any suitable audio input which may be a digital or analogue signal input. The apparatus 50 may further comprise an audio output device 38. The apparatus 50 may also comprise a battery. The apparatus may comprise one or more cameras 42 capable of recording or capturing images and/or video, or may be connected to one.
[0036] The apparatus 50 may comprise a controller 56 or processor for controlling the apparatus. The controller 56 may be connected to memory 58 which, according to an embodiment, may store data in the form of image and audio data and/or may store instructions for implementation on the controller 56. The controller 56 may further be connected to codec circuitry 54 suitable for carrying out coding and decoding of audio and/or video data, or for assisting in coding and decoding carried out by the controller 56.
[0037] According to an embodiment, the camera 42 is capable of recording or detecting individual video frames which are then passed to the codec 54 or controller for processing. According to an embodiment, the apparatus may receive the video image data for processing from another device prior to transmission and/or storage. According to an embodiment, the apparatus 50 may receive the images for processing either wirelessly or by a wired connection.
[0038] According to an embodiment, the apparatus 50 may comprise an infrared port for short range line of sight communication to other devices. According to an embodiment, the apparatus 50 may further comprise any suitable short range communication solution such as for example a Bluetooth wireless connection or a USB/firewire wired solution. The apparatus 50 may comprise radio interface circuitry 52 connected to the controller and suitable for generating wireless communication signals for example for communication with a cellular communications network, a wireless communications system or a wireless local area network. The apparatus 50 may further comprise an antenna 44 connected to the radio interface circuitry 52 for transmitting radio frequency signals generated at the radio interface circuitry 52 to other apparatus(es) and for receiving radio frequency signals from other apparatus(es).
[0039] The apparatus 50 may be a surveillance camera incorporating a tampering detection unit, or the apparatus 50 having a tampering detection unit may be connected to an external surveillance camera to receive video data via a data transfer network, wherein the tampering detection is implemented in the apparatus remotely from the surveillance camera.
[0040] As mentioned, the present embodiments for tampering detection are based on the Estimated Bayes Risk. In the Estimated Bayes Risk method, a trained classifier can be used to predict the feature/label of data items with certain features or labels, and the error rate of making wrong predictions is called the Estimated Bayes Classification Risk (EBCR) of the classifier in question.
[0041] Data items from the same source should exhibit a similar or even identical Bayes Classification Risk (R) for a pre-trained classifier (C). A bank of classifiers (C1, C2, C3, ...) leads to a corresponding set of EBCRs (Err1, Err2, Err3, ...), which forms a signature of data items that are collected from the same source.
[0042] According to an embodiment, data items which are subject to potential attacks can be protected by first applying a set of classifiers to them and then using the corresponding Bayes Classification Risks as the signature of the authentic dataset. Tampered data is detected because it leads to different signatures when the set of selected (and kept secret) classifiers is applied.
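A sketch of forming such a signature. The bank of pre-trained classifiers and the labelled reference data are assumptions here; any objects exposing a scikit-learn-style `predict` method would serve:

```python
import numpy as np

def bcr_signature(classifiers, X, y):
    """Estimated Bayes Classification Risk of each classifier on (X, y).

    The resulting error-rate vector [Err1, Err2, ..., ErrN] serves as a
    compact signature of data items collected from the same source.
    """
    return np.array([np.mean(clf.predict(X) != y) for clf in classifiers])
```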
[0043] Figure 3 illustrates that when original data 310 and tampered data 320 are input to a set of selected classifiers, different Estimated Bayes Classification Risks ([Err1, Err2, ..., ErrN] and [Err1′, Err2′, ..., ErrN′]) are produced. The set or ensemble of classifiers is selected by the learning algorithm to minimize the training error.
[0044] For detecting the tampered data, the Estimated Bayes Classification Risks [Err1′, Err2′, ..., ErrN′] and [Err1, Err2, ..., ErrN] are compared. If [Err1′, Err2′, ..., ErrN′] differs from [Err1, Err2, ..., ErrN], tampering is detected. The difference between the two classification risks shall be a statistically significant margin (e.g. greater than three standard deviations of the risks). The tampering detection unit can be integrated as a component of a real-time surveillance system, as shown in Figure 4.
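A sketch of this comparison step, taking the three-standard-deviation margin from the text as the decision threshold; the per-classifier risk deviations `risk_std` are assumed to have been estimated beforehand from untampered reference data:

```python
import numpy as np

def is_tampered(signature, stored_signature, risk_std, k=3.0):
    """Flag tampering when any classifier's risk deviates from the stored
    signature by a statistically significant margin (> k standard deviations)."""
    return bool(np.any(np.abs(np.asarray(signature) - np.asarray(stored_signature))
                       > k * np.asarray(risk_std)))
```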
[0045] Figure 4 illustrates a video acquisition unit 410 and an audio acquisition unit 420. Either or both of them can be part of a surveillance system. As mentioned, a tampering detection unit 430 can also be integrated in the surveillance system. Alternatively, the tampering detection unit 430 may be located on a security server, whereupon the data from the video acquisition unit 410 and the audio acquisition unit 420 is received through a wireless or wired data transfer network. The tampering detection unit 430 is configured to apply classifiers 1, 2, ..., N to the data stream received from either or both of the data acquisition units 410, 420 in order to obtain Bayes Classification Risk signatures (BCR signatures) 432. The BCR signatures are stored in a BCR Signatures Database 435. The newly obtained BCR signatures 432 are compared to the stored BCR signatures in the database 435 in order to detect tampering. If the newly obtained BCR signatures 432 differ from the signatures in the database 435, tampering is detected.
[0046] The present embodiments provide various advantages. First, Bayes Classification Risk signatures can be represented and stored in much more compact formats than the original input data formats (e.g. video frames); therefore the processing time can be reduced significantly, fulfilling the real-time surveillance requirements. Secondly, even if the definition of normal/abnormal events (e.g. the presence of certain people) has to be defined dynamically, these classifier signatures can still be used to detect potential data tampering.
[0047] Figure 5 is a flowchart illustrating a method according to an embodiment. The method comprises receiving a real-time data stream from a data acquisition unit, the real-time data stream comprising data items 510; applying a set of classifiers to the data items to obtain a signature of a dataset 520; comparing the signature of the dataset to prestored signatures 530; and, if the signature of the dataset is different from the prestored signatures 540, determining a tampering of the data acquisition unit 550.
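Tying the steps together, a sketch of the Figure 5 flow using the `bcr_signature` and `is_tampered` helpers sketched earlier; the streaming interface `next_batch` is a hypothetical placeholder, not an API from the specification:

```python
def detect_tampering(stream, classifiers, signature_db, risk_std):
    """Sketch of the Figure 5 flow: receive (510), sign (520), compare
    (530/540) and, on mismatch with every prestored signature, flag (550)."""
    X, y = stream.next_batch()                      # 510: data items from the real-time stream
    sig = bcr_signature(classifiers, X, y)          # 520: signature of the dataset
    for stored in signature_db:                     # 530: compare to prestored signatures
        if not is_tampered(sig, stored, risk_std):  # 540: a match -> no tampering detected
            return False
    return True                                     # 550: tampering of the acquisition unit
```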
[0048] The various embodiments of the invention can be implemented with the help of computer program code that resides in a memory and causes the relevant apparatuses to carry out the invention. For example, a device may comprise circuitry and electronics for handling, receiving and transmitting data, computer program code in a memory, and a processor that, when running the computer program code, causes the device to carry out the features of an embodiment. Yet further, a network device like a server may comprise circuitry and electronics for handling, receiving and transmitting data, computer program code in a memory, and a processor that, when running the computer program code, causes the network device to carry out the features of an embodiment.
[0049] It is obvious that the present invention is not limited solely to the above-presented embodiments, but it can be modified within the scope of the appended claims.

Claims (12)

1. A computer-implemented method, comprising: receiving a real-time data stream from a data acquisition unit, the real-time data stream comprising data items; applying a set of classifiers to the data items to obtain a signature of a dataset; comparing the signature of the dataset to prestored signatures; and if the signature of the dataset is different from the prestored signatures, determining a tampering of the data acquisition unit.
2. The method according to claim 1, further comprising receiving a real-time data stream via a data transfer network.
3. The method according to claim 1 or 2, further comprising determining a Bayes Classification Risk, and using the Bayes Classification Risk as the signature.
4. The method according to any of the preceding claims 1 to 3, wherein the data stream comprises one or more of the following: audio data, video data.
5. The method according to any of the preceding claims 1 to 4, wherein the data acquisition unit is a surveillance system.
6. An apparatus comprising at least one processor and memory including computer program code, the memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: receive a real-time data stream from a data acquisition unit, the real-time data stream comprising data items; apply a set of classifiers to the data items to obtain a signature of a dataset; compare the signature of the dataset to prestored signatures; and if the signature of the dataset is different from the prestored signatures, determine a tampering of the data acquisition unit.
7. The apparatus according to claim 6, further comprising computer program code to cause the apparatus to receive the real-time data stream via a data transfer network.
8. The apparatus according to claim 6 or 7, further comprising computer program code to cause the apparatus to determine a Bayes Classification Risk, and to use the Bayes Classification Risk as the signature.
9. The apparatus according to any of the preceding claims 6 to 8, wherein the data stream comprises one or more of the following: audio data, video data.
10. The apparatus according to any of the preceding claims 6 to 9, wherein the data acquisition unit is a surveillance system.
11. An apparatus comprising at least processing means and memory means, and further comprising: means for receiving a real-time data stream from a data acquisition unit, the real-time data stream comprising data items; means for applying a set of classifiers to the data items to obtain a signature of a dataset; means for comparing the signature of the dataset to prestored signatures; and means for determining a tampering of the data acquisition unit if the signature of the dataset is different from the prestored signatures.
12. A computer program product embodied on a non-transitory computer readable medium, comprising computer program code configured to, when executed on at least one processor, cause an apparatus or a system to: receive a real-time data stream from a data acquisition unit, the real-time data stream comprising data items; apply a set of classifiers to the data items to obtain a signature of a dataset; compare the signature of the dataset to prestored signatures; and if the signature of the dataset is different from the prestored signatures, determine a tampering of the data acquisition unit.
GB1511399.6A 2015-06-30 2015-06-30 A method, an apparatus and a computer program product for machine learning Withdrawn GB2539900A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1511399.6A GB2539900A (en) 2015-06-30 2015-06-30 A method, an apparatus and a computer program product for machine learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1511399.6A GB2539900A (en) 2015-06-30 2015-06-30 A method, an apparatus and a computer program product for machine learning

Publications (2)

Publication Number Publication Date
GB201511399D0 GB201511399D0 (en) 2015-08-12
GB2539900A true GB2539900A (en) 2017-01-04

Family

ID=53872391

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1511399.6A Withdrawn GB2539900A (en) 2015-06-30 2015-06-30 A method, an apparatus and a computer program product for machine learning

Country Status (1)

Country Link
GB (1) GB2539900A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022115178A1 (en) * 2020-11-30 2022-06-02 Microsoft Technology Licensing, Llc Methods and systems for recognizing video stream hijacking on edge devices

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001030064A1 (en) * 1999-10-15 2001-04-26 Koninklijke Philips Electronics N.V. Image and video authentication system
EP1298913A2 (en) * 2001-09-28 2003-04-02 Eastman Kodak Company System and method of authenticating a digitally captured image
US20040153647A1 (en) * 2003-01-31 2004-08-05 Rotholtz Ben Aaron Method and process for transmitting video content
EP1936576A1 (en) * 2006-12-20 2008-06-25 Axis AB Camera tampering detection
CN101441771A (en) * 2008-12-19 2009-05-27 中国科学技术大学 Video fire hazard smoke detecting method based on color saturation degree and movement mode
CN103413143A (en) * 2013-07-29 2013-11-27 西北工业大学 Video target tracking method based on dynamic sparse projection
WO2014061922A1 (en) * 2012-10-17 2014-04-24 SK Telecom Co., Ltd. Apparatus and method for detecting camera tampering using edge image
CN104602015A (en) * 2014-12-31 2015-05-06 西安蒜泥电子科技有限责任公司 Real-time video monitoring encryption and authentication method

Also Published As

Publication number Publication date
GB201511399D0 (en) 2015-08-12

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)