GB2578311A - Method and apparatus for controlling a mobile camera - Google Patents
- Publication number
- GB2578311A (application GB1817173.6A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- video data
- server
- time
- location
- query
- Legal status
- Withdrawn
Classifications
- H04N21/4223 — Cameras (as input peripherals of client devices)
- G06F16/7867 — Retrieval of video data using manually generated metadata, e.g. tags, keywords, time, location and usage information
- G06F16/487 — Retrieval of multimedia data using geographical or spatial metadata, e.g. location
- G08B13/19656 — Intruder alarm systems using television cameras: network used to communicate with a camera, e.g. WAN, LAN, Internet
- H04N21/21805 — Source of audio or video content enabling multiple viewpoints, e.g. using a plurality of cameras
- H04N21/2187 — Live feed
- H04N21/2743 — Video hosting of uploaded data from client
- H04N21/41422 — Specialised client platforms located in transportation means, e.g. personal vehicle
- H04N23/661 — Transmitting camera control signals through networks, e.g. control via the Internet
- H04N7/181 — Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
- H04N7/185 — Closed-circuit television [CCTV] systems for receiving images from a mobile camera, e.g. for remote control
- H04N1/2179 — Intermediate information storage: interfaces allowing access to a plurality of users, e.g. connection to electronic image libraries
- H04N21/21 — Server components or server architectures
- H04N5/77 — Interface circuits between a recording apparatus and a television camera
- H04W4/185 — Information format or content conversion by embedding added-value information into content, e.g. geo-tagging
Abstract
A method of controlling a mobile camera 300, such as a vehicle dashcam, is performed at a server 200. The server 200 receives a plurality of live video data streams from one or more mobile cameras 300, and stores data indicative of a time and a location at which each live video data stream was recorded. The server 200 receives a query 110 for video data, the query 110 specifying a time and a location. The server 200 identifies a camera 300 that recorded a video data stream at a time and a location satisfying the query 110 and sends one or more instructions to the identified camera 300 to control the identified camera 300. Preferably, the instructions cause the identified camera 300 to transmit video data to the server 200 at a higher resolution, store recorded video in a video data file, transmit a video data file to the server 200, or start transmitting a live video stream to the server 200. A method of providing video data, performed at a server, includes outputting a video data stream recorded at a location specified in a query.
Description
METHOD AND APPARATUS FOR CONTROLLING A MOBILE CAMERA
FIELD
The present disclosure relates to methods and apparatuses for controlling a mobile camera. In particular, but not exclusively, the disclosure relates to controlling a plurality of mobile cameras by a server to obtain video data recorded at a particular time and location.
BACKGROUND
There is an increasing presence and use of cameras in vehicles. Mobile cameras in vehicles, commonly known as "dash cams", are increasingly used to monitor events occurring in the vicinity of the vehicles in which they are installed. A user may turn on a dash cam when using a vehicle to record footage during the journey of the vehicle. If an event occurs that is of interest to the dash cam user, they can select to store the footage the camera has taken for further use. The footage may be footage of an incident that occurred in the vicinity of the vehicle, such as an accident, a traffic violation, a crime committed on the road, etc. If the footage is not stored within a predetermined amount of time, the footage may be overwritten by newly recorded footage.
If an event occurs, mobile cameras in the vicinity of the event may have recorded footage relevant to the event. However, there is no straightforward way to gain an understanding of whether and what footage might be available of an event.
Furthermore, there is no straightforward way to collect such footage from each of the mobile cameras on which it is recorded.
SUMMARY
In one aspect, the present disclosure provides a method of controlling a mobile camera, the method being performed at a server and comprising: receiving a plurality of live video data streams from one or more mobile cameras; storing data indicative of a time and a location at which each live video data stream was recorded; receiving a query for video data, the query specifying a time and a location; identifying a mobile camera that recorded a video data stream at a time and a location satisfying the query; and sending one or more instructions to the identified mobile camera, the instructions being configured to control the identified mobile camera.
The method may further comprise: storing the received plurality of live video data streams, wherein each of the stored video data streams is associated with the data indicative of the time and location at which it was recorded; in response to receiving the query for video data, identifying at least one stored video data stream recorded at a time and a location satisfying the query; and outputting the identified stored video data stream. The query specifying time and location may specify one or more of a period of time and a range of locations.
Various instructions for controlling the identified mobile camera are disclosed. The one or more instructions may comprise an instruction configured to cause the identified mobile camera to transmit video data at a second resolution, the second resolution being higher than a resolution of a live video data stream received from the identified mobile camera. The method may further comprise receiving, in response to the instruction, video data at the second resolution from the identified mobile camera. The one or more instructions may comprise an instruction configured to cause the identified mobile camera to store video data recorded by that camera in a video data file. The video data stored in the video data file may be associated with the time specified in the query. The one or more instructions may comprise an instruction configured to cause the identified mobile camera to transmit a video data file to the server. The method may further comprise receiving, in response to the instruction, the video data file from the identified mobile camera. The one or more instructions may comprise an instruction configured to cause the identified mobile camera to begin transmitting a live video data stream to the server. This instruction may be sent in response to determining that the identified mobile camera is not currently transmitting a live video data stream.
The method may further comprise identifying a plurality of mobile cameras, and sending an instruction to each of the identified mobile cameras to control the identified mobile cameras. Identifying a mobile camera may comprise: determining stored time and location data corresponding to the specified time and location of the query; and determining one or more mobile cameras corresponding to the determined time and location data. Determining the time and location data may comprise identifying time and location data within a predetermined range around the specified time and location data of the query.
Another aspect of the present disclosure provides a method of controlling a mobile camera, the method being performed at the mobile camera and comprising: transmitting a live video data stream to a server; transmitting, to the server, data indicative of a time and a location at which the live video stream was recorded; receiving an instruction from the server, the instruction being configured to control the mobile camera; and in response to receiving the instruction, performing one or more operations in accordance with the received instruction to control the mobile camera.
The method may further comprise: detecting an event using one or more camera sensors; and in response to detection of the event, performing one or more operations to control the camera. The operations may comprise one or more of: initiating a recording of video data; and/or transmitting a live video data stream at a first resolution; and/or transmitting video data at a second resolution, wherein the second resolution is higher than the first resolution of the live video data stream; and/or storing video data in a file; and/or transmitting a video data file to the server.
Another aspect of the present disclosure provides a method of providing video data, the method being performed at a server and comprising: receiving a plurality of live video data streams from one or more mobile cameras, wherein each of the live video data streams is associated with data indicative of a location at which it was recorded; receiving a query for video data, the query specifying a location; identifying at least one live video data stream recorded at a location satisfying the query; and outputting the identified video data stream.
Each of the stored video data streams may be associated with the data indicative of the time at which it was recorded, the received query may specify a time, and identifying at least one live video data stream may comprise identifying a live video data stream recorded at a time and a location satisfying the query.
The method may further comprise storing the received plurality of live video data streams, and outputting the identified video data stream may comprise outputting a stored video data stream. Alternatively or additionally, outputting the identified video data stream may comprise outputting a live video data stream.
The query may specify one or more of a period of time and a range of locations.
In other aspects, the present disclosure provides an apparatus configured to perform any of the methods disclosed herein. In particular, the present disclosure provides a server and/or a mobile camera configured to perform the respective methods disclosed herein. The present disclosure also provides a system including one or more such servers and one or more such mobile cameras.
Another aspect of the present disclosure provides a processor-readable medium comprising instructions which, when executed by a processor, cause the processor to perform any of the methods disclosed herein.
Another aspect of the present disclosure provides a computer program product comprising instructions which, when executed by a computer, cause the computer to perform any of the methods disclosed herein.
BRIEF DESCRIPTION OF THE DRAWINGS
Examples of the disclosure will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic representation of a network comprising a server and a plurality of mobile cameras;
Figure 2 is a schematic representation of the server shown in Figure 1;
Figure 3 is a schematic representation of one of the mobile cameras shown in Figure 1;
Figure 4 is a flow diagram of a method of controlling a mobile camera, as performed by a server;
Figure 5 is a flow diagram of a method of controlling a mobile camera, as performed by the mobile camera; and
Figure 6 is a flow diagram of a method of controlling a mobile camera and responding to a query, as performed by a server.
DETAILED DESCRIPTION
Generally disclosed herein are methods and apparatuses for controlling a mobile camera. Figure 1 depicts a network 100 comprising one or more servers 200 and a plurality of mobile cameras 300, illustrated as cameras 300(a)-(d), wherein the one or more servers are able to communicate with each of the mobile cameras via a respective communication link 500. For the sake of clarity, only one server 200 is shown in Figure 1, and the singular term "server" is used throughout the present description; it should, however, be appreciated that multiple servers may be used (e.g. as a load-balanced server cluster, as a distributed cloud server arrangement, or any other suitable configuration). The mobile cameras may send data over the communication link 500. Specifically, the mobile cameras 300(a)-(d) may send a live video data stream and/or one or more video data files to the server 200. The mobile cameras 300(a)-(d) may also send, to the server 200, information indicating the time and location at which the live video data stream or video data file was recorded. Optionally, the mobile cameras 300(a)-(d) may also send other sensor information (which is described in more detail below) to the server 200. Video data, or other data, with corresponding location data may be referred to as geotagged data. Video data, or other data, with corresponding time data may be referred to as timestamped data.

The server 200 may store the time and location data received from the cameras 300(a)-(d) in a memory. The server 200 may further store the one or more video data files and/or live video data streams in a memory accessible by the server 200. The server 200 is further configured to be able to receive and process one or more queries 110. A query 110 may request video data and may specify a time and location. The server 200 may be configured to search the stored data to find time and location data satisfying the time and location data of the query 110. The server 200 may identify one or more mobile cameras 300 linked to (in other words, associated with) the time and location data satisfying the query 110. Having identified a mobile camera 300, the server 200 may send one or more instructions to the identified camera in order to control that mobile camera 300. In response to the instructions, the server 200 may receive further data from the mobile camera 300. This further data may also be stored by the server 200. The server 200 may provide a response 120 to the query 110. For example, the server 200 may transmit or otherwise provide access to the data stored on the server 200 and/or the further data received from an identified camera 300.

An advantage of receiving a plurality of live video data streams and corresponding location and time data is that the server has a live, up-to-date overview of the location of the plurality of mobile cameras 300 and the video data those cameras are recording. This live data can be used for determining a response 120 to a query 110. Another advantage of this method is that a plurality of cameras can be controlled centrally by the server 200, for example for the collection of video and other data, without relying on local control of a mobile camera 300, for example by a user or owner of the mobile camera 300.
Figure 2 depicts the server 200. The server 200 comprises a processor 210 and a memory 220. The processor 210 is able to execute instructions, stored in the memory 220, to cause the server 200 to perform actions that implement methods described herein. Memory 220 may comprise a data store 230 in which data received by the server 200 may be stored. The data store 230 may alternatively or additionally comprise memory which is physically separate from, but connected to, the server 200. This separate memory may be physically remote from the server 200, and may be connected via a wireless and/or wired connection. The server 200 may further comprise a receiver 240 and a transmitter 250 for receiving and transmitting data, respectively.
Figure 3 depicts a mobile camera 300, sometimes referred to in this description as a camera 300, which comprises a processor 310 and a memory 320. Within this disclosure, a mobile camera 300 is a camera which may be moved during normal operation of the camera. Specifically, the mobile camera 300 may be moved between different geographical locations, and may record imaging data during this movement. For example, a mobile camera may be placed in a vehicle, such that it can record imaging data while the vehicle is in motion. A dash cam is an example of a mobile camera that can be placed in a vehicle and used to implement the present disclosure. However, it will be appreciated that the present disclosure can also be implemented by other types of mobile camera. A mobile camera 300 may also record imaging data while it is stationary. The processor 310 is able to execute instructions, stored in the memory 320, to cause the mobile camera 300 to perform actions that implement the methods described herein. The mobile camera 300 comprises imaging hardware 370 able to record data, such as video data and image stills. Imaging hardware 370 may comprise optical elements (e.g. a lens and an image sensor), electronic elements, connectivity elements, signal processing elements, other processing elements, and/or software elements for capturing, processing, and transmitting imaging data. Imaging data may comprise, for example, video data, a live video data stream, or image stills. The mobile camera 300 further comprises a receiver 340 and a transmitter 350, for receiving and transmitting data, respectively. The data may be transmitted and/or received over the communication link 500 and may comprise, for example, imaging data, or instructions for controlling the mobile camera 300.
It will be appreciated that the memory 320 of the mobile camera 300 is not sufficiently large to store imaging data indefinitely. The memory 320 of the mobile camera 300 may store imaging data in a loop buffer 330, which has a predetermined amount of storage space available for storing data. A loop buffer is a type of first-in first-out (FIFO) data structure, and may be known as a circular buffer, circular queue, ring buffer or cyclic buffer. The memory structure in which the loop buffer 330 is stored may use data files, or may be implemented as raw storage on a storage medium. Purely by way of example, the storage medium may comprise a Secure Digital (SD™) card or a flash memory (e.g., a NAND flash memory). Once all storage space in the loop buffer 330 is filled, newly recorded imaging data may be stored in a memory location already storing data, thus overwriting the older stored data. The loop buffer 330 is able to retain a set amount of data history, while overwriting the oldest saved data with newer recorded data. The length of the time period of historical imaging data that can be saved in the loop buffer 330 depends on the size (i.e. data storage capacity) of the loop buffer 330, the resolution of the imaging data saved in the buffer 330, and the frequency or rate at which new imaging data is produced for storage in the loop buffer 330, and may further depend on other factors. The size of the loop buffer 330 may be changed (i.e., increased or decreased) by the processor 310. For example, the processor 310 may change the size of the loop buffer in response to a command from the server 200. As another example, the processor 310 may change the size of the loop buffer 330 to allow more or less storage space in the memory 320 to be allocated to a file store 360 (described below). A mobile camera 300 may comprise multiple loop buffers 330, which may store different types of data. For example, a first loop buffer may store video data at a first resolution, and a second loop buffer may store video data at a second resolution.
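Purely by way of illustration, the overwrite behaviour of the loop buffer 330 can be sketched in a few lines of Python using a bounded deque; the class and method names below are illustrative assumptions and do not form part of the disclosure.

```python
from collections import deque

class LoopBuffer:
    """Minimal circular buffer for recorded video segments.

    Once `max_segments` is reached, appending a new segment silently
    discards the oldest one, mirroring the overwrite behaviour of the
    loop buffer 330 described above.
    """

    def __init__(self, max_segments: int):
        self._segments = deque(maxlen=max_segments)

    def append(self, segment: bytes) -> None:
        # When the deque is full, the oldest segment is dropped automatically.
        self._segments.append(segment)

    def snapshot(self) -> list[bytes]:
        # Copy of the currently retained history, oldest first.
        return list(self._segments)
```

In this sketch the retained history corresponds to `max_segments` multiplied by the duration of each segment, which mirrors the dependence on buffer size and recording rate described above.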
The memory 320 of the camera 300 may further comprise a file store 360, in which video data files may be saved separate from the loop buffer 330. Unlike imaging data stored in the loop buffer 330, video data files stored in the file store 360 are not automatically overwritten by new recorded data.
Purely by way of example, video data may be stored in the loop buffer 330 in MPEG-2 transport stream ("MPEG-TS") format. Video data files may be stored in the file store 360 in MPEG-4 Part 14 ("MP4") format. Other suitable formats, including proprietary formats, may also be used to store imaging data in the loop buffer 330 and the file store 360. Imaging data in the loop buffer 330 may be converted into a different format for storage in the file store 360.
A mobile camera 300 may further comprise one or more sensors for detecting or measuring one or more characteristics of the camera 300, the camera's environment and/or the imaging data it records. The sensors may include one or more of a location sensor, a gyroscopic sensor, an accelerometer, an infrared sensor, a magnetometer, a thermometer, and a barometer, for example. The location sensor is configured to determine the location of mobile camera 300. The location sensor may be, for example, a Global Navigation Satellite System (GNSS) sensor. The GNSS sensor may be configured to use any suitable navigation satellite system including, but not limited to, the Global Positioning System, Galileo, GLONASS and/or BeiDou. The location sensor need not necessarily be a GNSS sensor. For example, the location sensor may be configured to determine the location of the mobile camera 300 using a land-based positioning system, e.g. by triangulating a position based on a plurality of network access points, such as cellular telephone base stations. The location sensor may be used to determine the velocity of the mobile camera 300, by determining the difference between the location of the camera at two or more points in time.
Methods of controlling a mobile camera 300 will now be described with reference to Figures 4 to 6.

Figure 4 is a flow diagram of a method that can be executed by a server 200 to control a mobile camera 300. In operation 402, the server 200 receives a plurality of live video data streams from a plurality of mobile cameras 300, for example cameras 300(a)-(d) shown in Figure 1. Each mobile camera 300 may provide a separate live video data stream independently from the other cameras and/or data streams. Each mobile camera 300 also provides data indicating the time and location at which the live video stream was recorded by that camera 300. The data indicating the time and location may be encoded in the live video stream itself. For example, each frame of the live video stream may include metadata that indicates the time and location at which that frame was recorded. Alternatively, the data indicating the time and location at which the live video stream was recorded may be provided to the server in a communication that is separate from the live video stream itself.
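As an illustrative sketch of the second option (a side channel separate from the stream itself), a camera might serialise per-segment time and location metadata as follows; the field names and the JSON format are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class SegmentMetadata:
    camera_id: str        # identifier of the mobile camera 300
    recorded_at: float    # Unix timestamp of the first frame in the segment
    latitude: float
    longitude: float

def metadata_message(camera_id: str, lat: float, lon: float) -> str:
    """Build the side-channel message sent to the server for one segment."""
    meta = SegmentMetadata(camera_id, time.time(), lat, lon)
    return json.dumps(asdict(meta))
```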
A live video data stream may be a transmission of video data, by a mobile camera 300, occurring at substantially the same time as the video data is recorded. A live stream of video data may be received by the server 200 with a slight delay relative to when the data was recorded, for example a delay of 5 seconds, 10 seconds, 30 seconds, or 1 minute. The delay between the video data being recorded and the live video data stream being received by the server 200 may be caused by a plurality of factors. These factors may arise at the mobile camera 300 and/or in the communication link 500. The live video data stream must be prepared for transmission to the server 200 by a mobile camera 300. This process may include signal processing of the recorded video data, for example to change the resolution of the video data and/or to change the data format of the video data. This signal processing of the video data takes a finite (non-zero) amount of time. The video data stream may be transmitted across the communication link 500 in data packets, wherein each data packet may comprise video data of a predetermined length of time, for example 1, 5, or 10 seconds. The processing of video data may further comprise preparing these data packets of the video stream. The delay caused by preparation of the data packets may depend on the length of time of video data included in a packet. A mobile camera 300 may wait to collect an amount of video data to fill a data packet, before sending the next data packet. The delay caused by the data packet creation may be at least as long as the duration of video data comprised in a data packet. Once the video data is prepared for sending as part of a live video data stream, the live video data stream will take a finite (non-zero) amount of time to be transmitted over the communication link 500. The capacity and data transfer speed of the communication link 500 may influence the delay with which the live video data stream is received by the server 200.
In operation 404, the server 200 stores the time and location data received from the mobile camera 300, in its data store 230. The server 200 also stores data indicating which mobile camera 300 provided the time and location data. The data is stored in such a way that it is searchable by the server 200 based on one or both of location and time. For example, the data store 230 may comprise a database configured to store the time and location data. The database may also be configured to store the live video data streams that were received in operation 402.
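Purely by way of illustration, a minimal searchable data store 230 could be built on a relational table indexed by time and position; the schema below is an assumption, and any database technology could be used in practice.

```python
import sqlite3

def init_store(path: str = "datastore.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS stream_locations (
               camera_id   TEXT NOT NULL,
               recorded_at REAL NOT NULL,   -- Unix timestamp
               latitude    REAL NOT NULL,
               longitude   REAL NOT NULL
           )"""
    )
    # Index on time and position so searches by time and location stay fast.
    conn.execute(
        "CREATE INDEX IF NOT EXISTS idx_time_loc "
        "ON stream_locations (recorded_at, latitude, longitude)"
    )
    return conn

def record_position(conn, camera_id, recorded_at, lat, lon):
    """Store one time/location sample together with the camera that sent it."""
    conn.execute(
        "INSERT INTO stream_locations VALUES (?, ?, ?, ?)",
        (camera_id, recorded_at, lat, lon),
    )
    conn.commit()
```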
In operation 406, the server 200 receives a query 110. The query 110 specifies a time and a location. The time may be expressed as a specific time, multiple specific times, a period of time, or multiple periods of time. The time may optionally comprise a date. A time may be specified as a number of hours and minutes (and, optionally, seconds) on a particular date. It will be appreciated that other suitable methods for specifying a time may be used, such as a Unix™ time stamp. The location may be expressed as a specific location, multiple specific locations, a distance range around one or more specific locations, or may specify an area such as, for example, a street, a neighbourhood, a postcode, a range of geographic coordinates, etc. A location may be specified by the geographic coordinates of that location, e.g. GNSS coordinates.
More complex queries are also possible. In one example, a query can be formulated to request recordings within an area defined by a travel time, a speed of travel, a starting location and a starting time (e.g. "show me all recordings reachable by travelling for W minutes at X kilometres per hour, when starting from location Y at time Z"). A query of this form can be used to track persons involved in an incident (e.g. a perpetrator, a victim and/or a witness) by retrieving video footage from cameras located at any point along all of their possible routes away from the incident. As another example, a query may be formulated to request recordings at a particular location at a future time. This type of query can be used to schedule recordings, by causing mobile cameras 300 that would otherwise be inactive to begin recording and/or by causing mobile cameras 300 to record data in a different manner (e.g. at a higher resolution and/or by storing video data files in the file store 360) if they are in that particular location.
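As an illustrative sketch, the reachable-area query above reduces to a centre point, a radius derived from the travel time and speed, and a time window; the class and field names are assumptions introduced only for illustration.

```python
from dataclasses import dataclass

@dataclass
class ReachableAreaQuery:
    start_lat: float
    start_lon: float
    start_time: float      # Unix timestamp of the incident (time Z)
    travel_minutes: float  # W in the example above
    speed_kmh: float       # X in the example above

    def radius_km(self) -> float:
        # Maximum distance reachable from the start location within the travel budget.
        return self.speed_kmh * (self.travel_minutes / 60.0)

    def time_window(self) -> tuple[float, float]:
        # Recordings of interest fall between the incident time and the time
        # by which the travel budget is exhausted.
        return (self.start_time, self.start_time + self.travel_minutes * 60.0)
```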
In operation 408, the server 200 searches the data store 230 for data satisfying the time and location of the query 110. In order to determine whether time and location data satisfies the query 110, the server 200 may compare the stored data to the specified time and location using ranges and/or limits set by the query 110. Additionally or alternatively, the server 200 may use a predetermined range around the specified time and location to determine whether or not the time and location data are considered to satisfy the query 110. For example, the predetermined range may be 5, 10, 20, or 30 seconds around a specified time and/or 10, 20, 50, or 100 metres around a specified location. The predetermined range may be set during configuration of the server 200, may be set by a user of the server 200, or may be included as part of a received query 110.
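A minimal sketch of such a range test is shown below, using a great-circle (haversine) distance for the location margin; the default margins of 30 seconds and 100 metres are merely example values drawn from the ranges mentioned above.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def satisfies_query(entry, query, time_margin_s=30.0, distance_margin_m=100.0):
    """True if a stored (time, lat, lon) entry falls within the predetermined
    ranges around the query's specified time and location."""
    time_ok = abs(entry["recorded_at"] - query["time"]) <= time_margin_s
    dist_ok = haversine_m(entry["latitude"], entry["longitude"],
                          query["latitude"], query["longitude"]) <= distance_margin_m
    return time_ok and dist_ok
```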
In operation 410, for a set of time and location data satisfying the query 110, the server 200 identifies one or more mobile cameras 300 that recorded a video stream at the time and location data that satisfied the query 110. Information identifying the mobile camera 300, for example a camera reference or other unique identifier, may be stored in the data store 230 with the time and location data. Information identifying the mobile camera 300 that recorded the video stream may be saved separately from the time and location data itself. Alternatively or additionally, the camera 300 that recorded the video stream may be determined from the time and location data itself, for example by being encoded in the data structure (e.g. using a particular file format) in which this time and location is stored. The data structure may contain metadata identifying the source of the data, which may be an identifier of the mobile camera 300 that recorded the data. Alternatively or additionally, the mobile camera 300 that recorded a video stream may be determined from the video stream, for example from metadata contained in the video stream, wherein that metadata identifies the camera.
In operation 412, the server 200 sends one or more instructions to the camera 300 identified at operation 410 for controlling the identified camera 300. The nature of the instructions may depend, for example, on the time of the data satisfying the query 110, the nature of query 110, or one or more properties of identified camera 300. If more than one mobile camera 300 was identified at operation 410, the server 200 may send one or more instructions to each identified camera 300.
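Purely by way of illustration, the instructions could be carried as small structured messages such as the following; the instruction names and the wire format are assumptions, as the disclosure leaves both open.

```python
import json

# Hypothetical instruction names; the exact instruction set is not fixed by the disclosure.
INSTRUCTIONS = {
    "TRANSMIT_HIGH_RES",   # send video data at the second (higher) resolution
    "STORE_TO_FILE",       # store recorded video in a video data file
    "TRANSMIT_FILE",       # upload a stored video data file
    "START_LIVE_STREAM",   # begin transmitting a live video data stream
}

def build_instruction(camera_id: str, name: str, **params) -> bytes:
    """Serialise one control instruction for the identified camera."""
    if name not in INSTRUCTIONS:
        raise ValueError(f"unknown instruction: {name}")
    return json.dumps({"camera_id": camera_id, "instruction": name,
                       "params": params}).encode("utf-8")

# Example usage (hypothetical camera identifier and times):
# build_instruction("cam-17", "TRANSMIT_FILE", start_time=1540000000, end_time=1540000060)
```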
As part of the methods disclosed herein, the server 200 may store video data received in the live video data stream in the data store 230. An advantage of storing video data at the server 200 is that the server 200 is able to provide video data in response to a query 110 directly, without requesting further data from a mobile camera 300. The stored video data may be linked to (or, in other words, associated with) the time and location data stored in operation 404. The video data may be stored in the format in which it was received, or may be converted to another format and/or another resolution for storage. Example formats for video data include MP4 or MPEG-TS formats. A query 110 may comprise a request for video data related to a time and location specified in the query 110. In response to a query 110, the server 200 may identify stored video data that was part of a live video data stream linked to a time and location satisfying the query 110. The identified video data may be only a portion of a video data stream spanning a period of time, for example a portion of stored video data corresponding to a period of time within a live video data stream covering a longer time period. The server 200 may output the identified stored video data as part of a response 120 to the query 110. Outputting stored video data may comprise sending a video data file comprising a copy of the stored video data as part of a response 120 to the query 110.
Alternatively or additionally, outputting stored video data may comprise providing information (e.g., a uniform resource locator and/or login data) in the response 120, whereby the information in the response enables a receiver of the response 120 to access the stored video data in the data store 230.
The server 200 is able to control an identified mobile camera 300, which may be in response to a query 110, by sending one or more instructions to the camera 300. These instructions may include an instruction to transmit video data to the server 200, wherein the video data has a second resolution that is higher than the resolution of the live video data stream. The resolution of the live video data stream may be referred to as a first resolution, and may be set by the mobile camera 300 to be sufficiently low so that live streaming over the communication link 500 is possible. Different live video data streams may have different resolutions. A mobile camera 300 may for example record video data at a resolution determined by the imaging hardware 370. This resolution may be equal to or higher than the second resolution mentioned above, for example 1080p resolution, ultra-HD resolution, or 4K resolution. The camera 300 may convert the video data to a lower resolution, for example 340 x 480, 600 x 800, or 720p resolutions. The mobile camera 300 may use the first resolution (i.e. the lower resolution) to transmit a live video data stream to the server 200 over the communication link 500. The server 200 may receive second resolution video data in response to the instruction sent to the camera 300. An advantage of the server 200 having the ability to control the mobile camera 300 to provide video data at a second (i.e. higher) resolution is that the response 120 to query 110 can provide more detailed video of an event of interest to a party making the query 110, without having to transmit all second resolution data to the server 200. The quantity of data transmitted over the communication link 500 can thus be reduced (which may, in turn, reduce the costs of transmitting data), and the quantity of storage space required at the server 200 may also be reduced. Only video data of interest for a query 110 is transmitted over link 500 and potentially stored at server 200.
The one or more instructions sent to the identified mobile camera 300 may comprise an instruction to store video data in a file. The instruction may specify the resolution at which the video data is to be stored, or the resolution may be the default resolution at which camera 300 stores recorded video data. The instruction may specify one or more periods of time and/or one or more locations for which corresponding video data is to be stored in the video data file. The video data file may be stored in the file store 360, where it may be protected from being overwritten by newly recorded data (unless overwriting is specifically instructed). The mobile camera 300 may generate a reference to the video data file that can be used to identify the video data file, and link it to the instruction. The reference to the created video data file may be transmitted and provided to the server 200. Time and location data corresponding to the video data may further be stored or otherwise linked to (i.e. associated with) the stored video data file.
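An illustrative camera-side sketch of this instruction is shown below; the `segments_between` helper, the file naming and the use of a generated reference are assumptions, not part of the disclosure.

```python
import os
import uuid

def store_to_file(loop_buffer, file_store_dir: str, start: float, end: float) -> str:
    """Copy the requested portion of the loop buffer into a protected video
    data file in the file store and return a reference that can be reported
    back to the server.

    `loop_buffer.segments_between(start, end)` is assumed to yield the raw
    bytes of segments recorded within [start, end]; segment selection and
    format conversion are simplified away in this sketch."""
    os.makedirs(file_store_dir, exist_ok=True)
    reference = uuid.uuid4().hex                     # file reference sent to the server
    path = os.path.join(file_store_dir, f"{reference}.mp4")
    with open(path, "wb") as f:
        for segment in loop_buffer.segments_between(start, end):
            f.write(segment)
    return reference
```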
The one or more instructions sent to the identified mobile camera 300 may comprise an instruction to transmit one or more video data files to the server 200. The instruction may specify a time and/or a location, or a period/range of one or both of these, to identify one or more video data files to be sent. The instruction may comprise a reference to a video data file to be sent. In response to the instruction, the camera 300 may identify the requested video data file(s), and transmit them over the communication link 500 to the server 200.
The server 200 may determine that an identified camera 300, corresponding to time and location data satisfying query 110, is not currently providing a live video data stream. The server 200 may send an instruction to the identified camera 300, to cause the identified camera 300 to initiate a live video data stream to the server 200. Camera 300 receiving the instruction may be recording, but not transmitting, video data. In response to the instruction, the camera 300 may begin a transmission of a live video data stream to the server 200. Alternatively, the camera 300 may not be recording video data, in which case the instruction can initiate a recording of video data as well as begin a transmission of a live video data stream to the server 200.
In an example query 110, the time and location specified in query 110 may be satisfied by a current or recent time and location of an identified camera 300. In this instance, recent can be taken to mean within a period of time for which recorded imaging data is still stored in the memory 320 of the camera 300, for example in the loop buffer 330. If the current time satisfies query 110, the server may send an instruction to the camera 300 to send video data at a second resolution alongside (i.e. at substantially the same time as) the live video data stream. The second resolution may be higher than the first resolution of the live stream, and may not be suitable to be sent as a live stream. Therefore, the camera 300 may implement the instruction by storing the second resolution video data in a file, and transmitting the video data file to server 200 at a rate which may be slower than a live stream. The instruction may be executed by the camera 300 by storing at least a portion of the loop buffer 330 data in a video data file at the second resolution, and sending the stored video data file over the communication link 500. Once the video data is sent, the corresponding video data file at the camera 300 may be deleted or overwritten. The video data file may be separate from the loop buffer 330, and may for example be stored in the file store 360 of the camera 300. Alternatively, the video data may also be sent straight from the loop buffer 330 to the server 200.
If the current time and location satisfy the query 110, the camera may send second resolution video data to the server 200 for a predetermined period of time (which may be specified in the instruction received from the server 200), or may send second resolution video data until an instruction is received to stop doing so. In cases where a recent time and location satisfy the query 110, the server 200 may send an instruction to the identified camera 300 to send second resolution video data corresponding to the time and location. The mobile camera 300 may respond to this request by retrieving the video data from the loop buffer 330, storing the video data in a file, and transmitting the file. If the video data corresponding to that time is no longer stored in the loop buffer 330 or elsewhere in the memory 320 of the camera 300, the camera 300 may notify the server 200 that the requested video data is no longer available.
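The camera-side handling described in the two preceding paragraphs might look as follows; every method on the hypothetical `camera` object is an assumption introduced only for illustration.

```python
def handle_high_res_request(camera, start: float, end: float):
    """Serve a second-resolution request from the loop buffer if the footage
    is still retained, otherwise notify the server that it is unavailable.

    `camera` is a hypothetical object wrapping the loop buffer, the file
    store and the uplink to the server."""
    if camera.loop_buffer.oldest_timestamp() > start:
        # Requested period has already been overwritten in the loop buffer.
        camera.uplink.send_status("footage_unavailable", start=start, end=end)
        return
    reference = camera.save_to_file_store(start, end, resolution="second")
    camera.uplink.send_file(reference)   # slower-than-live upload of the file
    camera.delete_file(reference)        # optional clean-up once the file is sent
```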
Figure 5 is a flow diagram of a method for controlling a mobile camera 300. The method is performed at a mobile camera 300. The method involves the mobile camera 300 transmitting 502 a live video data stream to the server 200, which may be done over the communication link 500. The camera further transmits 504 to the server 200 data indicative of a time and a location at which a live video data stream was recorded.
The data may, optionally, be sent separately to the live video data stream itself. The time and location data sent by the camera 300 may also include further data indicating the identity of the camera that recorded this data. The camera 300 may receive 506 an instruction from server 200 as set out above, and may perform 508 one or more operations to control the camera 300 in response to receipt of the instruction from the server 200.
The mobile camera 300 may comprise one or more sensors for measuring properties relating to the camera 300. A location sensor may be used to determine the location of the mobile camera 300, stored as location data. This location data may be linked to the time at which the location was determined (that is to say, the location data is timestamped), and provided to the server 200. The mobile camera 300 may comprise a gyroscopic sensor, which may be used for determining orientation of the mobile camera 300, stored as orientation data. This orientation data may be provided to the server 200, and may be used to determine what can be seen in imaging data (such as video data) captured by camera 300. Mobile camera 300 may comprise an accelerometer, which may be used to determine acceleration of the mobile camera 300. Mobile camera 300 may further comprise one or more of a thermometer to measure temperature, or a barometer to measure pressure. Mobile camera 300 may further comprise an infrared sensor to measure activity in the vicinity of the camera (for example, to detect an approaching person or vehicle).
One or more of the sensors 380 of the mobile camera 300 may be used for detection and identification of potential incidents. Potential incidents may be identified through detection of events. Examples of events include a sudden acceleration, a sudden deceleration, or a change in angular direction, which may be detected by an accelerometer or a gyroscope. These examples of events may indicate an incident involving an impact affecting movement of mobile camera 300 (e.g. a collision involving a vehicle in which the mobile camera 300 is located). An infrared sensor may be used to detect an approaching individual or vehicle, whose actions and/or movements may be of interest, and may constitute an event. If an event is detected by a sensor 380, this may trigger the camera 300 to begin recording video data. If the camera 300 is already recording during detection of an event, the camera may store the video data recorded in the period of time around this moment in a separate video data file to avoid it being overwritten or deleted by newly recorded footage. Mobile camera 300 may receive and/or store information to allow it to determine when an event detected by a sensor is a notable event that warrants recording and/or storing of video data. The mobile camera 300 may be configured to notify the server 200 if an event is detected. In some implementations, the mobile camera 300 may contact the server only if certain events occur, if more than a threshold number of events occur, or if more than a threshold number of events occur within a predetermined period of time.
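Purely by way of illustration, a simple impact check over accelerometer samples could look like this; the 25 m/s² threshold (roughly 2.5 g) is an example value chosen for the sketch, not one taken from the disclosure.

```python
def detect_impact(accel_samples, threshold_ms2: float = 25.0) -> bool:
    """Flag a potential incident when the magnitude of acceleration exceeds
    a threshold, e.g. a sudden deceleration caused by a collision.

    `accel_samples` is an iterable of (ax, ay, az) readings in m/s^2."""
    for ax, ay, az in accel_samples:
        magnitude = (ax * ax + ay * ay + az * az) ** 0.5
        if magnitude > threshold_ms2:
            return True
    return False
```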
The mobile camera 300 may send video data to the server 200 continuously, as a live video data stream. Alternatively, the mobile camera 300 may send video data to the server 200 in response to the detection of an event by a sensor 380 of the mobile camera 300 or in response to an instruction received over the communication link 500 from the server 200. Alternatively or additionally, the mobile camera 300 may send imaging data in the form of a still image, which may be sent periodically. For example, when a mobile camera is not recording video data, it may record a still image periodically, e.g. every 5, 10 or 30 minutes, and provide the still image to the server with corresponding time and location data. This allows the server 200 to remain aware of the presence of mobile camera 300, even when the camera is not sending a live video data stream.
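A minimal sketch of such a periodic still-image report is shown below; the 10-minute default interval matches one of the example periods above, and the method names on the `camera` and `server` objects are assumptions.

```python
import time

def heartbeat_loop(camera, server, interval_s: int = 600):
    """While no live stream is running, periodically capture a still image
    and report it with time and location, so the server stays aware of the
    camera's presence."""
    while not camera.is_streaming():
        still = camera.capture_still()
        lat, lon = camera.location()
        server.upload_still(camera.id, still, time.time(), lat, lon)
        time.sleep(interval_s)
```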
The mobile camera 300 may be located in a vehicle. Specifically, the mobile camera 300 may be a dash cam. The mobile camera 300 may be powered by an external source connected to the vehicle. Alternatively or additionally, the mobile camera 300 may be powered by an internal battery, which may be rechargeable. For example, a mobile camera 300 located in a vehicle may be powered by the power supply of the vehicle while the vehicle engine is turned on, and may be powered by an internal battery while the vehicle engine is turned off. The internal battery may be charged by the vehicle while the engine is on and/or may be charged separately from the vehicle.
The mobile camera 300 may be connected to the server 200 at least in part by a wireless link forming a communication link 500. Such a link may for example comprise Wi-Fi™ (IEEE 802.11), GSM, GPRS, 3G, LTE, and/or 5G connectivity. For a mobile camera 300 located in a vehicle, the communication link may also comprise a wired connection, for example an Ethernet (IEEE 802.3) connection, to a connectivity hub of the vehicle. The vehicle connectivity hub may then provide a communication link 500 to the server 200, for example using Wi-Fi™ (IEEE 802.11), GSM, GPRS, 3G, LTE, 5G, or other connectivity channels. The mobile camera 300 may comprise both wired and wireless connections, so that it can form a communication link 500 independently or make use of other available connections.
In some instances, the communication link 500 might fail. For example, the communication link 500 might be unavailable in an area where there is no wireless network connectivity, or due to a hardware failure. In such a case, the mobile camera 300 might be notified internally of an unsuccessful data transmission and may save the first resolution live video data stream data in the loop buffer 330 and/or video data files in the mobile camera memory 320, for sending upon restoration of the communication link 500. The mobile camera 300 may reduce the amount of second (i.e. higher) resolution video data stored in order to increase the amount of first (i.e. lower) resolution data stored by mobile camera 300, to avoid video data loss for a period of time. Mobile camera 300 may continue to timestamp and geotag video data at the time of recording, for transmitting to the server 200 upon restoration of the communication link 500. If the communication link 500 fails for a prolonged period of time, and the mobile camera 300 runs out of memory for storing video data, older video data may be overwritten, and may be lost.
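Purely by way of illustration, buffering outgoing data while the communication link 500 is down could be handled as follows; the class name and the `send` callback are assumptions, and the bounded queue mirrors the loss of the oldest data when storage runs out.

```python
from collections import deque

class BufferedUplink:
    """Queue outgoing, timestamped and geotagged packets while the
    communication link 500 is down and flush them once it is restored.

    `send` is a hypothetical low-level transmit callable that raises
    ConnectionError when the link is unavailable."""

    def __init__(self, send, max_pending: int = 1000):
        self._send = send
        self._pending = deque(maxlen=max_pending)  # oldest packets are dropped if full

    def transmit(self, packet: bytes) -> None:
        try:
            self.flush()           # send any backlog first to preserve ordering
            self._send(packet)
        except ConnectionError:
            self._pending.append(packet)

    def flush(self) -> None:
        while self._pending:
            self._send(self._pending[0])
            self._pending.popleft()
```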
In another example instance, the communication link 500 may be present, but may be unable to transmit the desired amount of data as part of a live video data stream.
The server 200 may communicate over one or more wired or wireless communication links (including, but not necessarily limited to, the communication link 500). For example, the server 200 may communicate using Ethernet (IEEE 802.3), Wi-Fi™ (IEEE 802.11), GSM, GPRS, 3G, LTE, and/or 5G connections. The data sent over the communication links may be secured using the encryption of the respective connectivity standard. The data transferred over the connection may further be encrypted using data encryption mechanisms known in the art.
The server 200 may receive a query 110 from an authorised party. The server 200 may have (or have access to) a list of authorised parties from which it accepts and handles queries. For example, the list of authorised parties may include a law enforcement agency, such as the police. An authorised party may have secure authentication credentials, which can be added to a query 110 so that the server 200 can authenticate the query 110. The authentication credentials may further indicate to the server 200 the identity of the party presenting the query 110. The query 110 may be sent by the authorised party to the server 200 over a communication link. The communication link may be the same communication link 500 used by the server 200 to communicate with mobile cameras 300, or may be a separate communication link, for example a wired communication link. The communication link used for transmitting one or both of queries 110 and query responses 120 may be a secure communication link. Authentication credentials may be required to authorise a party. Parties which are not authorised may be prevented from presenting a query 110 to the server 200.
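An illustrative sketch of query authentication using a shared-secret signature is shown below; the disclosure does not specify a particular credential scheme, so the HMAC approach and the party list are assumptions.

```python
import hashlib
import hmac
import json

# Hypothetical shared-secret registry of authorised parties.
AUTHORISED_PARTIES = {"police-unit-1": b"shared-secret-key"}

def authenticate_query(raw_query: bytes, party_id: str, signature_hex: str) -> bool:
    """Check that the query was signed by a party on the authorised list."""
    key = AUTHORISED_PARTIES.get(party_id)
    if key is None:
        return False                      # party is not on the list of authorised parties
    expected = hmac.new(key, raw_query, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)

def handle_query(raw_query: bytes, party_id: str, signature_hex: str):
    if not authenticate_query(raw_query, party_id, signature_hex):
        raise PermissionError("query rejected: unauthenticated party")
    return json.loads(raw_query)
```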
Alternatively or additionally, a query 110 from a non-authorised party may lack authentication credentials, and the server 200 may as a result refuse to handle and respond to such a query 110.
A query 110 may further comprise information regarding the nature of the required response 120 to query 110. For example, a query 110 may specify one or more destinations, or the format of the response. Example queries may include a request to send video data corresponding to a specified time and location, a request to initiate and send a recording by a camera 300 in a particular location at a current or future time, a request to retrieve data from server 200, or a request to receive data stored on one or more mobile cameras 300. In addition to time and location data, a query 110 may specify further requirements. Data relating to the specified further requirements may be sent, to the server 200, along with time and location data by a mobile camera 300. For example, a query may specify an orientation of a camera 300 in one direction or a range of directions, which may be determined from data provided by a magnetometer.
The instantaneous orientation of the camera 300 may be sent alongside time and location data by the mobile camera 300.
A mobile camera 300 may be owned by a camera-owning party, which may or may not be an authorised party. Different mobile cameras 300 may be owned by different parties. A mobile camera 300 may have local controls which may allow the camera to be controlled by a party other than the server 200. For example, a mobile camera 300 may be controlled by the camera itself based on input and information received from one or more sensors 380, or may be controlled by a party owning and/or using the mobile camera 300. An authorised party may be separate from a party owning and/or using a mobile camera 300. A party using a mobile camera 300 may be able to control the mobile camera 300 manually. For example, a user may be able to initiate or stop a live video data stream or a video data recording.
Figure 6 illustrates a flow chart of an example query 110 and response 120 handled by a server 200. Similarly to Figure 4, in operation 602, the server 200 receives a plurality of live video data streams from a plurality of mobile cameras, wherein each live video data stream is timestamped and geotagged, thereby providing time and location data. In operations 604 and 606, the server 200 stores the time and location data, and the video data of the live video data streams, respectively, in a searchable data store 230. In operation 608, the server 200 receives a query 110 from an authorised party, the query 110 specifying a time and location of interest. In operation 610, in response to receipt of the query 110, the server 200 searches the data store 230 for time and location data corresponding to the specified time and location, and identifies all entries in the data store 230 that have time and location data falling within a predetermined range around the specified time and location data of the query 110. In operation 612, for each identified entry of time and location data, the server 200 transmits the corresponding video data as part of a response 120 to the query 110. In operation 614, the server 200 identifies a mobile camera 300 responsible for obtaining the video data satisfying the query 110. The server 200 may determine that the identified mobile camera 300 has access to further information relevant to the query 110; for example, the identified mobile camera 300 may hold video data at a higher resolution than the video data stored in the data store 230. In operation 616, the server 200 sends instructions to the identified mobile camera 300 to control the mobile camera 300 to transmit the further relevant information to the server 200. In the above example, this may involve the mobile camera 300 storing, in a video data file, the relevant video data at a higher resolution than the streamed video data. The video data file may then be transmitted to the server 200 over a communication link 500, along with corresponding time and location data (for example, by timestamping and geotagging the video data). In operation 618, the server 200 receives the further relevant information from the mobile camera 300, which in this example is the further video data. In operation 620, the further received video data is stored by the server 200 in the data store 230. Based on its corresponding time and location data, this further video data may be linked to other data corresponding to the same time and location, for example the video data of the live video data stream and its corresponding time and location data. In operation 622, the server 200 sends the further relevant data, as part of a response 120 to the query 110, to the authorised party and/or to a destination specified in the query 110.
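A condensed sketch of the server-side handling of Figure 6 is given below, covering the search of the data store (operation 610), the transmission of matching video data (operation 612), the identification of the responsible cameras (operation 614) and the sending of instructions to them (operation 616). The in-memory store, tolerance values and function names are illustrative assumptions; operations 618 to 622 (receiving, storing and forwarding the further video data) are omitted for brevity.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class StreamEntry:
    camera_id: str
    timestamp: datetime
    latitude: float
    longitude: float
    video_ref: str          # reference to the stored streamed video data

# Illustrative in-memory stand-in for the searchable data store 230.
DATA_STORE: list[StreamEntry] = []

# Predetermined ranges around the queried time and location; values are assumptions.
TIME_TOLERANCE = timedelta(minutes=5)
DISTANCE_TOLERANCE_DEG = 0.001

def send_instruction(camera_id: str, action: str, **params) -> None:
    # Placeholder for sending a control instruction over communication link 500.
    print(f"instructing {camera_id}: {action} {params}")

def handle_query(q_time: datetime, q_lat: float, q_lon: float) -> list[str]:
    """Search the store for entries near the queried time/location, collect
    their video data, and instruct the cameras that produced them to upload
    further (e.g. higher-resolution) footage."""
    matched_video: list[str] = []
    cameras_to_instruct: set[str] = set()
    for entry in DATA_STORE:
        if (abs(entry.timestamp - q_time) <= TIME_TOLERANCE
                and abs(entry.latitude - q_lat) <= DISTANCE_TOLERANCE_DEG
                and abs(entry.longitude - q_lon) <= DISTANCE_TOLERANCE_DEG):
            matched_video.append(entry.video_ref)      # part of response 120
            cameras_to_instruct.add(entry.camera_id)   # operation 614
    for camera_id in cameras_to_instruct:
        send_instruction(camera_id, "upload_high_resolution",
                         when=q_time, where=(q_lat, q_lon))  # operation 616
    return matched_video
```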
The above paragraphs have described methods using a network of mobile cameras 300. It should be appreciated that, in addition to mobile cameras 300, the network may also comprise one or more stationary cameras, for example one or more pan-tilt-zoom cameras. A stationary camera can be understood to be a camera that remains substantially in the same location during normal use, but may be moveable, for example so that it can be brought from one location to another.
The methods disclosed herein can be performed by instructions stored on a processor- readable medium. The processor-readable medium may be: a read-only memory (including a PROM, EPROM or EEPROM); random access memory; a flash memory; an electrical, electromagnetic or optical signal; a magnetic, optical or magneto-optical storage medium; one or more registers of a processor; or any other type of processor-readable medium. In alternative embodiments, the present disclosure can be implemented as control logic in hardware, firmware, software or any combination thereof. The apparatuses disclosed herein may be implemented by dedicated hardware, such as one or more application-specific integrated circuits (ASICs) or appropriately connected discrete logic gates. A suitable hardware description language can be used to implement the methods described herein with dedicated hardware.
It will be appreciated by the person skilled in the art that various modifications may be made to the above-described embodiments without departing from the scope of the invention as defined in the appended claims. Features described in relation to the various embodiments above may be combined to form further embodiments that are also within the scope of the invention.
Claims (22)
- CLAIMS: 1. A method of controlling a mobile camera, the method being performed at a server and comprising: receiving a plurality of live video data streams from one or more mobile cameras; storing data indicative of a time and a location at which each live video data stream was recorded; receiving a query for video data, the query specifying a time and a location; identifying a mobile camera that recorded a video data stream at a time and a location satisfying the query; and sending one or more instructions to the identified mobile camera, the instructions being configured to control the identified mobile camera.
- 2. A method in accordance with claim 1, the method further comprising: storing the received plurality of live video data streams, wherein each of the stored video data streams is associated with the data indicative of the time and location at which it was recorded; in response to receiving the query for video data, identifying at least one stored video data stream recorded at a time and a location satisfying the query; and outputting the identified stored video data stream.
- 3. A method in accordance with any of the preceding claims, wherein the query specifying time and location specifies one or more of a period of time and a range of locations.
- 4. A method in accordance with any of the preceding claims, wherein the one or more instructions comprise an instruction configured to cause the identified mobile camera to transmit video data at a second resolution, the second resolution being higher than a resolution of a live video data stream received from the identified mobile camera, and wherein the method further comprises receiving, in response to the instruction, video data at the second resolution from the identified mobile camera.
- 5. A method in accordance with any of the preceding claims, wherein the one or more instructions comprise an instruction configured to cause the identified mobile camera to store video data recorded by that camera in a video data file.
- 6. A method in accordance with claim 5, wherein the video data stored in the video data file is associated with the time specified in the query.
- 7. A method according to any of the preceding claims, wherein the one or more instructions comprise an instruction configured to cause the identified mobile camera to transmit a video data file to the server, and wherein the method further comprises receiving, in response to the instruction, the video data file from the identified mobile camera.
- 8. A method in accordance with any of the preceding claims, further comprising determining that the identified mobile camera is not currently transmitting a live video data stream, and wherein the one or more instructions comprise an instruction configured to cause the identified mobile camera to begin transmitting a live video data stream to the server.
- 9. A method in accordance with any of the preceding claims, comprising: identifying a plurality of mobile cameras, and sending an instruction to each of the identified mobile cameras to control the identified mobile cameras.
- 10. A method in accordance with any of the preceding claims, wherein identifying a mobile camera comprises: determining stored time and location data corresponding to the specified time and location of the query; and determining one or more mobile cameras corresponding to the determined time and location data.
- 11. A method in accordance with claim 10, wherein determining the time and location data comprises identifying time and location data within a predetermined range around the specified time and location data of the query.
- 12. A method of controlling a mobile camera, the method being performed at the mobile camera and comprising: transmitting a live video data stream to a server; transmitting, to the server, data indicative of a time and a location at which the live video stream was recorded; receiving an instruction from the server, the instruction being configured to control the mobile camera; and in response to receiving the instruction, performing one or more operations in accordance with the received instruction to control the mobile camera.
- 13. A method in accordance with claim 12, further comprising: detecting an event using one or more camera sensors; and in response to detection of the event, performing one or more operations to control the camera.
- 14. A method in accordance with any of claims 12 or 13 wherein the operations comprise one or more of: initiating a recording of video data; and/or transmitting a live video data stream at a first resolution; and/or transmitting video data at a second resolution, wherein the second resolution is higher than the first resolution of the live video data stream; and/or storing video data in a file; and/or transmitting a video data file to the server.
- 15. A method of providing video data, the method being performed at a server and comprising: receiving a plurality of live video data streams from one or more mobile cameras, wherein each of the live video data streams is associated with data indicative of a location at which it was recorded; receiving a query for video data, the query specifying a location; identifying at least one live video data stream recorded at a location satisfying the query; and outputting the identified video data stream.
- 16. A method in accordance with claim 15, wherein: each of the stored video data streams is associated with the data indicative of the time at which it was recorded; the received query further specifies a time; and identifying at least one live video data stream comprises identifying a live video data stream recorded at a time and a location satisfying the query.
- 17. A method in accordance with claim 15 or claim 16, wherein outputting the identified video data stream comprises outputting a live video data stream.
- 18. A method in accordance with any of claims 15 to 17, the method further comprising storing the received plurality of live video data streams, and wherein outputting the identified video data stream comprises outputting a stored video data stream.
- 19. A method in accordance with any of claims 15 to 18, wherein the query specifies one or more of a period of time and a range of locations.
- 20. An apparatus configured to perform a method in accordance with any of the preceding claims.
- 21. A processor-readable medium comprising instructions which, when executed by a processor, cause the processor to perform a method in accordance with any of claims 1 to 19.
- 22. A computer program product comprising instructions which, when executed by a computer, cause the computer to perform a method in accordance with any of claims 1 to 19.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1817173.6A GB2578311A (en) | 2018-10-22 | 2018-10-22 | Method and apparatus for controlling a mobile camera |
GBGB1901720.1A GB201901720D0 (en) | 2018-10-22 | 2019-02-07 | Method and apparatus for controlling a mobile camera |
PCT/GB2019/052995 WO2020084288A1 (en) | 2018-10-22 | 2019-10-21 | Method and apparatus for controlling a mobile camera |
GB2107356.4A GB2593368B (en) | 2018-10-22 | 2019-10-21 | Method and apparatus for controlling a mobile camera |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1817173.6A GB2578311A (en) | 2018-10-22 | 2018-10-22 | Method and apparatus for controlling a mobile camera |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201817173D0 GB201817173D0 (en) | 2018-12-05 |
GB2578311A true GB2578311A (en) | 2020-05-06 |
Family ID: 64453740
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1817173.6A Withdrawn GB2578311A (en) | 2018-10-22 | 2018-10-22 | Method and apparatus for controlling a mobile camera |
GBGB1901720.1A Ceased GB201901720D0 (en) | 2018-10-22 | 2019-02-07 | Method and apparatus for controlling a mobile camera |
GB2107356.4A Active GB2593368B (en) | 2018-10-22 | 2019-10-21 | Method and apparatus for controlling a mobile camera |
Country Status (2)
Country | Link |
---|---|
GB (3) | GB2578311A (en) |
WO (1) | WO2020084288A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1675377A2 (en) * | 2004-12-27 | 2006-06-28 | Kyocera Corporation | Mobile camera system |
US20130039542A1 (en) * | 2009-04-28 | 2013-02-14 | Whp Workflow Solutions, Llc | Situational awareness |
US20170021875A1 (en) * | 2015-07-22 | 2017-01-26 | Pittasoft Co., Ltd. | Image recording device for vehicle with dual wi-fi access function and image sharing system using the same |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9094791B2 (en) * | 2014-01-01 | 2015-07-28 | Ouri Shifman | Method for providing on-demand digital representations to mobile devices in other geographic locations |
US20170019581A1 (en) * | 2015-07-16 | 2017-01-19 | Symbol Technologies, Llc | Arrangement for, and method of, reducing server processing requirements of a host server |
US10204159B2 (en) * | 2015-08-21 | 2019-02-12 | Trimble Navigation Limited | On-demand system and method for retrieving video from a commercial vehicle |
GB2577689B (en) * | 2018-10-01 | 2023-03-22 | Digital Barriers Services Ltd | Video surveillance and object recognition |
Application events:
- 2018-10-22: GB application GB1817173.6A (publication GB2578311A), status: withdrawn
- 2019-02-07: GB application GBGB1901720.1A (publication GB201901720D0), status: ceased
- 2019-10-21: GB application GB2107356.4A (publication GB2593368B), status: active
- 2019-10-21: WO application PCT/GB2019/052995 (publication WO2020084288A1), status: application filing
Also Published As
Publication number | Publication date |
---|---|
GB201817173D0 (en) | 2018-12-05 |
GB202107356D0 (en) | 2021-07-07 |
GB2593368B (en) | 2023-03-01 |
WO2020084288A1 (en) | 2020-04-30 |
GB2593368A (en) | 2021-09-22 |
GB201901720D0 (en) | 2019-03-27 |
Legal Events
Code | Title
---|---
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)