GB2456871A - Video Location Annotation System - Google Patents
- Publication number
- GB2456871A (application GB0801549A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- video recording
- location
- video
- annotation system
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
- Television Signal Processing For Recording (AREA)
Abstract
Time and location information from a navigation system is stored and combined with timing information associated with a video recording to produce an overlay file that can be used to annotate the video recording with information about the location where each part of the video recording was made. The position determination may be via a Global Positioning Satellite system (GPS), radar or sonar, radio direction finding, a hyperbolic navigation system, or an inertial navigation system. External databases may be queried with the stored position/location data. The positions/locations may be linked to images, maps or aerial photographs. The linked information may be provided as sub images within the video image.
Description
Video Location Annotation System This invention relates to annotating video material by including location and associated information in it.
The availability of accurate location information from Global Positioning Satellites (GPS) has enabled many novel services including the ability to add location information to still photographs. This may be done by connecting a GPS receiver to a camera (such as the Nikon D200) or including the GPS receiver within the camera itself (such as the Ricoh 500SE). In these cases, the location of the camera is included in the photograph as hidden metadata at the moment the shutter release is pressed.
An alternative approach is taken by using devices such as the Sony GPS-CS1 which maintain a log of time and location of the photographer. The camera records only the current time when the shutter is pressed. The location relating to the time the photograph was taken is calculated after the event by examining the GPS log and finding the location closest to the time the photo was taken and inserting this location information into the photograph after the event.
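The log-matching step described above can be sketched in a few lines. A minimal example, assuming the track is stored as (unix_time, latitude, longitude) tuples sorted by time (the coordinates are illustrative, not taken from any real device):

```python
import bisect

# Illustrative GPS track log: (unix_time, latitude, longitude), sorted by time.
track = [
    (1201600000, 51.5010, -0.1420),
    (1201600060, 51.5022, -0.1398),
    (1201600120, 51.5035, -0.1371),
]

def location_at(track, t):
    """Return the logged fix whose timestamp is closest to t."""
    times = [fix[0] for fix in track]
    i = bisect.bisect_left(times, t)
    # The closest fix is either the one before the insertion point or at/after it.
    candidates = track[max(0, i - 1):i + 1]
    return min(candidates, key=lambda fix: abs(fix[0] - t))
```

A photo stamped at t = 1201600050 would be matched to the fix logged at 1201600060, the nearest entry in the track.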
The present invention deals not with still photographs but rather with video material.
The present invention uses a log of time and location information from a navigation system such as GPS, together with timing information associated with video material to create a file of information which can be overlaid on the video material and will describe the location of the camera at each time.
The invention will now be described solely by way of example and with reference to the accompanying diagrams in which: Figure 1 is a schematic view of the GPS receiver and memory portion of the present invention.
Figure 2 is a schematic view of the video camera and recorder portion of the present invention.
Figure 3 is a schematic view of the process in the present invention to combine a time and location log and the timing information from a video recording to create an overlay file.
Figure 4 is a schematic view of the process in the present invention to create a new video recording by adding the information specified in the overlay file to the original video recording.
Figure 5 is a schematic view of the process in the present invention to render the original video recording with the addition of the information specified in the overlay file.
Figure 6 is a schematic view of the process in the present invention to use the information in the time and location log to query a database of images such as maps or photographs and create a new video recording from the images.
Figure 7 is a schematic view of the process in the present invention to create a new video recording by merging the video recording created from the image database to the original video recording.
Figure 8 is a schematic view of the process in the present invention to render the original video recording with the addition of the video recording created from the image database.
Figure 9 is a schematic view of the process in the present invention to use the contents of a database of reference locations to create an overlay file including location information specified relative to the reference locations.
In figure 1, a GPS receiver 1 maintains a log 2 of the time and location information received from the satellites. This is retained in a form of storage media such as semiconductor memory and can be retrieved at a later date. In an alternative embodiment, location information is derived from another source such as radar, sonar, radio direction finding, inertial navigation or a hyperbolic navigation system such as LORAN.
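A log of this kind is commonly built from the receiver's NMEA 0183 output. A minimal sketch parsing the RMC sentence (which carries time, date and position) into signed decimal degrees, following the standard RMC field layout:

```python
def parse_rmc(sentence):
    """Parse a $GPRMC sentence into (hhmmss, ddmmyy, lat, lon),
    with latitude/longitude in signed decimal degrees."""
    f = sentence.split(',')
    if not f[0].endswith('RMC') or f[2] != 'A':  # 'A' marks a valid fix
        return None
    lat = float(f[3][:2]) + float(f[3][2:]) / 60.0   # ddmm.mmmm
    if f[4] == 'S':
        lat = -lat
    lon = float(f[5][:3]) + float(f[5][3:]) / 60.0   # dddmm.mmmm
    if f[6] == 'W':
        lon = -lon
    return f[1], f[9], lat, lon
```

Appending one such parsed record per sentence, with the GPS time as the key, yields exactly the time and location log 2 of figure 1.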
In figure 2, a video camera 3 and recorder 4 record video material onto storage media such as magnetic tape, optical disc (such as DVD-Recordable) or semiconductor memory. The recorder contains a clock 5 of high accuracy, being derived for example from a piezoelectric crystal oscillator. The recording includes timing information derived from this clock, whether by reference to the start or finish time of the recording or by including "time code" information within the recording itself.
Preferably the timing information stored with the recording can be compensated to allow for inaccurate setting of the clock and inaccuracies in the clock speed.
In figure 3, at some point after the recording has been made, the time and location log 2 is compared with the timing information in the video recording 6 and an overlay file 7 is created. This overlay file contains information to be added to the video material. The information describes the location of the camera at each point. The on-screen position of each data element of the information is specified, together with the time at which it appears and the duration for which it persists. This overlay file can be in any desired format but may beneficially adopt an existing format to allow interoperability with other systems. One such format is the "SubStation Alpha" format created by Kotus.
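As an illustration of this step, a sketch that emits SubStation Alpha "Dialogue" events from a list of timed captions. The headers are trimmed to a minimum; a real file would normally also carry a [V4+ Styles] section defining the on-screen position and style referenced by each event:

```python
def ssa_time(seconds):
    """Format seconds as the SubStation Alpha h:mm:ss.cc timestamp."""
    h, rem = divmod(int(round(seconds * 100)), 360000)
    m, rem = divmod(rem, 6000)
    s, cs = divmod(rem, 100)
    return f"{h}:{m:02d}:{s:02d}.{cs:02d}"

def make_overlay(events):
    """events: iterable of (start_s, end_s, text) -> SSA file contents."""
    lines = ["[Script Info]", "ScriptType: v4.00+", "", "[Events]",
             "Format: Layer, Start, End, Style, Name, "
             "MarginL, MarginR, MarginV, Effect, Text"]
    for start, end, text in events:
        lines.append(f"Dialogue: 0,{ssa_time(start)},{ssa_time(end)},"
                     f"Default,,0,0,0,,{text}")
    return "\n".join(lines)
```

Feeding it one event per fix, with the start and end times taken from the log and the text carrying the formatted location, produces the overlay file 7.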
Preferably, the overlay file may include information interpolated between instances of the time and location information provided by the satellites. This may involve a linear interpolation or preferably a spline or polynomial interpolation which uses multiple instances of time and location information.
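Linear interpolation between two neighbouring fixes is the simplest case; a sketch (a spline or polynomial fit over several fixes would present the same interface):

```python
def interpolate(track, t):
    """Linearly interpolate (lat, lon) at time t from a time-sorted
    (time, lat, lon) track; t must lie within the logged interval."""
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return la0 + w * (la1 - la0), lo0 + w * (lo1 - lo0)
    raise ValueError("t outside the logged interval")
```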
Where absolute timing information is not available with the video recording, this can be calculated from the time and location log by selecting an event that is identifiable in both the recording and the log, and using the time from the log as a datum to determine the time at which each part of the recording was made.
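In other words, one synchronising event fixes the whole timeline. A sketch of the arithmetic, assuming a constant frame rate:

```python
def frame_unix_time(frame, fps, datum_frame, datum_unix_time):
    """Absolute time of any frame, given one 'datum' frame whose absolute
    time is known from the GPS log (an event visible in both)."""
    return datum_unix_time + (frame - datum_frame) / fps
```

For example, if the datum event appears at frame 0 of a 25 fps recording and the log times it at 1201600000, frame 250 was shot ten seconds later.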
In figure 4, the overlay file 7 and the video recording 6 are processed 8 to create a new video recording 9 with the location information overlaid on the picture. This process can be carried out with, for instance, the "VirtualDub" video processing utility (distributed from virtualdub.org) in combination with the "subtitler" filter.
An alternative embodiment shown in figure 5 has the overlay file 7 combined with the video recording 6 in real time and rendered 10 for viewing on a display 11 with the location information overlaid, but without modification of the video recording. This can be done with for instance the "ffdshow" codec in combination with a media player on the Windows platform.
In figure 6 the time and location log file 2 is used to query 12 an external database 13 of information such as photographs, graphic images, maps or aerial/satellite images such as the Terraserver-USA database maintained by Microsoft Corporation with US Geological Survey data. In the embodiment described here, the information retrieved 14 is a set of maps centred on the location of the camera. For each location referenced in the log file, or some proportion of the locations, a map image is retrieved. In one embodiment, these images are stored as a set. In an alternative embodiment, the images are combined to create a map video recording which is synchronised with the original video file.
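A common way to turn a fix into a map-image request is the Web-Mercator "slippy map" tile scheme; a sketch follows. Note this scheme is an assumption for illustration: the Terraserver-USA service named above used its own addressing, which is not reproduced here.

```python
import math

def tile_xy(lat, lon, zoom):
    """Web-Mercator tile indices (x, y) of the tile containing the fix."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r))
             / math.pi) / 2.0 * n)
    return x, y
```

Querying one tile per fix (or per selected fix) yields the retrieved image set 14.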
In figure 7 the images 14 are processed 15 to merge them with the original video recording 6 to create a new video recording 16 as a "picture in picture" with the retrieved images appearing in the video recording or the video recording appearing in the sequence of retrieved images.
This may be done as the images are retrieved from the database or afterwards. In the alternative embodiment in figure 8, the images are rendered 17 in real time as a synchronised "picture in picture" in the original recording. In a further alternative embodiment the original recording is rendered as a "picture in picture" in the image video recording.
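The geometry of such a sub-picture is a simple function of the frame size. An illustrative sketch placing a quarter-size inset in the bottom-right corner (the corner, scale and margin are arbitrary choices, not taken from the text):

```python
def pip_rect(frame_w, frame_h, scale=0.25, margin=16):
    """(x, y, w, h) of a bottom-right picture-in-picture inset."""
    w, h = int(frame_w * scale), int(frame_h * scale)
    return frame_w - w - margin, frame_h - h - margin, w, h
```

For a PAL-sized 720x576 frame this yields a 180x144 inset at (524, 416); swapping which recording is scaled down gives the "vice versa" embodiments.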
In figure 9 a database 18 of known reference locations is established. For each of the locations (or a proportion of the locations) in the time and location log file 2, a criterion such as being the closest is used to select 19 one or more reference locations. An overlay file 7 is created which gives the location of the camera relative to the reference location(s) or vice versa. For example the location might be given as "6km NW of mountain summit" or "At Turn 7". The overlay file may be configured to include these relative location(s), the absolute location, or a combination of these, either simultaneously or sequentially.
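The relative-location captions can be produced from the great-circle distance and initial bearing between each fix and its reference point. A sketch using the haversine formula on a spherical earth, with eight-point compass names:

```python
import math

def distance_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) and initial bearing (degrees true)
    from point 1 to point 2, on a spherical earth of radius 6371 km."""
    R = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

def compass_point(bearing):
    """Nearest of the eight compass points to a bearing in degrees."""
    return ["N", "NE", "E", "SE", "S", "SW", "W", "NW"][
        int((bearing + 22.5) // 45) % 8]
```

A caption such as "6km NW of mountain summit" then follows as `f"{dist:.0f}km {compass_point(bearing)} of {name}"`, taking the distance and bearing from the reference point to the camera.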
Claims (40)
- 1. A Video Location Annotation System comprising a source of stored time and location information and a video recorder creating a file containing location information to be overlaid onto a video recording.
- 2. An overlay file which specifies the time and location information and the way in which it should be overlaid onto the video recording.
- 3. A Video Location Annotation System according to Claim 1, in which the stored location information is derived from Global Positioning Satellites.
- 4. A Video Location Annotation System according to Claim 1, in which the stored location information is derived from a radar or sonar transmitter/receiver.
- 5. A Video Location Annotation System according to Claim 1, in which the stored location information is derived from radio direction finding.
- 6. A Video Location Annotation System according to Claim 1, in which the stored location information is derived from a hyperbolic navigation system.
- 7. A Video Location Annotation System according to Claim 1, in which the stored location information is derived from an inertial navigation system.
- 8. A Video Location Annotation System according to Claim 1, in which the timing information stored with the video recording can be adjusted to compensate for inaccuracies.
- 9. A Video Location Annotation System according to Claim 1, in which the timing information stored with the location information can be adjusted to compensate for inaccuracies.
- 10. A Video Location Annotation System according to Claim 1, in which an overlay file is combined with the video recording to produce a derivative video recording which carries the time and location information.
- 11. A Video Location Annotation System according to Claim 1, in which an overlay file is combined with the video recording as it is rendered so that a viewer can see the time and location information.
- 12. A Video Location Annotation System according to Claim 1, in which an external database is queried with stored location information to generate images which are related to the stored location information instances.
- 13. Images according to claim 12 in which only a proportion of the available stored location instances are queried.
- 14. Images according to claim 12 which comprise maps related to the stored location instances.
- 15. Images according to claim 12 which comprise satellite or aerial photographs related to the stored location instances.
- 16. Images according to claim 12 which comprise photographs or graphic images related to the stored location instances.
- 17. A Video Location Annotation System according to Claim 1, in which retrieved images are merged with the original video recording to produce a new video recording where the images appear as a sub-picture in the original video recording.
- 18. A Video Location Annotation System according to Claim 1, in which retrieved images are merged with the original video recording to produce a new video recording where the original video recording appears as a sub-picture in the sequence of images.
- 19. A Video Location Annotation System according to Claim 1, in which retrieved images are rendered alongside the original video recording to be viewed with the images appearing as a sub-picture in the original video recording.
- 20. A Video Location Annotation System according to Claim 1, in which retrieved images are rendered alongside the original video recording to be viewed with the original video recording appearing as a sub-picture in the sequence of images.
- 21. A Video Location Annotation System according to Claim 1, in which retrieved images are combined to generate a subsidiary video recording which is synchronised with the original recording.
- 22. A subsidiary video recording according to Claim 21, in which the subsidiary video recording uses only a proportion of the stored location instances.
- 23. A Video Location Annotation System according to Claim 1, in which a subsidiary video recording of retrieved images is merged with the original video recording to produce a new video recording where the subsidiary video recording appears as a sub-picture in the original video recording.
- 24. A Video Location Annotation System according to Claim 1, in which a subsidiary video recording of retrieved images is merged with the original video recording to produce a new video recording where the original video recording appears as a sub-picture in the subsidiary video recording.
- 25. A Video Location Annotation System according to Claim 1, in which a subsidiary video recording of retrieved images is rendered alongside the original video recording to be viewed with the original video recording appearing as a sub-picture in the subsidiary video recording.
- 26. A Video Location Annotation System according to Claim 1, in which a subsidiary video recording of retrieved images is rendered alongside the original video recording to be viewed with the subsidiary video recording appearing as a sub-picture in the original video recording.
- 27. An overlay file according to claim 2 which includes information interpolated between location information instances.
- 28. An overlay file according to Claim 27 where the interpolation is performed linearly between two location information instances.
- 29. An overlay file according to Claim 27 where a spline interpolation is performed between more than two location information instances.
- 30. An overlay file according to Claim 27 where polynomial interpolation is performed between more than two location information instances.
- 31. A Video Location Annotation System according to Claim 3, where the video recording has its timing information calculated by reference to an event that can be identified in both the video recording and the stored time and location information.
- 32. A database of known geographical reference points which are to be compared with some of the location information instances to select one or more reference points according to a specified criterion.
- 33. A database according to Claim 32 where the criterion to be a selected reference point is to be a member of the set of some number of closest locations, measured between the projections onto the surface of the earth.
- 34. A database according to Claim 32 where the criterion to be a selected reference point is to be a member of the set of some number of closest locations, measured between the projections onto the surface of the reference geoid approximating the earth.
- 35. A database according to Claim 32 where the criterion to be a selected reference point is to be a member of the set of some number of closest locations, measured as the slope distance between the reference location and the location instance.
- 36. An overlay file according to Claim 2 which contains details of the selected reference locations to be overlaid onto the video recording.
- 37. An overlay file according to Claim 36 which specifies the distance or distances and bearing or bearings from the selected reference location or locations to the location instance.
- 38. An overlay file according to Claim 36 which specifies the distance or distances and bearing or bearings from the location instance to the reference location or locations.
- 39. An overlay file according to Claim 36 where the bearing or bearings are expressed as a true bearing.
- 40. An overlay file according to Claim 36 where the bearing or bearings are expressed as a magnetic bearing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0801549A GB2456871A (en) | 2008-01-29 | 2008-01-29 | Video Location Annotation System |
Publications (2)
Publication Number | Publication Date |
---|---|
GB0801549D0 GB0801549D0 (en) | 2008-03-05 |
GB2456871A true GB2456871A (en) | 2009-08-05 |
Family
ID=39186466
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0801549A Pending GB2456871A (en) | 2008-01-29 | 2008-01-29 | Video Location Annotation System |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2456871A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020191087A1 (en) * | 1996-04-15 | 2002-12-19 | Canon Kabushiki Kaisha | Communication apparatus and method that link a network address with designated image information |
US20040114042A1 (en) * | 2002-12-12 | 2004-06-17 | International Business Machines Corporation | Systems and methods for annotating digital images |
WO2004090903A1 (en) * | 2003-04-08 | 2004-10-21 | Koninklijke Philips Electronics N.V. | A method of position stamping a photo or video clip taken with a digital camera |
WO2004110061A1 (en) * | 2003-06-03 | 2004-12-16 | Sony Corporation | Recording/reproducing system |
US6833865B1 (en) * | 1998-09-01 | 2004-12-21 | Virage, Inc. | Embedded metadata engines in digital capture devices |