GB2421653A - System for the collection and association of image and position data - Google Patents

System for the collection and association of image and position data

Info

Publication number
GB2421653A
Authority
GB
United Kingdom
Prior art keywords
image
location
data
image data
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0428383A
Other versions
GB0428383D0 (en)
Inventor
Tim Woolford
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TREK WIRELESS Ltd
Original Assignee
TREK WIRELESS Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TREK WIRELESS Ltd filed Critical TREK WIRELESS Ltd
Priority to GB0428383A priority Critical patent/GB2421653A/en
Publication of GB0428383D0 publication Critical patent/GB0428383D0/en
Publication of GB2421653A publication Critical patent/GB2421653A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3647Guidance involving output of stored or live camera images or video streams

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Navigation (AREA)

Abstract

An image collecting system 1 comprises a camera 4 operable to produce data representing images; a positioning device 6; and a processor 2, whereby the image collecting apparatus is installable with a vehicle so that the camera is directed in a generally horizontal direction so as to be able to capture images of the vehicle's surroundings whilst the vehicle progresses along a route. The processor generates plural records each comprising data representing an image and information identifying the location of the image collecting apparatus at or around the time of image data production. Image annotation is carried out by using a computer to process the image data to provide data identifying one or more objects in the image; providing data identifying the location of the one or more objects; using a database and the object location data to obtain information about the one or more objects; and annotating the image with the information about the one or more objects.

Description

Handling Image Data
Description
The present invention relates to an image collecting system, and to a method of collecting image data. The invention relates also to a method of generating records and to apparatus for generating records.
There are many ways to help a driver plan a journey, both before the start of the journey and during the journey itself. The simplest of these is a conventional map, but other more sophisticated systems are available.
On-line route planners, such as that provided by the Automobile Association (www.theaa.com), are simple to use and are usually free of charge. A driver must enter a starting point and destination for a journey. The service provides written directions, with information about landmarks that are passed, and the distance the user must travel between junctions. Although it is straightforward to obtain these directions, a driver travelling alone usually has to read the instructions whilst driving, which is difficult and can be unsafe.
In-car satellite navigation systems usually include a Global Positioning System (GPS) receiver, and a screen to display a map of the route to the driver. Spoken instructions may also be provided. Although these devices have proved useful, maps can be difficult to interpret, especially whilst driving through a busy junction with many lanes. In-car navigation systems, which are installable with a mobile phone and have external GPS receivers, have recently become available. These show maps in perspective, so can be easier to navigate. One such system is described at www.tomtom.com.
In addition to these services, there are other ways of providing a person with information concerning an area, although not primarily for planning a route.
Aerial photographs can be provided using a variety of means, from a kite, to a plane or satellite. They usually take the form of large, plan-view images of an area. The plan view images can be purchased for decorative purposes, or used for research purposes. They are of limited usefulness in planning a route, as it is more difficult to interpret plan-view images than those taken in a horizontal plane, and they are not labelled with relevant information such as road names.
Interactive maps, where a user can click on an indicated portion of a map to view an eye-level photo, in a direction indicated on the map, show points of interest in an area (see www.sublimephotography.co.uk for example). These images are captured manually, and referenced manually to a location on a map.
There are known methods of capturing images from routes and locations. One example of this is a video camera mounted in a moving vehicle. This technique is often used when shooting films, or preparing television programs. Although some indication of the location may be given, a precise location is not recorded.
The invention addresses some of the shortcomings of the prior art. The inventor has perceived a need for a new way of providing people with information about locations, particularly locations along routes, and the means for generating the information.
A first aspect of the present invention provides an image collecting system as claimed in claim 1.
The generation of records may be operator assisted, but preferably is fully automated.
Preferably, the image collecting apparatus and the processor are coupled together or integrated together.
A second aspect of the present invention provides a method of collecting image data as claimed in claim 11.
Preferably, the location data also identifies the orientation of the positioning device.
A third aspect of the present invention provides a method of generating records as claimed in claim 14.
Preferably, the location data also identifies the orientation of the positioning device.
A fourth aspect of the present invention provides apparatus for generating records as claimed in claim 19.
Preferably, the location data also includes orientation information.
The invention also provides a database of records generated using any of the first, third and fourth aspects of the invention. Such a database of images can be integrated with in-car navigation systems, geographic information systems (GIS), or with an on-line route planner. Eye-level images can be understood and compared quickly with surroundings. Therefore, the directions given on such navigation systems can be interpreted easily.
Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which: Figure 1 illustrates schematically the components of a first embodiment of an image collecting system according to the invention; Figure 2 illustrates the Figure 1 image collecting system installed in a vehicle; Figure 3 illustrates the progression of the Figure 2 vehicle along a route; Figure 4 illustrates schematically the components of a second embodiment of the image collecting system according to the invention; Figure 5 shows an image captured using the Figure 4 image collecting system, annotated with location specific information.
Figure 6a is a plan view of a car at a junction; Figure 6b shows an image taken from the vehicle in Figure 6a; Table 1 shows an example log file from a GPS receiver; and Table 2 shows an example output file from a collating program.
In the Figures, reference numerals are re-used for like elements throughout.
Referring firstly to Figure 1, the image collecting system 1 comprises a computer 2, a camera 4, and a positioning device 6. The camera 4 is connected to the computer 2 by means of a USB cable 8, though any suitable connection may be used. The positioning device 6 is connected to the computer 2 by a USB cable 10, though any suitable connection means may be used, including a wireless connection. The computer 2, the camera 4, and the positioning device 6 are connected to a power supply 11. However, any of the computer 2, the camera 4 and the positioning device 6 can be battery operated.
The camera 4 contains a first clock 12, which allows the determination of times when images are captured. The positioning device 6 includes a second clock 14.
The first and second clocks 12, 14 are not synchronised. Therefore, there may be an offset between the two clocks 12, 14. The camera 4 is set to capture images at predetermined time intervals, and transfer data representing the images via the USB cable 8 to the computer 2. The camera 4 also transfers information about the time the images were captured according to the clock 12. The positioning device 6 identifies its location at predetermined time intervals. It transfers data identifying its location, and data identifying the time the positioning device 6 was at that location according to the clock 14, to the computer 2. The transfer of both location data and images can occur in real time, or in bulk at the end of an operating session.
Figure 2 shows the image collecting system 1 installed in a vehicle 16. Only a dashboard 18 and a front screen 20 of the vehicle 16 are shown. The computer 2 is preferably in the form of a laptop, tablet or notebook computer. The computer 2 is preferably fixed to the dashboard 18. However the computer 2 can be placed in, or fixed to, any part of the vehicle 16. Alternatively, the computer 2 may be supported by an occupant of the vehicle. The camera 4 is fixed to the inside of the screen 20 by means of a suction pad 21. The camera 4 is directed to capture images of the road ahead of the vehicle 16. The camera 4 may be fixed by any suitable means to any part of the vehicle 16, including by being mounted externally, provided that unobstructed images of the vehicle's surroundings can be produced. The positioning device 6 is a GPS receiver. The GPS receiver 6 is fixed to the dashboard 18 of the vehicle 16. The GPS receiver 6 may be fixed to any part of the vehicle 16, inside or out.
Figure 3 illustrates the progression of the vehicle 16 at locations 16a, 16b and 16c, along a route. The image collecting system 1 is installed in the vehicle 16. In this example, images are produced by the camera 4 every 10 seconds. The GPS receiver 6 is arranged to determine its location, i.e. provide a location fix, once every second.
Accordingly, an image may be captured at a time not corresponding to a location fix time.
The location data fed from the GPS receiver 6 comprises latitude and longitude components. In a moving vehicle, the bearing of the vehicle at that time is also determined and forms part of the location data.
The vehicle 16a shows the position of the vehicle at a time t1 = 0 s. At this time a first image, Image1.jpg, is captured. The first image is transferred to the computer 2. The first image is stored in a memory of the computer 2. The vehicle 16b shows the position of the vehicle at a second time t2 = 10 s. At this time a second image, Image2.jpg, is captured, and transferred to the computer 2. The vehicle 16c shows the position of the vehicle at a third time t3 = 20 s. At the third time, a third image, Image3.jpg, is captured, and transferred to the computer 2. The GPS receiver 6 sends location information, corresponding to location and bearing fixes at 1 second intervals, to the computer 2.
In this way, images of the vehicle's surroundings are captured at various points along the route of the vehicle. The time interval between image captures is chosen to provide sufficient detail of the route travelled. In the above example, for a vehicle 16 travelling at a speed of 30 kph, an image is generated every 83 m. The camera 4 is also manually operable, so images may be captured at additional points where a user considers there to be a significant point of interest, such as at a road junction.
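As an illustration of the spacing arithmetic quoted above, the following minimal sketch (the function name and units are merely illustrative) reproduces the 83 m figure for a vehicle travelling at 30 kph and capturing an image every 10 seconds:

# Distance travelled between successive images, from vehicle speed and capture interval.
def image_spacing_metres(speed_kph: float, interval_s: float) -> float:
    metres_per_second = speed_kph * 1000.0 / 3600.0
    return metres_per_second * interval_s

print(image_spacing_metres(30.0, 10.0))  # approximately 83 m, as quoted above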
The time interval between location fixes may be more or less than the time interval between successive image productions. Any interval can be used for location fixes, though it is preferable that a small interval is chosen so that the location of the apparatus at the time of image production is determined accurately.
When the vehicle 16 has completed the route, the operating session ends. Then, the data is processed using the computer 2.
The computer 2 includes a memory (not shown) for storage of the images and location data, and a processor (not shown) for processing them. The computer 2 also has suitable inputs for connecting to the camera 4 and GPS receiver 6. The computer 2 also comprises a screen, to view the results of the processing, and one or more of a keyboard, a mouse, and a touch screen, for user input.
The camera 4 is a digital camera. The digital camera 4 has a resolution of at least 3 megapixels. The images captured with the camera 4 are preferably in Exif II JPEG format. The camera 4 is installed with software that enables it to communicate with the computer 2, for transferring images to it. Preferably, the camera 4 is programmable with specified filenames for the captured images. The camera 4 may be programmed with the number of images required. The camera 4 may be controlled by an application in the computer 2. The camera 4 may be mounted at any point in or on the vehicle, so long as unobstructed images of the vehicle's surroundings are captured. The first clock 12 in the camera 4 is a standard clock such as a quartz clock. The time indicated by the first clock 12 is susceptible to inaccuracy.
The GPS receiver 6 outputs data in the format specified by the NMEA 0183 Interface Standard, though any suitable format may be used. Differential correction may be supported by the GPS receiver 6, to provide greater accuracy; this enables very accurate fixes to be made. The computer 2 is preferably installed with a terminal program that captures the output of the GPS receiver 6 and writes it in ASCII format to a text file. The second clock 14 in the GPS receiver 6 is a standard quartz clock. However, the time of the second clock 14 is periodically corrected using information from the highly accurate atomic clocks of GPS satellites.
Therefore the second clock 14 in the GPS receiver 6 is usually more accurate than the first clock 12 in the camera 4.
Since the computer 2 is connected to the GPS receiver 6, it is possible to synchronise the system time of the computer 2 with GPS time. As images are captured to the memory of the computer, their file creation times are synchronised with GPS time.
The image collection system 1 can easily be transferred between vehicles. Also, the system can be constructed such that components such as the camera 4 can be upgraded individually. The image collection system 1 is operated under control of a computer program stored in and running on the computer 2.
Depending on the number of roads within the area, a 1.6 km square area can typically be surveyed in approximately one hour, generating 360 images. At 1600x1200 resolution (180 dpi), this equates to approximately 150 MB of storage.
Figure 4 shows a second embodiment of an image collecting system 22. Here, the computer 2, the camera 4, the GPS receiver 6, and the power supply 11 are housed in a weatherproof case 24. This case 24 is mounted on the roof of the vehicle 16.
There is a forward-facing aperture 23 in the weatherproof case 24, so that images of the vehicle's surroundings are captured. The computer 2 is remotely connected to a touch screen 26, so that the apparatus can be controlled from within the vehicle 16.
This embodiment is particularly suitable for use where image capture is required on a regular basis.
In another embodiment (not shown), only the camera 4 and the GPS receiver 6 are installed in the vehicle 16. The computer 2 is provided separately from the camera 4 and the GPS receiver 6, for example at a remote processing facility (not shown).
Whilst the vehicle 16 progresses along the route, the camera 4 and the GPS receiver 6 store the images and location data respectively. At the end of the route, the images and location data are transferred to the computer 2 for processing.
Table 1 shows typical location data output from the GPS receiver 6, when the NMEA 0183 Interface Standard is used. The location data is preferably contained in a log file, where the log file is in the form of a standard text file. Each line of the log file is a comma-delimited sentence. The first field of the sentence indicates the sentence type. In this example, only the $GPRMC sentences (Recommended Minimum Specific GNSS Data) are used for processing. In a $GPRMC sentence, the second field indicates the time that the location data corresponds to, in the format HHMMSS.XXX. The third field indicates the fix status, where 'A' indicates that the GPS satellite had a fix, and 'V' indicates an invalid fix. Only sentences with the 'A' fix status are used for processing. The fourth field indicates the latitude in degrees and minutes, in the format DDMM.XXXX. The fifth field indicates which hemisphere the latitude measurement is taken in: either North 'N' or South 'S'. The sixth field indicates the longitude, in the same format as the latitude. The seventh field indicates which hemisphere the longitude measurement is taken in: either West 'W' or East 'E'. The eighth field indicates the speed in knots. The ninth field indicates the bearing in degrees. The remaining fields are not processed.
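By way of illustration only, a minimal sketch of extracting the fields described above from a $GPRMC sentence follows (the function names are assumptions, and a production parser would also verify the NMEA checksum):

def parse_gprmc(sentence: str):
    # Returns the fields used for processing, or None for non-RMC sentences or invalid ('V') fixes.
    fields = sentence.strip().split(',')
    if len(fields) < 9 or not fields[0].endswith('RMC') or fields[2] != 'A':
        return None

    def dm_to_decimal(value: str, hemisphere: str) -> float:
        # NMEA latitude is DDMM.XXXX and longitude DDDMM.XXXX: the last two digits
        # before the decimal point are minutes, the rest are degrees.
        dot = value.index('.')
        degrees = float(value[:dot - 2])
        minutes = float(value[dot - 2:])
        decimal = degrees + minutes / 60.0
        return -decimal if hemisphere in ('S', 'W') else decimal

    return {
        'time': fields[1],                                 # HHMMSS.XXX, GPS (UTC) time
        'latitude': dm_to_decimal(fields[3], fields[4]),   # signed decimal degrees (+N, -S)
        'longitude': dm_to_decimal(fields[5], fields[6]),  # signed decimal degrees (+E, -W)
        'speed_knots': float(fields[7]) if fields[7] else None,
        'bearing_deg': float(fields[8]) if fields[8] else None,
    }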
The images are preferably output in Exif II JPEG format. The file header contains the filename, the creation time and date and the modification time and date.
Processing software is stored in a non-volatile memory (not shown) and is run by the computer 2. It collates the images and location data, so that each image is referenced to a location. The processing software is provided with the following information before the collation process begins: the location of the GPS log file; the directory containing the captured images; an output file location and name; which date/time (creation or modification) should be used for the images; and a clock correction factor.
The clock correction factor is used in cases where there may be an offset between the first clock 12 on the camera 4 and the second clock 14 on the GPS receiver 6.
The clock correction factor is simply the difference, in seconds, between the two clocks. Alternatively, the first and second clocks 12, 14 may be synchronised with each other before the capture process via the computer 2. However, even if the clocks 12, 14 are synchronised with one another, the image creation time may not exactly correspond with the times in the GPS log files due to the finite time taken to capture an image and write it to memory. The time taken to write the image to memory depends on the resolution of the image. It is therefore preferable that a clock correction factor is always used.
The correction factor may be determined in any suitable way. For instance, by taking a digital image of the computer screen with the (GPS synchronised) time displayed on the screen, the offset can be determined. The correction factor can then be obtained by comparing the time displayed in the image with the image modification/creation time shown on the camera 4. Alternatively a route can be driven and images captured at known locations. After processing, the calculated locations of the images can be compared with the actual locations and the correction factor obtained.
Operation is as follows.
The processing software firstly opens the output file in ASCII CSV format.
The first image is located, and its creation time or modification time, depending on which was chosen before the start of the collation process, is obtained. This time is adjusted using the clock correction factor. The result of this is that this adjusted time and the times indicated in the GPS log file are referenced to the same clock 14.
The computer 2 searches through the GPS log file to find the sentence with the time closest to the adjusted image creation or modification time. To do this, all the times are rounded to the nearest second. If no exact match is found, then one second is subtracted from the adjusted image creation or modification time and the process is repeated. When an exact match is found, this is deemed to be the closest time match. Then, the result is fed to the output file. The resulting entry in the output file contains the location information contained in the chosen GPS sentence in the GPS log file, and the filename of the image. The entry also contains some information from the GPS log file, reformatted for ease of interpretation. These are: latitude reformatted into decimal degrees (DD.DDDDDDDD); longitude reformatted into decimal degrees; and time reformatted to HH:MM:SS. The reformatted latitude and longitude are also given a positive or negative sign depending on their hemisphere. A positive sign is given for North and East respectively. A negative sign is given for South and West respectively.
The above process is repeated for every image. When the processing is complete, the output file can be displayed or otherwise provided as an output. An example of an output file is shown in Table 2. Preferably, a summary is also displayed identifying the number of records processed, the variables used, and giving an option to save the information as a text file.
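A minimal sketch of this matching and output step is given below (the data structures and function names are assumptions; latitude and longitude are taken to be already converted to signed decimal degrees as described above):

import csv
from datetime import datetime, timedelta

def match_image_to_fix(image_time: datetime, clock_correction_s: float, fixes: dict):
    # fixes maps a datetime, rounded to the nearest second, to a parsed location
    # record such as {'latitude': 51.5, 'longitude': -0.1, 'bearing_deg': 270.0}.
    adjusted = image_time + timedelta(seconds=clock_correction_s)
    t = datetime.fromtimestamp(round(adjusted.timestamp()))
    for _ in range(3600):                      # give up after an hour of searching backwards
        if t in fixes:
            return t, fixes[t]                 # deemed the closest time match
        t -= timedelta(seconds=1)              # no exact match: subtract one second and retry
    raise LookupError('no location fix found near the image time')

def write_output_file(path: str, matched: list):
    # matched is a list of (image_filename, fix_time, fix_record) tuples.
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['filename', 'time', 'latitude', 'longitude', 'bearing_deg'])
        for filename, fix_time, fix in matched:
            writer.writerow([filename,
                             fix_time.strftime('%H:%M:%S'),
                             f"{fix['latitude']:.8f}",
                             f"{fix['longitude']:.8f}",
                             fix.get('bearing_deg')])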
The resulting output file, or database, comprising plural records each containing location information and image data, can be used in a number of ways.
The output file can be used as a source to populate the Exif header fields of each image with all the location information corresponding to that image. Each image file then contains all the information needed to identify where it was captured and in what direction it was captured.
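A minimal sketch of embedding the location fields into the Exif header is shown below; it assumes the third-party piexif library (the patent does not name a tool), and converts signed decimal degrees into the degree/minute/second rationals used by the Exif GPS tags:

import piexif  # third-party library, used here purely as an illustration

def to_dms_rationals(decimal_degrees: float):
    # Exif GPS tags store latitude/longitude as three rationals: degrees, minutes, seconds.
    value = abs(decimal_degrees)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = round((value - degrees - minutes / 60.0) * 3600 * 100)
    return ((degrees, 1), (minutes, 1), (seconds, 100))

def embed_location(jpeg_path: str, lat: float, lon: float, bearing_deg: float):
    exif = piexif.load(jpeg_path)
    exif['GPS'] = {
        piexif.GPSIFD.GPSLatitudeRef: b'N' if lat >= 0 else b'S',
        piexif.GPSIFD.GPSLatitude: to_dms_rationals(lat),
        piexif.GPSIFD.GPSLongitudeRef: b'E' if lon >= 0 else b'W',
        piexif.GPSIFD.GPSLongitude: to_dms_rationals(lon),
        piexif.GPSIFD.GPSImgDirectionRef: b'T',                    # bearing relative to true north
        piexif.GPSIFD.GPSImgDirection: (int(bearing_deg * 100), 100),
    }
    piexif.insert(piexif.dump(exif), jpeg_path)                    # rewrite the file in place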
The output file can be integrated with a satellite navigation system (not shown). As a driver approaches a location for which an image is available, the image appears on the screen. This can allow the user to be shown which of plural lanes is appropriate, for instance. The satellite navigation system can be set to display only points of interest, such as junctions.
The output file can be integrated with journey planners, such as the online journey planner provided by the Automobile Association (www.theaa. com). When the written directions apply to a location in the output file, the relevant image can be -11 - shown. The driver can thus experience a virtual journey, before starting the actual journey.
The output file can be used in Geographic Information Systems (GISs), for example web-based GISs.
The output file may also be used for mobile applications. Since the images are in Exif II JPEG format, they can also be easily re-rendered in different resolutions and sizes. This allows them to be optimised for specific applications. For example images can be converted to lower resolution and smaller size so that they can be downloaded over a mobile data link (such as GPRS or 3G) to a mobile device.
The output file can also be integrated with digital maps, such as Microsoft's AutoRoute™. The digital map displays the images' locations, for instance using an icon or arrow symbol. Preferably, the digital map also displays the images' orientations, such as by using an arrow icon.
If processing is carried out in the field and the location of the digital images is superimposed on to a digital map by importing the output file, a survey team can identify easily, during or immediately following an operating session, whether the required number of images has been captured, all roads have been surveyed, and so on. The output file (for instance in standard ASCII CSV text format) can be transferred from the field via a mobile data link (such as GPRS or 3G) to a head office system, thereby allowing head office staff to track progress.
When the vehicle returns to its base, the digital images, together with the output reference file, can be transferred via removable media.
The images can be enhanced with annotations, to increase their usefulness.
As the bearing of the vehicle at the time that each image was produced is known, a compass graphic indicating the direction in which the image was taken can be superimposed on the image. Preferably, the bearing is given to the nearest 45°, and corresponds to a compass point. However, any form of compass graphic can be used, and the bearing can be presented to any required resolution.
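A minimal sketch of reducing a bearing to the nearest 45° compass point for such a graphic (the function name is illustrative):

def compass_point(bearing_deg: float) -> str:
    # Round the bearing to the nearest 45 degrees and map it to a compass point.
    points = ['N', 'NE', 'E', 'SE', 'S', 'SW', 'W', 'NW']
    return points[round(bearing_deg % 360 / 45) % 8]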
The images can be annotated with location specific information. Figure 5 shows an example of such an image. Annotation may be provided manually. It can instead be provided in a fully or partly automated fashion.
The coordinates of the field of view of the images can be easily defined. Figure 6a shows a plan view of a vehicle 16 approaching a junction, as an image is captured.
Images are annotated automatically as follows. After isolating the data representing an image and the corresponding location and orientation data, the data is processed by the computer 2, or by another computer. The computer uses image analysis software to identify significant objects in the image. The boundary of the field of view 28 of the image is defined by the camera settings, and has the same shape and relative position for each image. All points within the boundary of the field of view 28 are visible in the captured image. These points are analysed by image analysis software, as is known in the art, to identify individual buildings. Typically, only buildings are identified, although other objects such as statues, parks and gateways may also be identified. This analysis typically involves identifying lines in the image, and identifying building outlines from those lines and their intersections. The computer then uses the location and orientation data along with the object identification data to provide data identifying the location of the one or more objects. The locations are identified through scene analysis, using vanishing points, vanishing lines and the like. Distances to objects may be estimated in any suitable way, using people as reference objects if appropriate. If a person standing next to a building has a certain height in pixels, then the distance of that person, and thus the building, from the camera can be estimated. The computer then uses the object location information to interrogate an external database to obtain information about the objects. For instance, a database such as that behind www.multimap.co.uk can be used to identify what use buildings have, identifying shop names and opening times and the like. This information is then used by the computer to annotate the relevant parts of the image, before providing the annotated image.
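A minimal sketch of the person-as-reference distance estimate described above, using the standard pinhole camera relation (the assumed person height and the focal length in pixels are illustrative values):

def distance_to_object_metres(reference_height_px: float,
                              focal_length_px: float,
                              reference_height_m: float = 1.7) -> float:
    # Pinhole model: height_in_pixels / focal_length = real_height / distance,
    # so distance = real_height * focal_length / height_in_pixels.
    return reference_height_m * focal_length_px / reference_height_px

# Example: a 1.7 m person imaged 85 pixels tall with a 1700 pixel focal length
# is estimated to be about 34 m from the camera.
print(distance_to_object_metres(85.0, 1700.0))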
Figure 6b shows an image that has been automatically annotated using the local information. The local information can be formatted before being used to annotate the image.
Records produced in the manner described above can be used to advantage in many ways. They can be used with web based information services, such as www.streetmap.co.uk, www.multimap.co.uk, 192.com, etc. They can also be used in GPS-based Navigation systems, including in-car, handheld and web-based systems. They can be used also to enhance existing mobile location-based services.
Examples include Vodafone's™ 'find and seek' service. Providing such services with ground level images obtained as above can increase their usefulness. The invention can also be used for the enhancement of professional GIS systems such as MapInfo™, MapPoint™, etc. End-user applications of the records include local authorities and emergency services. Here, the images can be used to identify the location of an incident, before despatching fire or ambulance units. This also reduces the need for on-foot surveys.
Estate agents and house buyers could use the records to identify what an area looks like before visiting it. Travel guides, both online and printed versions, can also benefit. Travel agents and holidaymakers can benefit for similar reasons.
The records can be used for education, since a student can learn about an area without visiting it. For similar reasons, the records are of potentially high value to people working in architecture, civil engineering, land surveying, property valuation and the like.
As well as enabling people to find images of a place of interest, the records can be pushed by users. For instance, businesses can use images from selected records to identify their offices, branches or shops to their customers and suppliers, and to offer directions using the images. Business information service providers can use the images to show the locations of businesses they provide information on.
Sales staff can use images to identify potential customer businesses when planning sales routes (FMCG type sales, calling into outlets, pharmaceutical, visiting surgeries, etc.).
Publishers of, for instance, newspapers can gain access to local photographs through the records. Outdoor advertising companies can use images to identify both existing and potential new sites.
In the mobile domain, mobile phone and PDA users can annotate images with notes and then send them as an MMS. An example annotation could be "meet you here at 7pm" with a suitable pointer.
Mobile and web based Directory Enquiries services could use a record to send an annotated image either via email or MMS, to a mobile device, providing an added useful service.
Although the present invention has been described with respect to the above embodiments, it should be apparent to those skilled in the art that modifications can be made without departing from the scope of the invention.
For instance, the positioning device 6 may be a transceiver or transmitter whose location can be determined externally, for example by a mobile telephone or other network. In this case, the location information may be obtained by the computer 2 after completing an operating session.
Instead of the location of the positioning device 6 at the time closest to the image capture time being taken as the actual location, the location may instead be estimated. For example, two locations either side of the time of capture of an image could be interpolated to estimate the location. This interpolation may utilise the time of image capture and the times of the location fixes.
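A minimal sketch of such an interpolation is given below (function and parameter names are illustrative; it assumes the two fixes are close enough together for linear interpolation of latitude and longitude to be adequate):

def interpolate_location(t_image: float,
                         t_before: float, lat_before: float, lon_before: float,
                         t_after: float, lat_after: float, lon_after: float):
    # Times in seconds; latitude and longitude in signed decimal degrees.
    w = (t_image - t_before) / (t_after - t_before)
    lat = lat_before + w * (lat_after - lat_before)
    lon = lon_before + w * (lon_after - lon_before)
    return lat, lon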
Instead of using an internal clock in the camera 4 to determine the time of image capture, a clock of the computer 2 may be used. This may require the image data to be passed to the computer 2 from the camera quickly, so that the computer can correctly determine the time at which the image was taken. The clock in the computer 2 may be synchronised with the time in the positioning device 6, especially if it is a GPS receiver. If the positioning device feeds out location information immediately, the clock of the computer may be used to determine the image capture times and the location fix times.
As well as or instead of a forward facing camera, a camera directed differently may be used. This may be facing against the direction of travel for instance.

Claims (29)

  1. An image collecting system comprising: image collecting apparatus comprising: a camera operable to produce data representing images; and a positioning device; whereby the image collecting apparatus is installable with a vehicle so that the camera is directed in a generally horizontal direction so as to be able to capture images of the vehicle's surroundings whilst the vehicle progresses along a route; and a processor arranged to receive the image data produced by the image collecting apparatus and to receive information identifying the location of the image collecting apparatus at plural times, and to generate plural records each comprising data representing an image and information identifying the location of the image collecting apparatus at or around the time of image data production.
  2. A system as claimed in claim 1, wherein the image collecting apparatus and the processor are coupled together or integrated together.
  3. A system as claimed in either preceding claim, wherein the positioning device is a GPS receiver.
  4. A system as claimed in claim 3, wherein the GPS receiver supports differential correction.
  5. A system as claimed in any preceding claim, wherein the processor is part of a laptop, tablet, notebook or personal computer.
  6. A system as claimed in any preceding claim, wherein the positioning device is operable to provide orientation information.
  7. A system as claimed in any preceding claim, wherein the camera is programmed to produce data representing an image automatically at predetermined time intervals.
  8. A system as claimed in any preceding claim, wherein the camera is operable to produce data representing an image in response to and immediately following a user input.
  9. A system as claimed in any preceding claim, wherein the camera is programmed with a destination at which to write the image data.
  10. A system as claimed in any preceding claim, wherein the camera is controlled by the processor.
  11. A method of collecting image data, the method comprising: installing a camera and a positioning device with a vehicle, whereby the camera is directed in a generally horizontal direction; progressing the vehicle along a route; using the camera to produce data representing images of the vehicle's surroundings at plural points along the route; acquiring location data identifying the location of the positioning device at plural points along the route; and transferring the image data and the location data for processing.
  12. A method as claimed in claim 11, wherein the information identifying the location of the image collecting apparatus includes orientation information.
  13. A method as claimed in claim 11 or claim 12, comprising transferring the image data from the camera to a computer as the image data is produced.
  14. A method of generating records each comprising location data and data representing an image, data representing an image having been produced by a camera at plural image data production times, and the location data identifying the location of a positioning device at plural location fix times whilst in close proximity to the camera, the method comprising: determining one or more location fix times that are closest to an image data production time; and generating a record containing the data representing an image that corresponds to the image data production time and information identifying the location of the positioning device at or around the image data production time.
  15. A method as claimed in claim 14, comprising using an offset to compensate any difference between a clock used to determine the location fix times and a clock used to determine the image data production times.
  16. A method as claimed in claim 14 or claim 15, wherein said information identifying the location of the positioning device at or around the image data production time comprises location information corresponding to a location fix time that is closest to the image data production time.
  17. A method as claimed in claim 14 or claim 15, wherein providing said information identifying the location of the positioning device comprises using location information corresponding to at least two location fix times that are closest to an image data production time to estimate the location of the positioning device at the image data production time.
  18. A method as claimed in any of claims 11 to 17, wherein the positioning device is a GPS receiver.
  19. Apparatus for generating records each comprising location data and data representing an image, data representing an image having been produced by a camera at plural image data production times, and the location data identifying the location of a positioning device at plural location fix times whilst in close proximity to the camera, the apparatus comprising: means for determining one or more location fix times that are most closely related to an image data production time; and means for generating a record containing the data representing an image that corresponds to the image data production time and information identifying the location of the positioning device at or around the image data production time.
  20. Apparatus as claimed in claim 19, arranged to use an offset to compensate any difference between a clock used to determine the location fix times and a clock used to determine the image data production times.
  21. Apparatus as claimed in claim 19 or claim 20, wherein said information identifying the location of the positioning device at or around the image data production time comprises location information corresponding to a location fix time that is closest to the image data production time.
  22. Apparatus as claimed in claim 19 or claim 20, arranged to use location information corresponding to at least two location fix times that are most closely related to an image data production time to estimate the location of the positioning device at the image data production time.
  23. A database of records produced using the apparatus of any of claims 1 to 10 and 19 to 22.
  24. A database of records produced using the method of claim 14 or any claim dependent on claim 14.
  25. A database as claimed in claim 23 or claim 24, wherein one or more image data items comprises an image annotated with location specific information.
  26. A database as claimed in any of claims 23 to 25, wherein one or more image data items comprises an image annotated with orientation information.
  27. A method of providing an annotated image, the method comprising: receiving data representing an image, data identifying at least the location at which the image was produced, and orientation information; using a computer to process the image data to provide data identifying one or more objects in the image; using the computer to process the location data, the orientation information data and the object identification data to provide data identifying the location of the one or more objects; using a database and the object location data to obtain information about the one or more objects; and annotating the image with the information about the one or more objects, and providing the annotated image.
  28. Apparatus arranged to perform the method of claim 27.
  29. A computer program comprising instructions for a processor to carry out the steps of any of claims 11 to 18 and 27.
GB0428383A 2004-12-24 2004-12-24 System for the collection and association of image and position data Withdrawn GB2421653A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0428383A GB2421653A (en) 2004-12-24 2004-12-24 System for the collection and association of image and position data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0428383A GB2421653A (en) 2004-12-24 2004-12-24 System for the collection and association of image and position data

Publications (2)

Publication Number Publication Date
GB0428383D0 GB0428383D0 (en) 2005-02-02
GB2421653A true GB2421653A (en) 2006-06-28

Family

ID=34130946

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0428383A Withdrawn GB2421653A (en) 2004-12-24 2004-12-24 System for the collection and association of image and position data

Country Status (1)

Country Link
GB (1) GB2421653A (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1734341A1 (en) * 2005-06-14 2006-12-20 LG Electronics Inc. Matching camera-photographed image with map data in portable terminal and travel route guidance method
EP1903310A2 (en) 2006-09-19 2008-03-26 Reigncom Ltd. Vehicle navigation system including camera unit
GB2450371A (en) * 2007-06-22 2008-12-24 Reuben Wilcock Associating position data with camera images
EP2153387A1 (en) * 2007-04-26 2010-02-17 Vinertech Pty Ltd Collection methods and devices
DE102009006471A1 (en) * 2009-01-28 2010-09-02 Audi Ag Method for operating a navigation device of a motor vehicle and motor vehicle therefor
EP2260457A2 (en) * 2008-02-26 2010-12-15 Microsoft Corporation System for logging life experiences using geographic cues
CN102629985A (en) * 2011-02-04 2012-08-08 佳能株式会社 Information processing apparatus and control method therefor
WO2013068145A1 (en) * 2011-11-08 2013-05-16 National University Of Ireland Maynooth A synchronisation system
US8612134B2 (en) 2010-02-23 2013-12-17 Microsoft Corporation Mining correlation between locations using location history
US8719198B2 (en) 2010-05-04 2014-05-06 Microsoft Corporation Collaborative location and activity recommendations
US9009177B2 (en) 2009-09-25 2015-04-14 Microsoft Corporation Recommending points of interests in a region
US9063226B2 (en) 2009-01-14 2015-06-23 Microsoft Technology Licensing, Llc Detecting spatial outliers in a location entity dataset
US9261376B2 (en) 2010-02-24 2016-02-16 Microsoft Technology Licensing, Llc Route computation based on route-oriented vehicle trajectories
US9536146B2 (en) 2011-12-21 2017-01-03 Microsoft Technology Licensing, Llc Determine spatiotemporal causal interactions in data
US9593957B2 (en) 2010-06-04 2017-03-14 Microsoft Technology Licensing, Llc Searching similar trajectories by locations
US9683858B2 (en) 2008-02-26 2017-06-20 Microsoft Technology Licensing, Llc Learning transportation modes from raw GPS data
US9754226B2 (en) 2011-12-13 2017-09-05 Microsoft Technology Licensing, Llc Urban computing of route-oriented vehicles
GB2549384A (en) * 2016-03-21 2017-10-18 Ford Global Tech Llc Inductive loop detection systems and methods
US10288433B2 (en) 2010-02-25 2019-05-14 Microsoft Technology Licensing, Llc Map-matching for low-sampling-rate GPS trajectories

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0933277A (en) * 1995-07-18 1997-02-07 Matsushita Electric Ind Co Ltd Route guidance apparatus
JPH10176930A (en) * 1996-12-16 1998-06-30 Mazda Motor Corp Information reproducing apparatus and information recording medium for vehicle
JPH1194571A (en) * 1997-09-18 1999-04-09 Toshiba Corp Recording and reproducing device, recording and reproducing method and recording medium
EP0921376A1 (en) * 1997-12-03 1999-06-09 Mixed Reality Systems Laboratory Inc. Panoramic image acquisition system
JPH11160080A (en) * 1997-12-01 1999-06-18 Harness Syst Tech Res Ltd Mobile body information system
JPH11304500A (en) * 1998-04-17 1999-11-05 Sony Corp On-vehicle information processing equipment and automobile
JP2000101884A (en) * 1998-09-17 2000-04-07 Fuji Photo Film Co Ltd Electronic camera
US6222583B1 (en) * 1997-03-27 2001-04-24 Nippon Telegraph And Telephone Corporation Device and system for labeling sight images
US6233523B1 (en) * 1997-10-02 2001-05-15 Ibs Integrierte Business Systeme Gmbh Method of collection and linking of positional data from satellite localization and other data
CN1353393A (en) * 2001-12-18 2002-06-12 中国科学院测量与地球物理研究所 Scenery data acquiring system for obtaining time information and coordinate information at same time
US20020090143A1 (en) * 2001-01-11 2002-07-11 Takaaki Endo Image processing apparatus, method of processing images, and storage medium
JP2004048427A (en) * 2002-07-12 2004-02-12 Koncheruto:Kk Digital camera with photographic position and azimuth or portable telephone system with image
JP2004120397A (en) * 2002-09-26 2004-04-15 Fuji Photo Film Co Ltd Method, device, and program for outputting image
JP2004257979A (en) * 2003-02-27 2004-09-16 Sanyo Electric Co Ltd Navigation apparatus
WO2004095374A1 (en) * 2003-04-21 2004-11-04 Nec Corporation Video object recognition device and recognition method, video annotation giving device and giving method, and program

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0933277A (en) * 1995-07-18 1997-02-07 Matsushita Electric Ind Co Ltd Route guidance apparatus
JPH10176930A (en) * 1996-12-16 1998-06-30 Mazda Motor Corp Information reproducing apparatus and information recording medium for vehicle
US6222583B1 (en) * 1997-03-27 2001-04-24 Nippon Telegraph And Telephone Corporation Device and system for labeling sight images
JPH1194571A (en) * 1997-09-18 1999-04-09 Toshiba Corp Recording and reproducing device, recording and reproducing method and recording medium
US6233523B1 (en) * 1997-10-02 2001-05-15 Ibs Integrierte Business Systeme Gmbh Method of collection and linking of positional data from satellite localization and other data
JPH11160080A (en) * 1997-12-01 1999-06-18 Harness Syst Tech Res Ltd Mobile body information system
EP0921376A1 (en) * 1997-12-03 1999-06-09 Mixed Reality Systems Laboratory Inc. Panoramic image acquisition system
JPH11304500A (en) * 1998-04-17 1999-11-05 Sony Corp On-vehicle information processing equipment and automobile
JP2000101884A (en) * 1998-09-17 2000-04-07 Fuji Photo Film Co Ltd Electronic camera
US20020090143A1 (en) * 2001-01-11 2002-07-11 Takaaki Endo Image processing apparatus, method of processing images, and storage medium
CN1353393A (en) * 2001-12-18 2002-06-12 中国科学院测量与地球物理研究所 Scenery data acquiring system for obtaining time information and coordinate information at same time
JP2004048427A (en) * 2002-07-12 2004-02-12 Koncheruto:Kk Digital camera with photographic position and azimuth or portable telephone system with image
JP2004120397A (en) * 2002-09-26 2004-04-15 Fuji Photo Film Co Ltd Method, device, and program for outputting image
JP2004257979A (en) * 2003-02-27 2004-09-16 Sanyo Electric Co Ltd Navigation apparatus
WO2004095374A1 (en) * 2003-04-21 2004-11-04 Nec Corporation Video object recognition device and recognition method, video annotation giving device and giving method, and program

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2840358A1 (en) * 2005-06-14 2015-02-25 LG Electronics, Inc. Matching camera-photographed image with map data in portable terminal and travel route guidance method
US7728869B2 (en) 2005-06-14 2010-06-01 Lg Electronics Inc. Matching camera-photographed image with map data in portable terminal and travel route guidance method
EP1734341A1 (en) * 2005-06-14 2006-12-20 LG Electronics Inc. Matching camera-photographed image with map data in portable terminal and travel route guidance method
EP1903310A2 (en) 2006-09-19 2008-03-26 Reigncom Ltd. Vehicle navigation system including camera unit
EP1903310A3 (en) * 2006-09-19 2009-09-02 Reigncom Ltd. Vehicle navigation system including camera unit
EP2153387A4 (en) * 2007-04-26 2012-04-25 Vinertech Pty Ltd Collection methods and devices
EP2153387A1 (en) * 2007-04-26 2010-02-17 Vinertech Pty Ltd Collection methods and devices
GB2450371B (en) * 2007-06-22 2011-10-26 Reuben Wilcock Method and apparatus for sending positioning data to a connected apparatus
GB2450371A (en) * 2007-06-22 2008-12-24 Reuben Wilcock Associating position data with camera images
EP2260457A2 (en) * 2008-02-26 2010-12-15 Microsoft Corporation System for logging life experiences using geographic cues
EP2260457A4 (en) * 2008-02-26 2012-12-19 Microsoft Corp System for logging life experiences using geographic cues
US9683858B2 (en) 2008-02-26 2017-06-20 Microsoft Technology Licensing, Llc Learning transportation modes from raw GPS data
US9063226B2 (en) 2009-01-14 2015-06-23 Microsoft Technology Licensing, Llc Detecting spatial outliers in a location entity dataset
DE102009006471A1 (en) * 2009-01-28 2010-09-02 Audi Ag Method for operating a navigation device of a motor vehicle and motor vehicle therefor
US9009177B2 (en) 2009-09-25 2015-04-14 Microsoft Corporation Recommending points of interests in a region
US9501577B2 (en) 2009-09-25 2016-11-22 Microsoft Technology Licensing, Llc Recommending points of interests in a region
US8612134B2 (en) 2010-02-23 2013-12-17 Microsoft Corporation Mining correlation between locations using location history
US9261376B2 (en) 2010-02-24 2016-02-16 Microsoft Technology Licensing, Llc Route computation based on route-oriented vehicle trajectories
US10288433B2 (en) 2010-02-25 2019-05-14 Microsoft Technology Licensing, Llc Map-matching for low-sampling-rate GPS trajectories
US8719198B2 (en) 2010-05-04 2014-05-06 Microsoft Corporation Collaborative location and activity recommendations
US9593957B2 (en) 2010-06-04 2017-03-14 Microsoft Technology Licensing, Llc Searching similar trajectories by locations
US10571288B2 (en) 2010-06-04 2020-02-25 Microsoft Technology Licensing, Llc Searching similar trajectories by locations
US8804007B2 (en) 2011-02-04 2014-08-12 Canon Kabushiki Kaisha Information processing apparatus and control method therefor
CN102629985B (en) * 2011-02-04 2015-06-17 佳能株式会社 Information processing apparatus and control method therefor
US9584676B2 (en) 2011-02-04 2017-02-28 Canon Kabushiki Kaisha Information processing apparatus and control method therefor
EP2485469A1 (en) * 2011-02-04 2012-08-08 Canon Kabushiki Kaisha Information processing apparatus and control method therefor
CN102629985A (en) * 2011-02-04 2012-08-08 佳能株式会社 Information processing apparatus and control method therefor
US20140267798A1 (en) * 2011-11-08 2014-09-18 National University Of Ireland Maynooth Synchronisation system
WO2013068145A1 (en) * 2011-11-08 2013-05-16 National University Of Ireland Maynooth A synchronisation system
US9754226B2 (en) 2011-12-13 2017-09-05 Microsoft Technology Licensing, Llc Urban computing of route-oriented vehicles
US9536146B2 (en) 2011-12-21 2017-01-03 Microsoft Technology Licensing, Llc Determine spatiotemporal causal interactions in data
GB2549384A (en) * 2016-03-21 2017-10-18 Ford Global Tech Llc Inductive loop detection systems and methods

Also Published As

Publication number Publication date
GB0428383D0 (en) 2005-02-02

Similar Documents

Publication Publication Date Title
US9471986B2 (en) Image-based georeferencing
US7155336B2 (en) System and method for automatically collecting images of objects at geographic locations and displaying same in online directories
GB2421653A (en) System for the collection and association of image and position data
US9497581B2 (en) Incident reporting
US8897541B2 (en) Accurate digitization of a georeferenced image
US9535587B2 (en) System and method for displaying information in response to a request
US6604049B2 (en) Spatial information using system, system for obtaining information, and server system
US8356255B2 (en) Virtual white lines (VWL) for delimiting planned excavation sites of staged excavation projects
US8712689B2 (en) Method for computer-based determination of a position in a map, navigation device and mobile radio telephone
US8364397B2 (en) Pictorial navigation
AU2011211601B2 (en) Method for providing information on object within view of terminal device, terminal device for same and computer-readable recording medium
US20020116121A1 (en) Bundled map guide
EP2393076A2 (en) Navigable topological maps
US20090171980A1 (en) Methods and apparatus for real estate image capture
JP2007121528A (en) System and method for renewing map creation
US20080079808A1 (en) Method and device for collection and application of photographic images related to geographic location
WO2009130729A2 (en) Application for identifying, geo-locating and managing points of interest (poi)
US8688368B2 (en) Image-based localization for addresses
KR100810153B1 (en) Apparatus and method of guiding grave location using gps
JP2010164402A (en) Information collecting device, mobile terminal device, information center, and navigation system
JP2013152250A (en) Mobile terminal and information center
KR102336775B1 (en) Apparatus and method for generating thema route
US20120202516A1 (en) Apparatus and method for providing location-based data
Rahim et al. GNSS-and-GIS based android integration of mobile based virtual guide application ExpLahore for walled city Lahore, Pakistan
Gautam et al. Multimedia for mobile environment: image enhanced navigation

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)