WO2009126159A1 - Methods and apparatus for auditing signage - Google Patents


Info

Publication number: WO2009126159A1
Authority: WO (WIPO, PCT)
Prior art keywords: location, image, data, actual, signage
Application number: PCT/US2008/059952
Other languages: English (en)
Inventor: Michael Alan Hicks
Original Assignee: The Nielsen Company (U.S.)
Priority date: (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by The Nielsen Company (U.S.)
Priority to PCT/US2008/059952
Publication of WO2009126159A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce
    • G06Q 30/02: Marketing; Price estimation or determination; Fundraising
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval of structured data, e.g. relational data
    • G06F 16/29: Geographical information databases

Definitions

  • This disclosure relates generally to media exposure measurement systems and, more particularly, to methods and apparatus for auditing signage.
  • FIG. 1 is a block diagram of an example media site data collection system used to collect media site information as described herein.
  • FIG. 2 illustrates an example data structure that may be used to implement an example site database of FIG. 1.
  • FIG. 3 is a block diagram of an example apparatus that may be used to implement an example survey planner of the example media site data collection system of FIG. 1.
  • FIG. 4 is an example graphical user interface display that may be used to implement a display of the survey planner of FIGS. 1 and 3.
  • FIG. 5A depicts a block diagram of an example apparatus that may be used to implement an example mobile assisted survey tool of the example media site data collection system of FIG. 1.
  • FIG. 5B depicts a block diagram of an example user-interface apparatus of the example mobile assisted survey tool of FIG. 5A.
  • FIGS. 6A, 6B, 6C, and 6D illustrate example structural configurations that may be used to implement the example mobile assisted survey tool of FIGS. 1 and 5A.
  • FIG. 7 is a block diagram of an example apparatus that may be used to implement an example site data merger of the example media site data collection system of FIG. 1.
  • FIGS. 8A, 8B and 8C depict example user interfaces that may be implemented in connection with the example site data merger of FIG. 7 to show locations of surveyed media sites in connection with media site data and to enable users to verify and/or update the media site data.
  • FIGS. 9A and 9B illustrate an example data structure that may be used to represent media site data for use by the example site data merger of FIGS. 1 and 7.
  • FIG. 10 illustrates an example user interface that may be used to display alternative images of a surveyed media site and verify collected media site data.
  • FIGS. 11 and 12 are flowcharts representative of machine readable instructions that may be executed to implement the example media site data collection system of FIG. 1.
  • FIG. 13 is a flowchart representative of machine readable instructions that may be executed to implement the example survey planner of FIGS. 1 and 3.
  • FIG. 14 is a flowchart representative of machine readable instructions that may be executed to implement the example site data merger of FIGS. 1 and 7.
  • FIG. 15 is a flowchart representative of machine readable instructions that may be executed to implement the example mobile assisted survey tool of FIGS. 1, 5A and 6A-6D.
  • FIG. 16 illustrates a three-dimensional Cartesian coordinate system showing a plurality of dimensions that may be used to determine a location of a media site based on a location of an observer.
  • FIG. 17 is a block diagram of an example processor platform that may be used and/or programmed to implement the example processes of FIGS. 11-15 to implement any or all of the example media site data collection system, the example survey planner, the example site data merger and/or the example mobile assisted survey tool described herein.
  • FIG. 18 is a block diagram of an example auditing system used to audit signage.
  • FIG. 19 is a flowchart representative of machine readable instructions that may be executed to implement the example auditing system of FIG. 18.
  • FIG. 1 is a block diagram of an example media site data collection system used to collect media site information as described herein.
  • the example media site data collection system 100 collects data from one or more sources to form a database of media site data 105 (e.g., media site data records).
  • Example media sites include any number and/or types of indoor and/or outdoor advertisement sites (e.g., billboards, posters, banners, sides of buildings, walls of bus stops, walls of subway stations, walls of train stations, store name signage, etc.) and/or commercial sites or establishments (e.g., shopping centers, shopping malls, sports arenas, etc.).
  • the example media site database 105 includes one or more data records that store, among other things, values that represent the location of the media site (e.g., geo-code location data), values that represent the direction the media site faces, values that represent whether the media site is illuminated, and/or an owner name and owner ID number for that site, if available.
  • An example data structure 200 that may be used to implement the example site database 105 of FIG. 1 is described below in connection with FIG. 2.
  • Media site data stored in the example site database 105 of FIG. 1 may be used by, for example, outdoor advertisers to measure and/or establish with scientific and verifiable accuracy the reach of their outdoor media sites.
  • a study participant and/or respondent carries (or wears) a satellite positioning system (SPS) receiver (not shown) that periodically (e.g., every 4 to 5 seconds) acquires and receives a plurality of signals transmitted by a plurality of SPS satellites and uses the plurality of received signals to calculate a current geographic location (i.e., a position fix) for the respondent and a current time of day.
  • the SPS receiver sequentially stores the result of each position fix (e.g., geo-code location data and the time of day and, if desired, the date) for later processing by a computing device (not shown).
  • Example SPS receivers operate in accordance with one or both of the U.S. Global Positioning System (GPS) or the European Galileo System.
  • the computing device correlates and/or compares the stored sequence of position fixes with locations of media sites represented by the site database 105 to determine if one or more of the media sites should be credited as having been exposed to a person (i.e., whether it is reasonable to conclude that the wearer of the monitoring device (i.e., the SPS receiver) was exposed to the one or more media sites).
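The correlation step described above can be sketched as a simple distance test between stored position fixes and media site locations. The 100-meter crediting radius, the dictionary field names, and the use of a plain great-circle distance are illustrative assumptions for this sketch, not the crediting logic of the disclosure (which may also consider facing direction, clutter, dwell time, etc.):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def credited_sites(position_fixes, sites, radius_m=100.0):
    """Return IDs of media sites within radius_m of any stored position fix."""
    credited = set()
    for lat, lon in position_fixes:
        for site in sites:
            if haversine_m(lat, lon, site["lat"], site["lon"]) <= radius_m:
                credited.add(site["id"])
    return credited
```

A site is credited once if any fix in the respondent's trace falls within the radius; a production system would likely also de-duplicate exposures over time windows.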
  • the accuracy of media exposure measurement systems and methods depends upon the accuracy and/or completeness of the media site data stored in the site database 105. For example, if the location of a particular media site stored in the site database 105 is in error, the media site may be credited with exposures that have not actually occurred and/or may not be credited with exposures that have occurred. Accordingly, the example media site data collection system 100 of FIG. 1 is configured to use data from multiple sources to compile media site data that is as complete and as accurate as technically and/or practically feasible. For example, data from a first source (which may not be complete) may be combined with data from a second source (which may not be complete) to create a more complete site database record for a particular media site.
  • data from a media site source may be verified using data from another source to verify the accuracy of the data from the media site source and/or to modify and/or update the data in the media site source.
  • data from multiple sources may be combined, verified, modified and/or used in any number of ways.
  • Example media site data sources include, but are not limited to, government records 110, a mobile assisted survey tool (MAST) 111, third-party still and/or moving images 112 and/or one or more members of a field force 113 (e.g., using the MAST 111).
  • Example government records 110 include site licensing applications, documents and/or records (e.g., conditional use permits, plot plans, building permits, certificates of occupancy, etc.) that may be collected from, for instance, any number and/or type(s) of county and/or city offices responsible for enforcing building and/or zoning rules and/or regulations.
  • Government records 110 may also include media site data from surveys performed by a government agency and/or a government contractor.
  • the media site data collection system 100 is configured to be used to manually retrieve data pertaining to media sites from paper copies of the government records 110 and manually enter the retrieved data into the site database 105 via, for example, a user interface (e.g., provided by a site data merger 120).
  • the example MAST 111 of FIG. 1 is a mobile apparatus that includes an electronic range finder, a camera, an SPS receiver, and a compass such that a user of the MAST 111 can capture and/or record location information, direction-facing information, illumination information, and/or other data for a media site.
  • the captured media site data is downloaded from the example MAST 111 to the example site data merger 120 on an occasional, periodic, and/or real-time basis.
  • the example MAST 111 is used by members of the example field force 113 and can be implemented using 1) a platform that is attached and/or affixed to the top of an automobile, truck, etc., 2) a platform that can be hand-carried, and/or 3) a platform that is attached and/or affixed to a human-powered vehicle or low-speed vehicles (e.g., bicycles, kick scooters, Segway® personal transporters, etc.). Any number and/or type(s) of data transfer device(s), protocol(s) and/or technique(s) can be used to download captured media site data from the MAST 111 to the site data merger 120.
  • MAST 111 can be attached to the site data merger 120 using a universal serial bus (USB) connection, a Bluetooth® connection, and/or removable storage device drivers executing on the MAST 111 and/or the site data merger 120. While a single MAST 111 is illustrated in FIG. 1, in other example implementations any number and/or types of mobile assisted survey tools could be used to collect media site data. For example, multiple persons each having a MAST 111 could be used to collect media site data for a geographic area. An example manner of implementing the example MAST 111 is described below in connection with FIGS. 5A and 6A-6D.
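Because the MAST combines an SPS receiver, a range finder, and a compass, a sighted site's coordinates can in principle be derived from the observer's own position fix (the geometry suggested by FIG. 16). The sketch below uses a flat-earth local-tangent approximation; the function name and the approximation itself are assumptions for illustration, not the method of the disclosure:

```python
import math

def project_site(obs_lat, obs_lon, range_m, bearing_deg):
    """Estimate a sighted media site's (lat, lon) from the observer's
    GPS fix, a range-finder distance in meters, and a compass bearing
    relative to true North, using a local flat-earth approximation."""
    m_per_deg_lat = 111_320.0  # approximate meters per degree of latitude
    b = math.radians(bearing_deg)
    d_north = range_m * math.cos(b)  # bearing 0 = due North
    d_east = range_m * math.sin(b)   # bearing 90 = due East
    lat = obs_lat + d_north / m_per_deg_lat
    lon = obs_lon + d_east / (m_per_deg_lat * math.cos(math.radians(obs_lat)))
    return lat, lon
```

The approximation is adequate at range-finder distances (tens to hundreds of meters); over longer baselines a proper geodesic calculation would be needed.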
  • third-party still and/or moving images 112 are electronically acquired from any number and/or type(s) of third parties and/or third party tools such as, for example, web sites, Google® Earth mapping service, Microsoft® Virtual Map and/or Pictometry® Electronic Field Study software.
  • the images 112 may be obtained in paper form and scanned into or otherwise converted to an electronic format suitable for use by the example site data merger 120.
  • the example images 112 are provided for use by the site data merger 120 and/or a user of the site data merger 120 to verify and/or modify media site information and/or data collected by the example MAST 111.
  • the example images 112 may be any type(s) of images including, for example, photographs (e.g., satellite photographs, aerial photographs, terrestrial photographs, etc.), illustrations and/or computer-generated images.
  • the example field force 113 of FIG. 1 includes one or more persons that physically survey a designated market area (DMA). Such persons may be directly employed by a company operating, utilizing and/or implementing the site database 105, and/or may include contractors hired by the company.
  • members of the example field force 113 visit media sites to collect media site data using the example MAST 111 or an apparatus substantially similar to the MAST 111, which may be a pedestrian-based MAST or a vehicular-based MAST.
  • the members of the field force 113 can use any automated, electronic and/or manual tools and/or methods other than the MAST 111 to collect the media site data.
  • the example media site data collection system 100 includes the site data merger 120.
  • the example site data merger 120 receives data from (and/or inputs based upon) one or more of the media site data sources 110-113 to form the media site data stored in the example site database 105.
  • the site data merger 120 is configured to provide one or more user interfaces that allow users to 1) input media site data collected from government records 110, 2) import data from the example MAST 111, and/or 3) overlay media site data (e.g., collected using the MAST 111 and/or collected from other sources such as the government records 110) on top of one or more of the example images 112.
  • Example implementations of user interfaces to allow a user to overlay the media site data on top of one or more of the example images 112 are described below in connection with FIGS. 8A-8C and 10.
  • the user interfaces are implemented using the Google® Earth mapping service tool. In other example implementations, any other mapping tool may alternatively be used including, for example, Pictometry® Electronic Field Study software or Microsoft® Virtual Earth.
  • the user interfaces of FIGS. 8A-8C and 10 also enable a user to verify the accuracy of collected media site data and, if necessary, modify and/or correct the media site data based upon the images 112.
  • While the media site data collection system 100 is described herein as having a single site data merger 120 as illustrated in FIG. 1, in other example implementations, the media site data collection system 100 can be implemented using two or more site data mergers 120 using two or more computing platforms that operate and/or interact with the example site database 105.
  • a first site data merger can be used to enter media site data collected from the government records 110
  • a second site data merger can be used to import media site data collected using the MAST 111
  • a third site data merger can be used to display, verify and/or modify collected media site data using, for example, the third-party images 112.
  • the example media site data collection system 100 includes a survey planner 130.
  • a detailed block diagram of an example implementation of the survey planner 130 is described below in connection with FIG. 3.
  • the example survey planner 130 uses data from the example government records 110 and/or the example images 112 to categorize different geographic areas as dense areas or sparse areas (e.g., dispersed areas).
  • the planner can exclude areas in which zoning prohibits outdoor advertising.
  • the geographic areas are categorized in this manner to determine how they will be surveyed.
  • Pedestrian-based MAST's or similar MAST's may be used by members of the field force 113 that move by walking, riding a bike, or using any other transport equipment (e.g., a Segway®, a kick scooter, etc.) that is relatively more maneuverable in a dense area than a vehicle and more appropriate for use in a pedestrian environment (e.g., sidewalks, walkways, bike paths, etc.).
  • Vehicular-based MAST's are mounted on motorized vehicles (e.g., automobiles, cars, trucks, etc.).
  • Dense areas are areas characterized by relatively more media sites for a given measured area than sparse areas. Dense areas may also be areas having relatively more activity (e.g., a high traffic count) and/or which are relatively more densely populated with people, structures, advertisements, etc. than sparse areas, such that using a vehicular-based MAST would be difficult or impossible. For example, dense areas may include inner-city neighborhoods or business districts, shopping districts, indoor areas of commercial establishments, etc. The dense areas are surveyed using pedestrian-based MAST's because pedestrians are relatively more agile and flexible for maneuvering and positioning cameras in a densely populated or activity-rich area than are vehicles. Sparse areas are areas characterized by relatively fewer media sites for a given measured area.
  • Sparse areas may also be areas characterized by relatively less activity (e.g., a low traffic count) and/or which are relatively less densely populated with people, structures, advertisements, etc. than dense areas.
  • sparse areas may include rural roads, highway areas, etc.
  • the sparse areas are surveyed using vehicular-based MAST's because vehicles can cover larger geographic areas faster than pedestrians.
  • geographic areas that might otherwise be categorized as sparse areas may nonetheless be surveyed using pedestrian-based MAST's if, for example, characteristics (e.g., traffic, a low speed limit, etc.) make it difficult for an automobile to be maneuvered while the MAST 111 is operated and/or the speed at which the traffic is moving might limit the effectiveness of the MAST 111.
  • the example survey planner 130 of FIG. 1 is configured to present a user interface (e.g., the user interface 400 of FIG. 4) that has zoning and traffic count data overlaid on top of a map and/or image of a geographic area.
  • a traffic count is a count of all movements for cars, trucks, buses and/or pedestrians per geographic area for a given duration.
  • the areas that are, for example, zoned for commercial and/or retail use and have high traffic counts are designated as dense areas. Once dense areas and sparse areas are identified, they can be sub-divided and/or assigned to particular members of the field force 113 for surveying.
  • members of the field force 113 assigned to survey sparse areas will do so using vehicle-based MAST's (e.g., the MAST 111 of FIGS. 6A-6D), and members of the field force 113 assigned to survey dense areas will do so using pedestrian-based MAST's.
  • FIG. 2 illustrates an example data structure 200 that may be used to implement a media site data record of the example site database 105 of FIG. 1 for a media site.
  • the example data structure 200 includes a panel identifier field 204.
  • the example panel identifier field 204 of FIG. 2 includes a value and/or alphanumeric string that uniquely identifies the media site and is used to associate the media site with a DMA.
  • the example data structure 200 includes an owner name field 208.
  • the example owner name field 208 includes an alphanumeric string that represents the owner of the media site.
  • the example data structure 200 includes an on-road field 212.
  • the example on-road field 212 includes a flag that can have one of two values (e.g., YES or NO) that represents whether the media site is along a roadway.
  • the example primary road field 216 includes an alphanumeric string that represents the name of a road. If the media site is not along a road (e.g., the on-road field 212 contains a NO flag value), the primary road field 216 may be left blank.
  • the example data structure 200 includes a cross street field 220.
  • the example cross street field 220 includes an alphanumeric string that represents the name of the nearest crossroad to the media site. If the media site is not along a road (e.g., the on-road field 212 contains a NO flag value), the cross street field 220 may be left blank.
  • the example data structure 200 includes a direction facing field 224.
  • the example direction facing field 224 includes a value that represents the direction towards which the media site is facing (e.g., a number in degrees).
  • the example media site data collection system 100 of FIG. 1 determines the media site facing direction relative to true North (e.g., calculated from the geographic offset from magnetic North).
  • the direction towards which a media site is facing can be calculated using a line drawn perpendicular to the face of the media site and outwards or away from the media site.
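Converting a magnetic compass reading to the true-North facing direction described above amounts to adding the local magnetic declination. A minimal sketch (the sign convention shown, declination positive when magnetic North lies east of true North, is an assumption of this sketch):

```python
def true_heading(magnetic_deg, declination_deg):
    """Convert a magnetic compass reading to a true-North heading.
    declination_deg is positive when magnetic North is east of true North."""
    return (magnetic_deg + declination_deg) % 360.0
```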
  • the example data structure 200 includes a GPS North-South coordinate field 228 and a GPS East-West coordinate field 232.
  • the example North-South coordinate field 228 contains a value that represents the North-South location of the media site as determined from received GPS signals (i.e., the latitude of the media site).
  • the example East-West coordinate field 232 contains a value that represents the East-West location of the media site as determined from received GPS signals (i.e., the longitude of the media site).
  • the example data structure 200 includes an estimated position error field 236.
  • the example estimated position error field 236 includes a value that represents the potential error in the coordinates represented by the example coordinate fields 228 and 232 (e.g., in units of feet or degrees).
  • the value stored in the estimated position error field 236 may be computed using any algorithm(s), logic and/or method(s) based on, for example, the number and/or strength of received GPS signals. For example, if a GPS position fix was determined using relatively few GPS signals or GPS signals with low signal strength, the error in location may be larger.
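One way to realize such a heuristic, purely as an illustration (the constants and penalty terms below are invented for this sketch and are not taken from the disclosure), is to grow a nominal error figure as the satellite count falls or the mean signal strength drops:

```python
def estimate_position_error_m(num_satellites, mean_snr_db):
    """Toy estimate of GPS position error in meters: fewer satellites
    or weaker signals yield a larger error value (assumed constants)."""
    base = 5.0  # nominal error in meters with a strong fix
    sat_penalty = max(0, 8 - num_satellites) * 3.0
    snr_penalty = max(0.0, 40.0 - mean_snr_db) * 0.5
    return base + sat_penalty + snr_penalty
```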
  • the example data structure 200 includes a side of road field 240.
  • the example side of road field 240 includes a flag that represents on which side of the primary road the media site is located. If the media site is not along a road (e.g., the on-road field 212 contains a NO flag value), the side of road field 240 may be left blank.
  • the example data structure 200 includes an angle to road field 244.
  • the example angle to road field 244 includes a value that represents (e.g., in degrees) the angle the media site faces relative to the road. If the media site is not along a road (e.g., the on-road field 212 contains a NO flag value), the angle to road field 244 may be left blank.
  • the example data structure 200 includes an illumination field 248.
  • the example illumination field 248 includes a value that represents the number of hours per day that the media site is illuminated (e.g., 0 hours, 12 hours, 18 hours, 24 hours, etc.).
  • the example data structure 200 includes a panel type field 252.
  • the example panel type field 252 includes a value and/or an alphanumeric string that represents a media site type (e.g., a billboard type, a bus-shelter type, an 8-sheet poster type, a 30-sheet poster type, a wall-mural type, a 3-D prop type, etc.).
  • the example data structure 200 includes a panel size field 256.
  • the example panel size field 256 includes a value that represents the size of the media site measured vertically, horizontally and/or diagonally (e.g., 6 feet, 24 feet, etc.).
  • the example data structure 200 includes a distance from road field 260.
  • the example distance from road field 260 includes a value that represents the distance of the media site from the primary road (e.g., in feet or meters). If the media site is not along a road (e.g., the on-road field 212 contains a NO flag value), the distance from road field 260 may be left blank.
  • the example data structure 200 includes a province name field 264.
  • the example province name field 264 includes an alphanumeric string that represents the name of the district, county, parish or province in which the media site is located.
  • the example data structure 200 includes a city name field 268.
  • the example city name field 268 includes an alphanumeric string that represents the name of the city in which the media site is located.
  • the example data structure 200 includes a secondary road field 272.
  • the example secondary road field 272 includes an alphanumeric string that represents the name of the secondary road from which the media site is visible.
  • the secondary road field 272 may be left blank.
  • the example data structure 200 includes a postal code field 276.
  • the example postal code field 276 includes an alphanumeric string that represents the postal code (e.g., a zipcode) for the geographic area in which the media site is located.
  • the example data structure 200 includes a clutter field 280.
  • the example clutter field 280 includes one or more alphanumeric strings that describe any obstructions that may impact viewing of the media site from the primary road for the media site. The obstructions can be evident from a digital image of the media site stored in association with the data structure 200 (e.g., as specified in a picture field 284).
  • the example data structure 200 includes a picture field 284.
  • the example picture field 284 includes one or more alphanumeric strings that represent the name of one or more digital image files. Additionally or alternatively, the contents of one or more digital image files may be stored directly within the picture field 284.
  • While the example data structure 200 is illustrated in FIG. 2 as having the data fields described above, in other example implementations, the example data structure 200 may be implemented using any number and/or type(s) of other and/or additional fields and/or data. Further, the fields and/or data illustrated in FIG. 2 may be combined, divided, omitted, re-arranged, eliminated and/or implemented in any of a variety of ways. For example, the secondary road field 272, the example postal code field 276 and/or the example clutter field 280 may be omitted from some implementations of the site database 105 and/or for some media sites. Moreover, the example data structure may include fields and/or data in addition to those illustrated in FIG. 2 and/or may include more than one of any or all of the illustrated fields and/or data.
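The record layout of FIG. 2 can be sketched as a simple typed structure. The Python field names and types below are assumptions mapped onto the numbered fields described above; optional fields default to None to reflect the "may be left blank" cases:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class MediaSiteRecord:
    """Sketch of the FIG. 2 media site record (names/types assumed)."""
    panel_id: str                              # field 204: unique identifier
    owner_name: str                            # field 208
    on_road: bool                              # field 212
    primary_road: Optional[str] = None         # field 216, blank if off-road
    cross_street: Optional[str] = None         # field 220
    direction_facing_deg: float = 0.0          # field 224, vs. true North
    lat: float = 0.0                           # field 228: GPS North-South
    lon: float = 0.0                           # field 232: GPS East-West
    est_position_error: float = 0.0            # field 236
    side_of_road: Optional[str] = None         # field 240
    angle_to_road_deg: Optional[float] = None  # field 244
    illumination_hours: int = 0                # field 248
    panel_type: str = ""                       # field 252
    panel_size: str = ""                       # field 256
    distance_from_road: Optional[float] = None # field 260
    province: str = ""                         # field 264
    city: str = ""                             # field 268
    secondary_road: Optional[str] = None       # field 272
    postal_code: str = ""                      # field 276
    clutter: Optional[List[str]] = None        # field 280
    pictures: Optional[List[str]] = None       # field 284: image file names
```

The example owner name and coordinates in any usage of this class are, of course, hypothetical.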
  • FIG. 3 is a block diagram of the example survey planner 130 of FIG. 1.
  • the example survey planner 130 includes a data collector 305.
  • the example data collector 305 collects map data and/or images 310 from the example third-party images 112 (FIG. 1) and zoning data 311 and traffic data 312 from the example government records 110 (FIG. 1).
  • the map data 310, the zoning data 311 and the traffic data 312 may be collected electronically, manually from paper records, and/or any combination thereof. If any of the map data 310, the zoning data 311 and/or the traffic data 312 is entered manually, the data collector 305 can implement any type of user interface suitable for entering such information.
  • the data collector 305 can collect any or all of the data 310-312 from the site data merger 120 and/or the example site database 105.
  • the example survey planner 130 includes a mapper 315 and a display 320.
  • the example mapper 315 formats and/or creates one or more user interfaces 317 to graphically depict a map and/or image of a geographic area.
  • An example user interface 317 created by the mapper 315 is discussed below in connection with FIG. 4.
  • the example display 320 is configured to display the user interfaces 317 created by the example mapper 315.
  • the example display 320 may be any type of hardware, software and/or any combination thereof that can display a user interface 317 for viewing by a user.
  • the display 320 may include a device driver, a video chipset, and/or a video and/or computer display terminal.
  • the example survey planner 130 of FIG. 3 includes an overlayer 325.
  • the example overlayer 325 overlays the zoning data 311 and/or traffic data 312 on top of the user interface 317 by providing instructions to the example mapper 315 and/or the display 320.
  • the instructions cause the mapper 315 to modify one or more of the user interfaces 317 and/or cause the display 320 to directly overlay the data 311 and 312.
  • the overlayer 325 may use an application programming interface (API) that directs the display 320 to add lines and/or text to a user interface created by the mapper 315.
  • the example data collector 305, the example mapper 315, the example user interface(s) 317, the example display 320 and the example overlayer 325 may be implemented to use the Google® Earth mapping service tool.
  • other mapping tools such as, for example, Microsoft® Virtual Map or Pictometry® Electronic Field Study software could be used instead.
  • the Google® Earth mapping service tool is used to implement an application that may be executed by a general-purpose computing platform (e.g., the example computing platform 1700 of FIG. 17).
  • portions of the example data collector 305, the example mapper 315, the example user interfaces 317 and the example overlayer 325 are implemented using the Google® Earth mapping service application.
  • the Google® Earth mapping service application collects and displays map data 310 from third-party images 112 (e.g., satellite and/or aerial images of a geographic area) stored within a server that implements and/or provides the Google® Earth mapping service interface 317.
  • the Google® Earth mapping service tool generates user interfaces 317 that may be displayed on a computer terminal associated with the computing platform.
  • Another application and/or utility (i.e., the overlayer 325) that may be executed by the computing platform (and/or a different computing platform) formats the zoning data 311 and the traffic data 312 into a data file suitable for use with the Google® Earth mapping service application (e.g., a file structure in accordance with the Keyhole Markup Language (KML) format).
  • Google® Earth mapping service KML files textually describe lines, information, graphics and/or icons to be displayed by overlaying them on third-party images 112.
  • the Google® Earth mapping service application reads and/or processes the KML file generated by the overlayer 325, and the user's personal computer and/or workstation displays the resulting overlaid images and/or user interfaces 317 generated by the Google® Earth mapping service application for viewing by a user.
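A KML overlay file of the kind the overlayer 325 might emit can be generated with plain string assembly. This minimal sketch draws one line per traffic segment using the standard KML Placemark/LineString/coordinates elements (KML coordinates are ordered longitude,latitude,altitude); the function name and input shape are assumptions:

```python
def traffic_kml(doc_name, segments):
    """Build a minimal KML document with one LineString per
    (name, coords) traffic segment; coords is a list of (lon, lat)."""
    placemarks = []
    for seg_name, coords in segments:
        # KML coordinate tuples are lon,lat,alt separated by spaces
        path = " ".join(f"{lon},{lat},0" for lon, lat in coords)
        placemarks.append(
            f"<Placemark><name>{seg_name}</name>"
            f"<LineString><coordinates>{path}</coordinates></LineString>"
            f"</Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2">'
        f"<Document><name>{doc_name}</name>{''.join(placemarks)}</Document></kml>"
    )
```

Saving the returned string to a `.kml` file yields a document that Google Earth (and other KML readers) can load and overlay on its imagery.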
  • the example survey planner 130 of FIG. 3 includes a partitioner 330.
  • the example partitioner 330 of FIG. 3 partitions the map into areas dense in media sites and areas sparse in media sites.
  • the example partitioner 330 partitions the map based upon overlaid zoning data 311 and overlaid traffic data 312.
  • the partitioner 330 identifies portions of the map that both have high traffic counts and are zoned for commercial and/or retail use as media site dense areas.
  • Such media site dense areas are typically easiest to survey via, for example, foot and/or bicycle.
  • Other areas of the map are typically sparse in media sites and, thus, amenable to survey via automobile.
  • the partitioning of the overlaid map may be performed via hardware, software, manually and/or via any combination thereof.
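The partitioning rule described above can be sketched as a simple classification of map cells. The zoning labels and the traffic-count threshold below are illustrative assumptions, not values from the disclosure.

```python
# Illustrative sketch of the partitioner 330's rule: a cell zoned
# commercial/retail with a high traffic count is media-site "dense";
# everything else is "sparse". Threshold and labels are assumptions.
COMMERCIAL_ZONES = {"commercial", "retail"}

def classify_cell(zoning, traffic_count, threshold=20000):
    """Classify a map cell as 'dense' (survey on foot/bicycle) or
    'sparse' (survey by automobile)."""
    if zoning in COMMERCIAL_ZONES and traffic_count >= threshold:
        return "dense"
    return "sparse"
```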
  • the example survey planner 130 includes an assignor 335.
  • the example assignor 335 sub-divides the map partitions determined by the example partitioner 330 into sub-partitions based upon the type of the map partition (e.g., dense or sparse) and based upon the size of a geographic area that can be surveyed by a surveyor within a prescribed time period (e.g., miles of roadway per day). For example, a surveyor on foot may be able to survey two miles of densely located media sites in a day, while a surveyor in a car may be able to survey 20 miles of dispersedly located media sites in a day.
  • the example assignor 335 then assigns the sub-partitions to particular surveyors so that an entire geographic area is surveyed, for example, in as time efficient a manner as possible (e.g., in as few days as possible given a particular number and/or type(s) of surveyors) and/or in as cost efficient a manner as possible.
  • the creation of sub-partitions and/or the assignment of sub-partitions to surveyors may be performed via hardware, software, manually and/or as any combinations thereof.
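Using the example survey rates above (two miles per day on foot in dense areas, twenty miles per day by car in sparse areas), the sub-division and assignment arithmetic can be sketched as follows; all function names are illustrative assumptions.

```python
import math

# Hedged sketch of the assignor 335's arithmetic, using the example survey
# rates from the text; the names and the round-robin policy are assumptions.
RATE_MILES_PER_DAY = {"dense": 2.0, "sparse": 20.0}

def subdivide(partition_type, roadway_miles):
    """Number of one-day sub-partitions needed to cover a partition."""
    return math.ceil(roadway_miles / RATE_MILES_PER_DAY[partition_type])

def assign(subpartitions, surveyors):
    """Distribute sub-partitions across surveyors round-robin to level
    out the workload."""
    plan = {s: [] for s in surveyors}
    for i, sub in enumerate(subpartitions):
        plan[surveyors[i % len(surveyors)]].append(sub)
    return plan
```

For example, a five-mile dense partition yields three one-day sub-partitions, as does a 45-mile sparse partition.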
  • the survey planner 130 includes a graphical user interface (GUI) 340.
  • GUI 340 may be part of an operating system (e.g., Microsoft® Windows XP®) used to implement the survey planner 130.
  • the GUI 340 allows a user of the survey planner 130 to, for example, select a geographic area to be mapped and/or to select zoning data 311 and/or traffic data 312 to be overlaid on the geographic area map.
  • where the Google® Earth mapping service tool is used to implement a portion of the example survey planner 130, the GUI 340 provides an interface between the user and the Google® Earth mapping service application.
  • the Google® Earth mapping service tool may use an API provided by the example GUI 340 to display information and/or to receive user inputs and/or selections (e.g., to allow a user to select a KML file to load).
  • the example survey planner 130 of FIG. 1 may be combined, divided, re-arranged, omitted, eliminated and/or implemented in any of a variety of ways.
  • the example data collector 305, the example mapper 315, the example user interface(s) 317, the example display 320, the example overlayer 325, the example partitioner 330, the example assignor 335, the example GUI 340 and/or, more generally, the example survey planner 130 may be implemented using hardware, software, firmware and/or any combination of hardware, software and/or firmware.
  • the example survey planner 130 may include additional elements, processes and/or devices than those illustrated in FIG. 3 and/or may include more than one of any or all of the illustrated elements, processes and/or devices.
  • FIG. 4 illustrates an example user interface 400 that may be presented by the example survey planner 130 of FIGS. 1 and 3.
  • the user interface 400 is one of the user interfaces 317 of the survey planner 130 depicted in FIG. 3.
  • the user interface 400 may be created using any mapping tool, such as a geographic information system (GIS) tool (e.g., a MapInfo® GIS tool) or the Google® Earth mapping service.
  • the example map 405 is color-coded based upon how an area is zoned. For example, an area 415 occurring along West Sunset Boulevard is zoned for commercial use while an area 420 south of Melrose Avenue is zoned for residential use. To depict traffic data, the example map 405 is overlaid with traffic count data. For example, a traffic count 425 for West Sunset Boulevard is 25,000 per the 2003 Annual Average Weekday Traffic (AAWT) Traffic Count for Los Angeles County.
  • Example dense media site areas of FIG. 4 occur along West Sunset Boulevard, Santa Monica Boulevard and Melrose Avenue.
  • An example sparse media site area 420 is located south of Melrose Avenue.
  • FIG. 5A is a block diagram of the example mobile assisted survey tool (MAST) 111 of FIG. 1.
  • the MAST 111 includes a user-interface apparatus 505, which may be implemented using, for example, a touch-screen tablet computer, a hand-held computer, a personal digital assistant (PDA) and/or a laptop computer.
  • the example user-interface apparatus 505 provides a user interface (such as a GUI) that allows a user of the user-interface apparatus 505 to control the operation of the MAST 111 to collect and/or enter media site data.
  • the example user-interface apparatus 505 displays real-time video on a user interface (e.g., in a window of an application executing upon the user-interface apparatus 505) that enables a user to touch a point (e.g., a location) on the screen of the user-interface apparatus 505 to identify a media site.
  • the example user-interface apparatus 505 interacts with other elements of the MAST 111 to capture media site data as described below.
  • the user-interface apparatus 505 also provides one or more additional and/or alternative user interfaces that allow a user of the user-interface apparatus 505 to enter textual information concerning the media site.
  • Example textual information includes media site owner, primary road, secondary road, crossroads, illumination, etc.
  • the example MAST 111 includes a video camera 510 (e.g., a video image capturing device).
  • the example video camera 510 is any type and/or model of digital video camera capable of capturing, storing and/or providing real-time video to the example user-interface apparatus 505.
  • the Live! Ultra webcam manufactured by Creative Labs® is used to implement the example video camera 510 and is coupled to the example user-interface apparatus 505 via a Universal Serial Bus (USB) interface to enable live video feed to be communicated to and displayed by the user-interface apparatus 505.
  • the example MAST 111 of FIG. 5A includes a camera 515 (e.g., a still image capturing device).
  • the example camera 515 may be implemented using any type and/or model of digital still picture camera capable of capturing, storing and/or providing a digital photograph to the example user-interface apparatus 505 and being controlled by the user-interface apparatus 505.
  • the digital camera 515 is capable of capturing relatively higher resolution images and/or relatively higher quality images (e.g., higher color depth, sharper images, better focused images, etc.) than the video camera 510.
  • the higher-resolution images of the media sites facilitate subsequently performing detailed analyses of text and image details of the media sites.
  • the S3iS digital camera manufactured by Canon® of Shimomaruko 3-chome, Ohta-ku, Tokyo, Japan is used to implement the example digital camera 515.
  • the example digital camera 515 is coupled to the example user-interface apparatus 505 using a USB interface.
  • other peripheral interfaces such as, for example, a Bluetooth® interface, an IEEE 1394 interface, etc. may be used instead to couple the camera 515 to the user-interface apparatus 505.
  • the digital camera 515 is controlled by the example user-interface apparatus 505 to, for example, control the zoom of the digital camera 515 and/or the shutter trigger of the digital camera 515 to capture a photograph.
  • the example MAST 111 is described herein as having separate video and still picture cameras (e.g., the video camera 510 and the digital camera 515), in other example implementations, the MAST 111 may be implemented using a single camera capable of capturing video and digital still pictures. In this manner, the camera can transfer live video to the user-interface apparatus 505 and, when a user selects an advertisement object of interest in the video feed to be captured, the computer can control the camera to capture a still image (e.g., a high-resolution still image) of the specified object.
  • the example MAST 111 of FIG. 5A includes a rangefinder 520.
  • the example rangefinder 520 can be implemented using any type and/or model of digital rangefinder.
  • the rangefinder 520 is implemented using the TruPulse® 200B manufactured by Laser Technologies of 7070 S. Arlington Way, Englewood, Colorado, USA, 80112.
  • the rangefinder 520 is coupled to the user-interface apparatus 505 using a Bluetooth® interface.
  • other peripheral interfaces such as, for example, an RS-232 serial communication interface, an IEEE 1394 interface, a USB interface, etc. may be used instead.
  • the rangefinder 520 is controlled by the example user-interface apparatus 505 to measure and report the distance between the rangefinder 520 and a media site.
  • the digital camera 515 is triggered to take a picture of the media site at substantially the same time that the digital rangefinder 520 is triggered to measure the distance to the media site.
  • the example MAST 111 includes a pan-tilt mechanism 525.
  • the example pan-tilt mechanism 525 is controllable in two directions (side-to-side and up-and-down) to orient the camera 515 and the rangefinder 520 relative to a media site.
  • the pan-tilt mechanism 525 can be controlled so that the selected media site is in substantially the center of a viewfinder of the digital camera 515 and/or a picture captured by the digital camera 515.
  • the pan-tilt mechanism 525 may be controlled manually by a user of the MAST 111 and/or automatically by the user-interface apparatus 505 based upon a user-selected point in the real-time video provided to the user-interface apparatus 505 by the example video camera 510.
  • the user-interface apparatus 505 may determine that a selected media site is currently displayed in the upper right corner of the real-time video and, thus, direct the pan-tilt mechanism 525 to rotate to the right and tilt upwards until the media site is in the middle of the real-time video frames.
  • the example pan-tilt mechanism 525 may be coupled to the example user- interface apparatus 505 using any type of interface, such as an RS-232 serial communication interface, a USB interface and/or a Bluetooth Interface.
  • the interface may be used to control the pan-tilt mechanism 525 (if electronically controllable) and/or to receive angle and/or tilt information from the pan-tilt mechanism 525.
  • Such angle and/or tilt information is relative to the current orientation of the MAST 111 (e.g., the facing direction of an automobile to which the MAST 111 is mounted).
  • a pan-tilt mechanism that can be used to implement the example pan-tilt mechanism 525 is implemented using the SPG400 Standard Servo Power Gearbox, the SPT400 Standard Servo Power Gearbox Tilt System, the 31425S HS-425BB Servo and the 35645S HS-5645MG Servo - all manufactured by Servo City of 620 Industrial Park, Winfield, KS, USA, 67156.
  • the example MAST 111 includes a digital compass 530.
  • the example compass 530 may be implemented using any type and/or model of digital compass.
  • the example compass 530 may be coupled to the example user-interface apparatus 505 using any type of interface including, for example, a USB interface and/or a Bluetooth® Interface.
  • the USB interface may be used to read the current orientation of the MAST 111 in, for example, degrees.
  • as described below in connection with FIGS. 6A and 6B, the MAST 111 may be provided with a rotary encoder 635 to determine an angle of rotation (or pan) of the cameras 510 and 515 relative to a reference point on a vehicle.
  • the user-interface apparatus 505 may determine the directions in which the fields of view of the cameras 510 and 515 are positioned based on a direction of travel of an automobile as indicated by the compass 530 and the angle of rotation indicated by the rotary encoder 635.
  • the digital compass 530 may be coupled to a rotating (e.g., a panning) platform on which the cameras 510 and 515 are mounted so that as the cameras 510 and 515 are rotated, the compass 530 is also rotated to directly detect the direction in which the fields of view of the cameras 510 and 515 are positioned.
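When the compass is fixed to the vehicle, the facing-direction computation described above reduces to adding the encoder's pan angle to the vehicle heading, modulo 360 degrees. A minimal sketch (the function and parameter names are assumptions):

```python
# Minimal sketch: the digital compass 530 reports the vehicle's heading and
# the rotary encoder 635 reports the cameras' pan angle relative to the
# vehicle's front-to-back centerline.
def camera_heading(vehicle_heading_deg, encoder_angle_deg):
    """Absolute compass heading of the cameras' field of view, in [0, 360)."""
    return (vehicle_heading_deg + encoder_angle_deg) % 360.0
```

For example, a vehicle heading of 350 degrees with the cameras panned 20 degrees to the right yields an absolute camera heading of 10 degrees.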
  • the example MAST 111 includes a GPS receiver 535.
  • the example GPS receiver 535 is implemented using an Earthmate® LT-20 GPS receiver communicatively coupled to the user-interface apparatus 505 using a USB interface.
  • the USB interface may be used to obtain the last position fix from the GPS receiver 535 (e.g., longitude and latitude) and/or to direct the GPS receiver 535 to perform a position fix.
  • the GPS receiver 535 may also estimate the amount of error in a position fix and provide that estimate to the user-interface apparatus 505.
  • the GPS receiver 535 may be implemented using any other type and/or model of GPS receiver capable of receiving GPS signals from one or more GPS satellites, and determining and/or estimating the current location of the MAST 111.
  • the example GPS receiver 535 may be coupled to the example user-interface apparatus 505 using any other type of interface including, for example, a Bluetooth® interface.
  • the data interfaces of the MAST 111 are collectively represented by the data interfaces block designated by reference numeral 540.
  • the MAST 111 is provided with a USB hub to communicatively couple any USB interfaces of the components described above to the user-interface apparatus 505.
  • Such a USB hub, represented by the data interfaces 540, is separate from the other components and may be used if the user-interface apparatus 505 has fewer USB interfaces than the number required to communicate with the above-described components that use USB interfaces.
  • some of the data interfaces 540 are integrated in the components and the components are directly communicatively coupled to the user-interface apparatus 505.
  • the data interfaces 540 may include, for example, USB interfaces, RS-232 serial communication interfaces, Bluetooth® Interfaces, IEEE 1394 interfaces.
  • the data interfaces 540 enable the computer to control and exchange data with the above-described components.
  • the data interfaces 540 enable the example MAST 111 to download media site data to, for example, the example site data merger 120 of FIG. 1 using the example data structure 200 of FIG. 2.
  • FIG. 5B is a block diagram of the example user-interface apparatus 505 of the example mobile assisted survey tool 111 of FIG. 5A.
  • to display user interface screens, maps or images of geographic areas, images of scenes having media sites, images of media sites and/or any information related thereto, the example user-interface apparatus 505 is provided with a display interface 555.
  • the display interface 555 is implemented using a Microsoft® Windows operating system display interface configured to display graphical user interfaces.
  • the user-interface apparatus 505 is provided with a user-input interface 560.
  • the user-input interface 560 is implemented using an interface to a touch panel mounted onto a display of the example user-interface apparatus 505.
  • the user-input interface 560 may be implemented using any other type of user-input interface including a mouse or other pointer device, a keyboard interface, etc.
  • the user-interface apparatus 505 is provided with an image object recognizer 565.
  • the image object recognizer 565 is configured to perform object recognition processes to recognize media sites (e.g., billboards, posters, murals, or any other advertisement media) in images captured by the video camera 510 and/or the digital camera 515.
  • the image object recognizer 565 can use the screen location selected by the user on the displayed image and use an object recognition process to detect the boundaries of an advertisement located in the scene at the user-selected screen location. In this manner, subsequent processes can be performed to aim and zoom the digital camera 515 towards the advertisement media site in the scene.
  • the user-interface apparatus 505 is provided with a data interface 570.
  • the data interface 570 is configured to retrieve and store data in data records (e.g., the data structure 200 of FIG. 2) for different surveyed media sites.
  • the data interface 570 can receive data from the digital camera 515, the digital rangefinder 520, the GPS receiver 535, the video camera 510, the digital compass 530, and/or the data interface 540 described above in connection with FIG. 5A and store the data in the local memory 575.
  • the data interface 570 is configured to store and retrieve images in the memory 575 captured by the camera(s) 510 and/or 515 for display via the display interface 555. Also, the data interface 570 is configured to retrieve aerial maps or photographs or satellite photographs of geographic areas for display to a user as shown below in connection with the user interface 800 of FIGS. 8A-8C and/or the user interface 1000 of FIG. 10.
  • the data interface 570 is configured to store the zoom levels of the digital camera 515 used to capture images of media sites, to store distances between user-specified media sites and survey locations from which the media sites were surveyed, to store captured images of media sites, to store pan and tilt angles used to position the rangefinder 520 and the digital camera 515 to capture the images of the media sites, to store location information representative of the locations of the MAST 111 when the media sites were surveyed and to store timestamp(s) indicative of time(s) at which the digital camera 515 captured the image(s) of the media sites.
  • the information stored in the memory 575 can subsequently be used to determine geographic location coordinates of the media sites and/or can be communicated to the site database 105 for storage and subsequent processing.
  • the user-interface apparatus 505 is provided with a camera positioner interface 580.
  • the camera positioner interface 580 is configured to determine an amount of tilt rotation and pan rotation (e.g., rotational angle values) by which to adjust the position of the digital camera 515 and the rangefinder 520 to position the field of view of the digital camera 515 on a targeted media site. For example, after the image object recognizer 565 recognizes the boundaries of a media site to be surveyed, the camera positioner interface 580 can determine pan and tilt adjustment values with which to adjust the pan-tilt mechanism 525 (FIG. 5A) to position the fields of view of the digital camera 515 and the rangefinder 520 to be on the identified media site.
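One plausible way to turn a recognized bounding box into pan and tilt adjustment values is to scale the target's offset from the image center by the camera's field of view. The field-of-view values and all names below are illustrative assumptions, not the disclosure's method.

```python
# Hedged sketch: once the image object recognizer 565 returns a bounding box
# for the targeted media site, convert its pixel offset from the image center
# into pan/tilt adjustment angles for the pan-tilt mechanism 525.
def pan_tilt_adjustment(box, image_w, image_h, hfov_deg=60.0, vfov_deg=45.0):
    """box = (left, top, right, bottom) in pixels.
    Returns (pan, tilt) in degrees; positive pan = right, positive tilt = up."""
    box_cx = (box[0] + box[2]) / 2.0
    box_cy = (box[1] + box[3]) / 2.0
    # Fraction of the frame the target sits off-center, scaled by the FOV.
    pan = (box_cx - image_w / 2.0) / image_w * hfov_deg
    tilt = (image_h / 2.0 - box_cy) / image_h * vfov_deg  # screen y grows downward
    return pan, tilt
```

A target near the upper-right corner of a 640x480 frame thus produces a positive pan and a positive tilt, rotating the cameras right and upward until the target is centered.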
  • the user-interface apparatus 505 is provided with a camera controller 585.
  • the camera controller 585 is configured to control the zoom levels and the shutter trigger of the digital camera 515 to capture images of media sites.
  • the camera controller 585 is configured to determine the zoom level based on the distance between the digital camera 515 and the targeted media site as measured by the digital rangefinder 520.
  • the camera controller 585 is configured to determine zoom levels that capture a media site in its entirety (e.g., advertisement content and fixtures to which the advertisement content is affixed or surrounding the advertisement content) or to capture at least a portion of the media site.
  • the camera controller 585 is also configured to control image or video capture operations including zoom operations of the video camera 510.
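A zoom level that captures a media site in its entirety can be approximated from the rangefinder distance with a pinhole-camera model. The sensor and lens figures below, and the 14.6 m bulletin width used in the example, are assumptions for illustration, not values from the disclosure.

```python
# Illustrative only: the camera controller 585 is said to derive the zoom
# level from the measured distance; the pinhole arithmetic and the sensor
# and lens figures here are assumptions, not the disclosure's method.
def zoom_for_target(distance_m, target_width_m,
                    sensor_width_mm=5.8, min_focal_mm=6.0):
    """Zoom factor (>= 1.0) that makes a target of the given width roughly
    fill the frame at the measured distance."""
    # Pinhole model: required focal length f = sensor_width * distance / width.
    focal_mm = sensor_width_mm * distance_m / target_width_m
    return max(1.0, focal_mm / min_focal_mm)
```

For example, a standard 48-foot (about 14.6 m) bulletin surveyed from 60 m would call for roughly 4x zoom under these assumed figures.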
  • the example user-interface apparatus 505 is provided with a location information generator 590.
  • the location information generator 590 is configured to use data stored in the memory 575 to determine the location(s) of media site(s) as described in detail below in connection with FIG. 16.
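One way the location information generator 590 could combine the stored data is to project the rangefinder distance along the cameras' absolute heading from the GPS fix. The flat-earth sketch below is an assumption that is adequate at rangefinder distances; it is not necessarily the method of FIG. 16, and all names are illustrative.

```python
import math

# Hedged sketch: project the measured distance along the absolute heading
# from the MAST's GPS fix to estimate the media site's coordinates.
EARTH_RADIUS_M = 6371000.0

def media_site_location(lat_deg, lon_deg, heading_deg, distance_m):
    """Return (latitude, longitude) of a point distance_m away along
    heading_deg (0 = north, 90 = east) from the given fix."""
    lat = math.radians(lat_deg)
    hdg = math.radians(heading_deg)
    dlat = distance_m * math.cos(hdg) / EARTH_RADIUS_M
    # Longitude degrees shrink with latitude, hence the cos(lat) factor.
    dlon = distance_m * math.sin(hdg) / (EARTH_RADIUS_M * math.cos(lat))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)
```

For example, a media site 1000 m due north of the fix shifts the latitude by roughly 0.009 degrees while leaving the longitude unchanged.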
  • the example MAST 111 and the example user-interface apparatus 505 may be implemented using any number and/or type(s) of other and/or additional elements, devices, components, interfaces, circuits and/or processors. Further, the elements, devices, components, interfaces, circuits and/or processors illustrated in FIGS. 5A and 5B may be combined, divided, re-arranged, eliminated and/or implemented in any number of different ways. Additionally, the example MAST 111 and/or the example user-interface apparatus 505 may be implemented using any combination of firmware, software, logic and/or hardware.
  • the MAST 111 and/or the example user-interface apparatus 505 may be implemented to include additional elements, devices, components, interfaces, circuits and/or processors than those illustrated in FIGS. 5A and 5B and/or may include more than one of any or all of the illustrated elements, devices, components, interfaces, circuits and/or processors.
  • FIGS. 6A, 6B, 6C, and 6D illustrate example structural configurations that may be used to implement the example MAST 111 of FIGS. 1 and 5A. While example configurations of implementing the example MAST 111 are illustrated in FIGS. 6A-6D, other configurations of implementing the MAST 111 may alternatively be used. Because many elements, devices, components, interfaces, circuits and/or processors of the example MAST 111 of FIGS. 6A-6D are identical to those discussed above in connection with FIG. 5A, the descriptions of those elements, devices, components, interfaces, circuits and/or processors are not repeated here. Instead, identical elements, devices, components, interfaces, circuits and/or processors are illustrated with identical reference numerals in FIGS. 5A and 6A-6D, and the interested reader is referred back to the descriptions presented above in connection with FIG. 5A for a complete description of those like numbered elements, devices, components, interfaces, circuits and/or processors.
  • the example MAST 111 is mounted through a sun roof 605 of an automobile roof 610.
  • the MAST 111 is mechanically affixed to one or more members that position and/or secure the MAST 111 within the sun roof area 605.
  • the pan-tilt mechanism 525 is implemented using a manual adjustment configuration.
  • the pan-tilt mechanism 525 is implemented using a PVC pipe 620 that passes through a thrust bearing 625 and is manually controllable using an up/down and rotate handle 630.
  • the configuration of FIGS. 6A and 6B enables a person to control the position and field of view of the cameras 510 and 515 and the rangefinder 520 by enabling the person to a) move the handle 630 upwards/downwards to tilt the video camera 510, the digital camera 515 and the rangefinder 520 relative to a geographic horizon and b) rotate the handle 630 to rotate the video camera 510, the digital camera 515 and the rangefinder 520 relative to the front of the automobile.
  • the MAST 111 is provided with a rotary encoder 635 to determine the position of the video camera 510, the digital camera 515 and the rangefinder 520 relative to the front-to-back centerline of the automobile.
  • the example rotary encoder 635 provides a digital value and/or an electrical signal representative of the rotational angle of the video camera 510, the digital camera 515 and the rangefinder 520 relative to the front-to-back centerline of the automobile to the user-interface apparatus 505.
  • the user-interface apparatus 505 can determine the direction in which fields of view of the cameras 510 and 515 are pointing based on direction information from the digital compass 530 and the angle of rotation indicated by the rotary encoder 635.
  • the example MAST 111 is implemented using an electronically controllable pan-tilt mechanism 525 and is surrounded by an example housing 650 having a clear weatherproof dome 655 to protect the MAST 111 from environmental elements (e.g., rain, snow, wind, etc.).
  • the example housing 650 can be mounted and/or affixed to the top of an automobile using, for example, straps, a luggage rack, a ski rack, a bike rack, suction cups, etc.
  • the example MAST 111 of FIG. 6C includes a stylus 660. The user selects a media site by pressing the tip 665 of the stylus 660 to a touch-panel-enabled screen 670 of the user-interface apparatus 505 at a point corresponding to a media site.
  • the pan-tilt mechanism 525 is electronically controllable via the user- interface apparatus 505.
  • the example user-interface apparatus 505 communicates with the video camera 510 (FIGS. 6A and 6B; provided but not shown in the example configuration of FIG. 6C), the digital camera 515, the rangefinder 520, the pan-tilt mechanism 525, the digital compass 530 and the GPS receiver 535 via respective communication interfaces as described above in connection with FIG. 5A.
  • the components of the MAST 111 in the housing 650 can be communicatively coupled to the user-interface apparatus 505 via a wireless communication interface such as, for example, a Bluetooth® interface to eliminate the need to extend communication cables from the user-interface apparatus 505 to the MAST components.
  • the MAST 111 can be provided with a manual pan-tilt adjustment mechanism as shown in FIGS. 6A and 6B.
  • the MAST 111 can also be provided with the electronic pan-tilt mechanism 525 to enable the MAST 111 to automatically perform fine position adjustments when, for example, centering on and zooming into a media site of interest.
  • the example MAST 111 is implemented using a base 680 and a tiltable housing 682 to provide a vertical tilting motion.
  • the base or housing 680 includes a lower fixed-position base or housing portion 684 and an upper rotatable base or housing portion 686 to provide a panning motion.
  • the video camera 510 (FIGS. 5A, 6A, and 6B) is mounted in the lower fixed-position base portion 684 and captures video images through a window area 688.
  • the digital camera 515 and the rangefinder 520 (FIGS. 5A, 6A, and 6B) are mounted in the tiltable housing 682 of the upper rotatable base portion 686 and have a field of view or line of sight through a window area 690.
  • a tilting device of the pan-tilt mechanism 525 is mounted in the upper rotatable base portion 686 at a location indicated by reference numeral 692 to vertically tilt the tiltable housing 682.
  • the base 680, including the tiltable housing 682 and the lower and upper base portions 684 and 686, is implemented using weatherproof construction.
  • the digital compass 530 and the GPS receiver 535 can also be mounted on the MAST 111 of FIG. 6D.
  • the digital compass 530 and the GPS receiver 535 can be mounted on a fixed (e.g., non pannable and non tiltable) portion such as, for example, a mounting plate 694 that remains in a fixed position relative to a vehicle on which the MAST 111 is mounted.
  • the lower base portion 684 is described above as a fixed-position base portion, in other example implementations, the lower base portion 684 may be implemented as a rotatable base portion so that the lower and upper base portions 684 and 686 can rotate together to enable panning motions for the digital camera 515 and the video camera 510.
  • although the example MAST 111 of FIGS. 6A, 6B, 6C, and/or 6D has a vehicular-based form factor suitable for mounting on a motorized vehicle, the example MAST 111 may alternatively be implemented as a pedestrian-based MAST using a wearable and/or carry-able form factor.
  • the rangefinder 520 may be a hand-held rangefinder 520 having a viewfinder that allows a user to point the rangefinder 520 at or about the center of a media site.
  • the rangefinder 520 is capable of operating in a mode that enables measuring angles to the top and bottom edges of the media site to allow the height of the media site to be computed.
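Given the rangefinder's angle-measuring mode described above, the media site's height follows from the horizontal distance and the two inclination angles. A hedged sketch (the function and parameter names are assumptions):

```python
import math

# Hedged sketch: recover the media site's height from the horizontal
# distance and the inclination angles to its top and bottom edges.
def sign_height(horizontal_distance_m, top_angle_deg, bottom_angle_deg):
    """Height = d * (tan(angle to top) - tan(angle to bottom)).
    Angles are measured from the horizontal; below-horizon angles are negative."""
    return horizontal_distance_m * (
        math.tan(math.radians(top_angle_deg))
        - math.tan(math.radians(bottom_angle_deg)))
```

For example, at 30 m with angles of 20 degrees to the top edge and -5 degrees to the bottom edge, the media site is about 13.5 m tall.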
  • the user-interface apparatus 505 may be implemented using a handheld portable computing device (e.g., a personal digital assistant (PDA), a Windows Mobile® device, a PocketPC device, a Palm device, etc.) that may be carried using a carrying case that may be clipped to a belt.
  • the video camera 510 may be omitted from the MAST 111, and surveyors (e.g., members of the field force 113) can rely on their own sight to determine the direction in which to direct the field of view of the digital camera 515 to capture an image of a targeted media site.
  • the user-interface apparatus 505 is configured to display a user interface that prompts the user of the MAST 111 to perform different measurements and/or capture pictures of a media site. For example, when a user identifies a new media site, the user can press a start button. The user-interface apparatus 505 then prompts the user to specify a plurality of operations to characterize the media site including, for example, 1) measuring a distance to the media site, 2) measuring a height of the media site (e.g., measure angles to the top and bottom of the media site), 3) entering textual information (e.g., owner name, etc.), and 4) capturing one or more pictures of the media site.
  • the GPS receiver 535 and the digital compass 530 are mounted to the rangefinder 520 so that as the rangefinder 520 is moved, the GPS receiver 535 and the digital compass 530 can be used to track the direction and location of the rangefinder 520. For example, as the rangefinder 520 is rotated, the digital compass 530 can correctly measure the direction in which the rangefinder 520 is pointing.
  • vehicular-based or pedestrian-based MAST's can be configured to be controlled using a head-mounted controller.
  • headgear to be worn by a member of the field force 113 may be provided with an inertial sensor, a transparent partial (one-eye) visor, a digital camera and a rangefinder.
  • the user adjusts his head position to look at a media site through the one-eye visor to target the media site and to perform the distance measurement using the rangefinder 520.
  • the angles used to compute the height of the media site can be derived from the orientation of the user's head.
  • the transparent partial (one-eye) visor positioned over a user's eye could be used to locate and target a media site.
  • the inertial sensor in the helmet can be used to generate motion and direction information based on a person's detected head movements to control the example pan-tilt mechanism 525 to position the cameras 510 and 515 and the rangefmder 520.
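The head-tracking control described above can be sketched as a simple mapping from sensed head angles to pan-tilt commands. The 1:1 mapping, the function name and the range limits below are illustrative assumptions; the disclosure only states that detected head movements drive the pan-tilt mechanism 525.

```python
def head_to_pan_tilt(yaw_deg, pitch_deg, pan_range=(-180.0, 180.0),
                     tilt_range=(-30.0, 60.0)):
    """Map detected head yaw/pitch onto pan/tilt commands for a pan-tilt
    mechanism (cf. mechanism 525), clamped to an assumed mechanical range.
    The ranges and the direct 1:1 mapping are illustrative only."""
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    pan = clamp(yaw_deg, *pan_range)     # head yaw drives pan rotation
    tilt = clamp(pitch_deg, *tilt_range) # head pitch drives tilt angle
    return pan, tilt
```

A controller of this shape would be polled each time the inertial sensor reports a new head orientation.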
  • FIG. 7 is a block diagram of the example site data merger 120 of FIG. 1.
  • the example site data merger 120 includes a data collector 705.
  • the example data collector 705 collects map data 710 from the example third-party images 112 (FIG. 1) and from media site data 711 and media site images 712 collected during one or more media site surveys and/or gathered from the government records 110 (FIG. 1).
  • the example site data 711 and/or the site images 712 may be collected electronically (e.g., collected using the example MAST 111 described herein), may be manually provided from paper records, and/or any combination thereof.
  • the data collector 705 can be implemented in connection with a user interface to enable a user to enter the site data 711 and/or the site images 712 manually. Additionally or alternatively, if any of the site data 711 and/or the site images 712 were previously entered and/or downloaded, the data collector 705 can collect any or all of the data 710-712 from the example site database 105.
  • the example site data merger 120 includes a mapper 715 and a display 720.
  • the example mapper 715 formats and/or creates one or more user interfaces 717 that graphically depict a geographic area and that are presented by the example display 720.
  • Example user interfaces 717 created by the mapper 715 are discussed below in connection with FIGS. 8A-8C and 10.
  • the example display 720 may be implemented using any type of hardware, software and/or any combination thereof that can display a user interface 717 for viewing by a user.
  • the display 720 may include a device driver, a video chipset, and/or a video and/or computer display terminal.
  • the example site data merger 120 includes an overlayer 725.
  • the example overlayer 725 overlays the site data 711 and/or the site images 712 on top of the user interface(s) 717 by providing instructions a) to the display 720 that cause the display 720 to show the overlaid data 711 and 712 and/or b) to the mapper 715.
  • the overlayer 725 may use an application programming interface (API) that directs the mapper 715 and/or the display 720 to add lines and/or text to user interface(s) 717 created by the mapper 715.
  • the example site data merger 120 includes a modifier 730.
  • the example modifier 730 presents one or more user interfaces 735 via the display 720 that allow a user of the site data merger 120 to verify, modify and/or update the site data 711.
  • Example user interfaces 735 for verifying, modifying and/or updating the site data 711 and/or the site database 105 are discussed below in connection with FIGS. 8A-8C and 10.
  • the mapper 715 and/or the overlayer 725 create a first user interface 717 that displays collected media site data 711 overlaid onto an aerial/satellite photograph (e.g., an aerial/satellite photograph from the map data 710) of a geographic area
  • the example modifier 730 presents one or more additional user interfaces 735 that allow a user to adjust the location of a media site based upon the satellite photograph and/or based upon the site images 712.
  • the modifier 730 updates the site database 105 (e.g., the example coordinate fields 228 and 232) based upon the collected (and possibly modified) media site data 711.
  • the Google® Earth mapping service tool is used to implement the example data collector 705, the example mapper 715, the example user interface(s) 717, the example display 720, the example overlayer 725, at least a portion of the example modifier 730 and the example user interface(s) 735 of FIG. 7.
  • other mapping tools such as, for example, Microsoft® Virtual Map could additionally or alternatively be used.
  • the Google® Earth mapping service tool may be implemented as an application that is executed by a general-purpose computing platform (e.g., the example computing platform 1700 of FIG. 17).
  • the Google® Earth mapping service application collects and displays the map data 710 from the third-party images 112 (e.g., satellite images) stored within a server that implements and/or provides the Google® Earth mapping service interface.
  • the Google® Earth mapping service tool is used to generate the user interfaces 717 that may be displayed on a computer terminal associated with the computing platform.
  • the Google® Earth mapping service tool also generates user interfaces 735 that allow a user to verify and/or modify displayed media site data.
  • Another application and/or utility (e.g., the example overlayer 725) that executes upon the computing platform (and/or a different computing platform) formats the site data 711 and the site images 712 into a data file suitable for use with the Google® Earth mapping service application (e.g., a file structure in accordance with the KML format).
  • the Google® Earth mapping service application reads and/or processes the KML file generated by the overlayer 725, and the user's personal computer and/or workstation displays the resulting overlaid images and/or user interfaces 717 and 735 generated by the Google® Earth mapping service application for viewing by a user.
  • the Google® Earth mapping service tool saves a second KML file that reflects any changes made to the site data 711 by the user using the user interface(s) 735.
  • the example modifier 730 of FIG. 7 parses the site data 711 from the second KML file and adds, stores and/or updates the media site data stored in the site database 105 (e.g., adds, updates and/or modifies the example coordinate fields 228 and 232 of FIG. 2).
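The parse-and-update step performed by the modifier 730 can be sketched as follows. The KML fragment, the helper names and the dictionary-backed "database" are illustrative assumptions; only the overall flow (parse the returned KML, then overwrite the coordinate fields, cf. fields 228 and 232) comes from the description above.

```python
import xml.etree.ElementTree as ET

# A minimal KML fragment of the kind the mapping tool might save after the
# user drags a site marker to a new location (illustrative values).
KML = """<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Board 1</name>
      <Point><coordinates>-87.636719,41.878113,0</coordinates></Point>
    </Placemark>
  </Document>
</kml>"""

NS = {"kml": "http://www.opengis.net/kml/2.2"}

def parse_site_coordinates(kml_text):
    """Return {site name: (longitude, latitude)} parsed from a KML string."""
    root = ET.fromstring(kml_text)
    sites = {}
    for placemark in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name = placemark.find("kml:name", NS).text
        coords = placemark.find(".//kml:coordinates", NS).text.strip()
        lon, lat = [float(v) for v in coords.split(",")[:2]]
        sites[name] = (lon, lat)
    return sites

def update_site_database(db, sites):
    """Overwrite the coordinate fields (cf. fields 228 and 232) per site."""
    for name, (lon, lat) in sites.items():
        record = db.setdefault(name, {})
        record["longitude"] = lon   # e.g., coordinate field 228
        record["latitude"] = lat    # e.g., coordinate field 232

db = {}
update_site_database(db, parse_site_coordinates(KML))
```

In a full implementation the dictionary would be replaced by the site database 105.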
  • the site data merger 120 includes a graphical user interface (GUI) 740 (e.g., a user input interface).
  • the example GUI 740 of FIG. 7 may be part of an operating system (e.g., Microsoft® Windows XP®) used to implement the site data merger 120.
  • the GUI 740 allows a user of the site data merger 120 to, for example, select a geographic area to be mapped and/or to select the site data 711 and/or the site images 712 to be overlaid onto a geographic map.
  • When the Google® Earth mapping service tool is used to implement a portion of the example site data merger 120, the GUI 740 is used to provide an interface between the user and the Google® Earth mapping service application.
  • the Google® Earth mapping service tool may use an API to display information and/or to receive user inputs and/or selections (e.g., to select and load a KML file) via the GUI 740.
  • the elements, processes and devices illustrated in FIG. 7 may be combined, divided, re-arranged, eliminated and/or implemented in any of a variety of ways.
  • the example data collector 705, the example mapper 715, the example user interface(s) 717 and 735, the example display 720, the example overlayer 725, the example modifier 730, the example GUI 740 and/or, more generally, the example site data merger 120 may be implemented using hardware, software, firmware and/or any combination of hardware, software and/or firmware.
  • the example site data merger 120 may include elements, processes and/or devices in addition to those illustrated in FIG. 7 and/or may include more than one of any or all of the illustrated elements, processes and devices.
  • FIGS. 8A, 8B and 8C depict example user interfaces that may be implemented in connection with the example site data merger 120 of FIG. 7 to show locations of surveyed media sites in connection with media site data and to enable users to verify and/or update the media site data.
  • Elements illustrated in FIG. 8A which are substantially similar or identical to elements in FIGS. 8B and 8C are described below in connection with FIG. 8A, but are not described in detail again in connection with FIGS. 8B and 8C. Therefore, the interested reader is referred to the description of FIG. 8A below for a complete description of those elements in FIGS. 8B and 8C which are the same as elements in FIG. 8A.
  • the example user interface 800 includes an image area 805.
  • the example image area 805 can display a satellite photograph and/or image of a geographic area of interest.
  • the example user interface 800 includes any number and/or type of user-selectable user interface controls 810.
  • By using the controls 810, the user can select a desired portion of a satellite, aerial and/or terrestrial image.
  • the example controls 810 include one or more elements that allow the user to, for example, zoom in, zoom out and rotate the image and to pan the image in left-right and/or up-down directions.
  • the example user interface 800 includes a menu 815 that allows a user to, among other things, open a file-open dialog box 820.
  • the example file-open dialog box 820 allows a user to select and load a media site data file, such as small.kmz.
  • the example user interface 800 includes a list display area 830.
  • the example list display area 830 includes a list of media sites including one entitled "Board 1 " and designated by reference numeral 835.
  • the example image area 805 displays information pertaining to one or more of the media sites.
  • the example image area 805 displays two media sites indicated by media site markers labeled "Board 1" and "Board 2."
  • Board 1 is shown with a media site marker and/or icon 840 that represents the surveyed location of Board 1, a bounding box 845 that represents (based upon the accuracy of the surveying tool) an error margin of the location coordinates determined or collected for the surveyed location 840 of Board 1, and a line 850 that represents a line of sight from the location where Board 1 was surveyed to Board 1.
  • the example list display area 830 includes check box controls, one of which is indicated by reference numeral 855.
  • the check box 855 is blank and, thus, Board 2 is not illustrated in the example image area 805 of FIG. 8B.
  • the check boxes for Boards 1, 3 and 4 are checked; therefore, Boards 1, 3 and 4 are displayed, although Boards 3 and 4 are occluded by a photos-and-details window 870.
  • the example list display area 830 includes tree expansion box controls, one of which is indicated in FIG. 8B by reference numeral 860.
  • By alternately clicking on the example tree expansion box 860, information pertaining to Board 1 can be viewed or hidden from view.
  • Example media information includes photos and detailed information that can be accessed by selecting a photos and details link control 865.
  • When the photos and details link 865 is clicked, the photos and details window 870 is displayed. Additionally or alternatively, clicking the site marker icon associated with the media site 840 in the image 805 will launch the window 870.
  • Example textual information 875 includes, for example, the name of the owner of the site, the direction the site is facing, the distance to the site, any other information described above in connection with FIG. 2, etc.
  • the user interface 800 illustrated in FIG. 8B (including the example photographs 880), facilitates visually determining that the surveyed location 840 of Board 1 is different from an actual location 885 of Board 1.
  • a properties dialog user interface 890 shown in FIG. 8C may be instantiated by a user. For example, referring to FIG. 8B, the user may select the site marker for a media site to open a selection window (not shown) from which the properties dialog user interface 890 is shown.
  • the example properties dialog box 890 of FIG. 8C displays the surveyed location of the media site.
  • the icon displayed at the surveyed media site location 840 also changes to include a target location icon 895 depicted as a box surrounding the site marker.
  • the user can "click and drag" the target location icon 895 from its original location (e.g., the surveyed location 840) to the actual location of the media site 885 as shown in FIG. 8C.
  • the location of the media site (e.g., Board 1) is saved with location information representative of the new location 885.
  • the site data merger 120 (e.g., the example modifier 730 of FIG. 7) updates the site database 105 so that the location of the media site saved in the site database 105 (e.g., the example coordinate fields 228 and 232 of FIG. 2) will be the coordinates of the new location 885 rather than the coordinates of the surveyed location 840.
  • the site data merger 120 also stores other information from the KML file into the site database 105 for the media site. For example, the owner name shown in the textual information 875 of FIG. 8B can be stored in the example owner name field 208 of FIG. 2. Likewise, other elements of the data record can be filled, updated and/or modified based upon the KML file.
  • FIGS. 9A and 9B illustrate an example data structure 900 that may be used to provide media site data to any or all of the example site data mergers 120 described herein.
  • the example data structure 900 is structured in accordance with a KML file. However, any other type of file format may be used (e.g., a file structure in accordance with the Microsoft® Virtual Earth tool).
  • the example data structure 900 represents media site data for a single media site. As described above, multiple data structures for respective media sites may be stored together in a single file, such as a KMZ file.
  • the example data structure 900 includes a filename field 905.
  • the example filename field 905 includes an alphanumeric string that represents the name of the file that contains the data structure 900.
  • the example data structure 900 includes a name field 910.
  • the example name field 910 includes an alphanumeric string that represents the name of the media site.
  • the example data structure 900 includes folder fields 915 and 920.
  • the example folder fields 915 and 920 delineate the start and end of the media site information for the media site, respectively.
  • the example data structure 900 includes entries 925.
  • the example entries 925 define, describe and provide the information to be displayed when, for example, the example photos and details link 865 of FIG. 8B is selected and/or the site marker icon 840 (FIG. 8B) for the media site is clicked.
  • the entries 925 define the file name 930 of an image to be displayed.
  • the example data structure 900 includes entries 935.
  • the example entries 935 include the start and end coordinates 940 of the line, as well as a width and color 945 for the line.
  • the example data structure 900 includes entries 950 (FIG. 9B).
  • the example entries 950 include the coordinates of a set of points 955 that collectively define the boundary of the potential media site location error, as well as a width and color 960 for the line.
  • the example data structure 900 includes coordinates 965.
  • the example coordinates 965 represent the surveyed location of the media site (e.g., the example location 840 of FIG. 8B). If the data structure 900 is the output of the site data merger 120, the example coordinates represent the verified location of the media site (e.g., the example location 885 of FIG. 8B).
  • the example data structure 900 includes entries 970.
  • the example entries 970 contain values that represent the point of view from the survey location to the media site.
  • the example entries 970 contain coordinates 975 of the survey location, a distance 980 to the media site, a viewing angle (relative to the horizon) 985 from the survey location to the media site and a heading 990 of the surveying equipment.
  • While the example data structure 900 is illustrated as having the above-described fields and information, the example methods, apparatus and systems described herein may be implemented using other data structures having any number and/or type(s) of other and/or additional fields and/or data. Further, one or more of the fields and/or data illustrated in FIGS. 9A and 9B may be omitted, combined, divided, re-arranged, eliminated and/or implemented in different ways. Moreover, the example data structure 900 may include fields and/or data additional to those illustrated in FIGS. 9A and 9B and/or may include more than one of any or all of the illustrated fields and/or data.
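The shape of data structure 900 can be sketched by assembling a KML-like fragment holding the pieces described above: a name (cf. field 910), a line of sight (cf. entries 935/940), an error boundary (cf. entries 950/955) and the surveyed location (cf. coordinates 965). The element layout, helper name and coordinate values below are a simplified illustration, not the exact file emitted by the patent's tools.

```python
def make_site_kml(name, surveyed, survey_point, error_box):
    """Assemble a KML-like Placemark group roughly shaped like data
    structure 900. surveyed / survey_point are (lon, lat) pairs; error_box
    is a list of (lon, lat) corners. Simplified sketch only."""
    fmt = lambda p: "%f,%f,0" % p                       # lon,lat,altitude
    ring = " ".join(fmt(p) for p in error_box + error_box[:1])  # close ring
    return (
        "<Folder><name>%s</name>"                       # cf. name field 910
        "<Placemark><LineString><coordinates>%s %s"     # line of sight
        "</coordinates></LineString></Placemark>"
        "<Placemark><LinearRing><coordinates>%s"        # error boundary
        "</coordinates></LinearRing></Placemark>"
        "<Placemark><Point><coordinates>%s"             # surveyed location
        "</coordinates></Point></Placemark></Folder>"
        % (name, fmt(survey_point), fmt(surveyed), ring, fmt(surveyed))
    )

kml = make_site_kml("Board 1", (-87.6367, 41.8781), (-87.6400, 41.8770),
                    [(-87.6369, 41.8779), (-87.6365, 41.8779),
                     (-87.6365, 41.8783), (-87.6369, 41.8783)])
```

A real file would wrap this in a `<kml>` document element and could be zipped with site photographs into a KMZ archive, as noted above.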
  • FIG. 10 illustrates another example user interface 1000 that may be used to verify the location of a media site.
  • the example user interface 1000 may be used to implement the example image area 805 of FIGS. 8A-8C.
  • a surveyed location indicator 1005 of a media site is overlaid on top of four images 1010, 1011, 1012 and 1013 rather than the single aerial/satellite image illustrated in FIGS. 8A-8C.
  • the example images 1010-1013 of FIG. 10 represent and/or illustrate the area surrounding the media site from different locations and/or points of view. By viewing the surroundings of the media site from different perspectives, the location of the media site may be more accurately determined and/or verified.
  • FIGS. 11 and 12 are flowcharts representative of machine readable instructions that may be executed to implement the example media site data collection system 100 of FIG. 1.
  • FIG. 13 is a flowchart representative of machine readable instructions that may be executed to implement the example survey planner 130 of FIGS. 1 and 2.
  • FIG. 14 is a flowchart representative of machine readable instructions that may be executed to implement the example site data merger 120 of FIGS. 1 and 7.
  • FIG. 15 is a flowchart representative of machine readable instructions that may be executed to implement the example mobile assisted survey tool 111 of FIGS. 1, 5A and 6A-6D.
  • the example processes of FIGS. 11-15 may be performed using a processor, a controller and/or any other suitable processing device. For example, the example processes of FIGS. 11-15 may be implemented in coded instructions stored on a tangible medium such as a flash memory, a read-only memory (ROM) and/or random-access memory (RAM) associated with a processor (e.g., the example processor 1705 discussed below in connection with FIG. 17).
  • some or all of the example processes of FIGS. 11-15 may be implemented using any combination(s) of application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), discrete logic, hardware, firmware, etc.
  • Alternatively, some or all of the example processes of FIGS. 11-15 may be implemented manually or as any combination(s) of any of the foregoing techniques, for example, any combination of firmware, software, discrete logic and/or hardware. Further, although the example processes of FIGS. 11-15 are described with reference to the flowcharts of FIGS. 11-15, other methods of implementing the processes of FIGS. 11-15 may be employed. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, sub-divided, or combined. Additionally, any or all of the example processes of FIGS. 11-15 may be performed sequentially and/or in parallel by, for example, separate processing threads, processors, devices, discrete logic, circuits, etc.
  • the example process of FIG. 11 may be used to collect and merge media site data and/or information from multiple data sources (e.g., the example data sources 110-113 of FIG. 1) into a site database (e.g., the example database 105).
  • the example process of FIG. 11 begins with processing of media site data from government records (block 1105) (e.g., the example government records 110 of FIG. 1) by, for example, performing the example process of FIG. 12.
  • a survey planner (e.g., the example survey planner 130 of FIGS. 1 and 3) identifies the dense site areas and sparse site areas (block 1110) by, for example, performing the example process of FIG. 13.
  • the dense site areas are surveyed using pedestrian-based MAST's (block 1115), and the sparse site areas are surveyed using vehicular-based MAST's (block 1120).
  • the dense and sparse site areas are surveyed using the example process of FIG. 14 described below.
  • the example process of FIG. 11 is then ended.
  • the illustrated example process is used to process media site data from government records (e.g., the government records 110 of FIG. 1).
  • the site data merger 120 (FIGS. 1 and 2) obtains media site data from the government records 110 (block 1205).
  • the government records 110 may be obtained from any number and/or type(s) of government agencies and/or offices.
  • the media site data collected from the government records 110 is then entered and/or loaded into the site database 105 (FIG. 1) (block 1210).
  • the site data merger 120 can estimate locations of media sites (block 1215).
  • the site data merger 120 uses the user interfaces 717 (FIG. 7) to plot and verify location and site information of each media site profile (block 1220).
  • the site data merger 120 can present the location and site profile information of the media site locations to a user for verification using any or all of the example user interfaces of FIGS. 8A-8C and/or 10.
  • the modifier 730 (FIG. 7) of the site data merger 120 can determine geocodes (e.g., a longitude coordinate and a latitude coordinate) for the media sites (block 1225), and store the geocodes in the site database 105 (FIG. 1) (block 1230).
  • the modifier 730 can store the geocodes in the example coordinate fields 228 and 232 of the data structure 200 of FIG. 2.
  • the example process of FIG. 12 is then ended by, for example, returning control to the example process of FIG. 11.
  • the depicted example process is used to implement the example survey planner 130 of FIGS. 1 and 3.
  • the data collector 305 (FIG. 3) of the survey planner 130 obtains zoning data for a geographic area (block 1305) and traffic count data for the geographic area (block 1310).
  • the traffic count is a count of all movements for cars, trucks, buses and pedestrians per geographic area for a given duration.
  • the mapper 315 (FIG. 3) of the survey planner 130 displays an image of the geographic area (block 1315) via one of the user interfaces 317 (FIG. 3).
  • the overlayer 325 (FIG. 3) overlays the obtained zoning and traffic count data onto the image of the geographic area (block 1320). For example, the overlayer 325 creates a KML file that the mapper 315 loads and uses to overlay the zoning and traffic count data.
  • the partitioner 330 (FIG. 3) of the survey planner 130 identifies dense media site areas and sparse media site areas (block 1330) based on the overlaid zoning and traffic count data.
  • the partitioner 330 partitions or sub-divides the dense and sparse media site areas (block 1335), and the assignor 335 (FIG. 3) assigns the sub-divided portions to surveyors (e.g., member(s) of the example field force 113 of FIG. 1) (block 1340).
  • the assignor 335 assigns dense areas to be surveyed by pedestrian surveyors using pedestrian-based MAST's and assigns sparse areas to be surveyed by vehicular surveyors using vehicle-based MAST's (e.g., the MAST 111 of FIGS. 6A-6D).
  • the example process of FIG. 13 is then ended by, for example, returning control to the example process of FIG. 11.
  • the example media site data collection system 100 collects media site data and/or information for the media site (block 1405).
  • the example media site data collection system 100 can collect the media site data (e.g., site profile and geocode information) using the example process described below in connection with FIG. 15.
  • the site data merger 120 (FIGS. 1 and 7) displays or plots the collected media site data (block 1410).
  • the mapper 715 and the overlayer 725 can use a Google® Earth mapping service window in connection with the example user interfaces of FIGS. 8A-8C and/or 10 to display the media site data in connection with aerial maps, satellite photographs, etc.
  • One or more of the user interfaces 735 and the modifier 730 (FIG. 7) of the data merger 120 then verify and adjust media site location information (block 1415). For example, one or more of the user interfaces described above in connection with FIGS. 8A-8C and 10 may be used to verify and/or adjust the media site location based on user input.
  • the modifier 730 then stores or uploads the media site data to the site database 105 (block 1420).
  • the modifier 730 can parse a KML file to extract values (e.g., site profile and geocode information) that are used to fill fields of a media site data structure (e.g., the example data structure 200 of FIG. 2) stored in the site database 105 to store the updated and/or verified media site data.
  • the example process of FIG. 14 is then ended by, for example, returning control to the example process of FIG. 11.
  • Turning to FIG. 15, the depicted example process may be implemented to collect and/or obtain media site data for a media site.
  • the display interface 555 (FIG. 5B) of the user-interface apparatus 505 displays real-time images of a general area of interest (block 1505) captured using the MAST 111 (FIGS. 1, 5A and 6A-6D).
  • a user may manually adjust the MAST 111 as described above in connection with FIGS. 6A and 6B to capture a real-time video feed of a general area of interest in which one or more media sites may be located.
  • the camera positioner interface 580 (FIG. 5B) can control the pan-tilt mechanism 525 (FIG. 5A) to position the field of view of the video camera 510 to capture real-time video of the general area of interest.
  • the captured real-time images are communicated to the user-interface apparatus 505, and the display interface 555 (FIG. 5B) displays them to a user as shown in FIG. 6C.
  • a media site object of interest is then selected in the real-time images (block 1510).
  • a user may visually identify an advertisement object of interest and elect to gather site data about that advertisement object.
  • Using an automatically positionable MAST 111 as described above in connection with FIG. 6C, a user may use the user-input interface 560 (FIG. 5B) of the example user-interface apparatus 505 to select a location on an image (e.g., a real-time video feed image) displayed via the display interface 555 (FIG. 5B) to specify an advertisement object to be automatically visually located by the MAST 111.
  • the camera positioner interface 580 (FIG. 5B) of the user-interface apparatus 505 determines tilt and pan rotation angles and controls the pan-tilt mechanism 525 (FIG. 5A) to set a pan rotation and a tilt angle to aim the digital still picture camera 515 and the rangefinder 520 at the selected media site object (block 1515).
  • the camera positioner interface 580 sets the pan rotation and the tilt angle of the camera 515 and the rangefinder 520 by controlling the pan-tilt mechanism 525 to position the MAST 111 to position the field of view of the digital still picture camera 515 (FIG. 5) so that the advertisement object of interest is in substantially the center of the field of view of the camera 515.
  • the pan rotation and the tilt angle of the camera 515 and the rangefinder 520 can be controlled manually as described above in connection with FIGS. 6A and 6B.
  • the MAST 111 can be provided with a manually controlled pan-tilt adjustment mechanism to allow a user to perform coarse position adjustments of the MAST 111 and can also be provided with the electronic pan-tilt mechanism 525 to enable the camera positioner interface 580 to automatically control fine position adjustments.
  • the rangefinder 520 (FIGS. 5A and 6A-6C) measures the distance to the media site (block 1520). That is, the rangefinder 520 determines a distance value representative of a distance between the digital camera 515 and the media site object of interest selected by the user.
  • the camera controller 585 (FIG. 5B) of the user-interface apparatus 505 determines a zoom level (block 1522) at which to set the digital camera 515 to capture an image of the user-specified media site. In the illustrated example, the camera controller 585 determines the zoom level based on the distance measured by the rangefinder 520 at block 1520 so that the digital camera 515 can capture at least a portion of the media site object specified by the user at block 1510. The camera controller 585 then sets the zoom level of the digital camera 515 (block 1523) and triggers the digital camera 515 to capture one or more images of the media site (block 1525).
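One plausible way to derive a zoom setting from the measured distance, as the camera controller 585 is described as doing, is the pinhole-camera relation between focal length, sensor width, subject width and range. The function name, the default subject width and the camera parameters below are illustrative assumptions; the patent states only that the zoom level is determined from the measured distance.

```python
def zoom_for_distance(distance_m, subject_width_m=4.0,
                      sensor_width_mm=6.4, min_f_mm=5.0, max_f_mm=60.0):
    """Pick a focal length so a subject of subject_width_m roughly fills
    the frame at distance_m, clamped to the camera's zoom range. All
    parameter values are illustrative assumptions, not from the patent."""
    # Pinhole relation: focal / sensor_width = distance / subject_width
    f = sensor_width_mm * distance_m / subject_width_m
    return max(min_f_mm, min(max_f_mm, f))
```

For example, a 4 m wide billboard at a 25 m range would call for a 40 mm focal length under these assumptions, while very near or very far sites clamp to the ends of the zoom range.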
  • the user-interface apparatus 505 causes the GPS receiver 535 to determine the current location of the MAST 111 (block 1530).
  • the data interface 570 (FIG. 5B) of the user-interface apparatus 505 stores the zoom level of the digital camera 515, the distance to the user-specified media site, the captured image(s), the pan and tilt angles of the digital camera 515 and the rangefinder 520, the location information of the MAST 111 and a timestamp indicative of a time at which the digital camera 515 captured the media site image(s) (block 1535).
  • the location information generator 590 (FIG. 5B) may then generate location information for the media site based on the stored measurement data (e.g., as described below in connection with FIG. 16).
  • the example process of FIG. 15 ends by, for example, returning control to the example process of FIG. 14.
  • FIG. 16 illustrates a three-dimensional Cartesian coordinate system showing a plurality of dimensions that may be used to determine a location of a media site 1602 based on a location of the MAST 111 at the time it is used to capture an image of the media site 1602.
  • a location (X1,Y1) of the MAST 111 (observer) is designated by reference numeral 1604, and a location (X2,Y2) of the media site 1602 to be determined is designated by reference numeral 1606.
  • the dimensions used to determine the media site location (X2,Y2) 1606 are shown in association with a right-angle triangle A and another right-angle triangle B overlaid on the Cartesian coordinate system.
  • a first leg of the triangle A represents a MAST-to-media site ground distance (G) extending between the MAST location 1604 and the media site location 1606 and a second leg of the triangle A represents a height (H) of the media site.
  • the MAST-to-media site ground distance (G) and the media site height (H) are determined as described below in connection with equations 1 and 2.
  • a hypotenuse of the triangle A represents a range (R) measured by the rangefinder 520 (FIG. 5) and extends from the MAST location 1604 to substantially the center of the media site 1602.
  • An angle (θ) between the first leg (G) and the hypotenuse (R) of the triangle A represents a tilt angle (θ) of the rangefinder 520 at the time it measured the range (R).
  • the tilt angle (θ) can be provided by the pan-tilt mechanism 525 (FIGS. 5A and 6C).
  • Alternatively, the tilt angle (θ) can be provided by a tilt angle sensor (not shown) fixedly mounted relative to the rangefinder 520. In this manner, as the rangefinder 520 is tilted, the tilt angle sensor is also tilted by the same amount to detect the tilt angle of the rangefinder 520.
  • a direction of travel line 1608 represents a heading of the MAST 111 (e.g., the heading of a vehicle carrying the MAST 111).
  • a first angle (θ1) defined by the travel line 1608 and a first leg of the triangle B represents the angular heading of the MAST 111 (e.g., the vehicle carrying the MAST 111) relative to an x-axis of the Cartesian coordinate system (i.e., the MAST-travel angle (θ1)).
  • the MAST-travel angle (θ1) can be provided by the digital compass 530 (FIGS. 5A and 6C) or the GPS receiver 535 (FIGS. 5A and 6A-6C).
  • a second angle (θ2) defined by the travel line 1608 and a hypotenuse of the triangle B represents the angle of the rangefinder 520 relative to the heading of the MAST 111 (i.e., the rangefinder-MAST-heading angle (θ2)).
  • the rangefinder-MAST-heading angle (θ2) can be provided by the pan-tilt mechanism 525 (FIGS. 5A and 6C).
  • the rangefinder-MAST-heading angle (θ2) can be provided by the rotary encoder 635.
  • An angle (θ) defined by the hypotenuse and the first leg of the triangle B represents the angle between the location (X2,Y2) of the media site 1602 and the x-axis of the Cartesian coordinate system.
  • the angle (θ) can be determined as described below in connection with equation 3.
  • equation 1 below is used to determine the MAST-to-media site ground distance (G), and equation 2 below is used to determine the media site height (H).
  • Equation 1: G = R · cos(Φ)
  • Equation 2: H = R · sin(Φ)
  • the MAST-to-media site ground distance (G) is determined by multiplying the MAST-to-media site range (R) by the cosine of the tilt angle (Φ).
  • the media site height (H) is determined by multiplying the MAST-to-media site range (R) by the sine of the tilt angle (Φ).
  • Equation 3: θ = θ1 + θ2
  • the first leg of triangle B is labeled as (ΔX) and the second leg is labeled as (ΔY).
  • the distance of the first leg (ΔX) represents a distance extending between a right-angle intersection 1610 of the first and second legs of triangle B and the location (X1,Y1) of the MAST 111 at a time at which the MAST 111 captured an image of the media site 1602.
  • the distance of the second leg (ΔY) represents a distance extending between the right-angle intersection 1610 and the location (X1,Y1) of the MAST 111.
  • the distance (ΔX) represented by the first leg is determined using equation 5 below.
  • the distance (ΔY) represented by the second leg is determined using equation 6 below.
  • the distance (ΔX) represented by the first leg of triangle B is determined by multiplying the MAST-to-media site ground distance (G) by the cosine of the angle (θ).
  • the distance (ΔY) represented by the second leg of triangle B is determined by multiplying the MAST-to-media site ground distance (G) by the sine of the angle (θ).
  • the media site location (X2,Y2) 1606 is determined using equations 7 and 8 below.
  • the x-axis location coordinate (X2) of the media site 1606 is determined by adding the x-axis location coordinate (X1) (1604) of the MAST 111 to the distance (ΔX) represented by the first leg of triangle B.
  • the y-axis location coordinate (Y2) of the media site 1606 is determined by adding the y-axis location coordinate (Y1) (1604) of the MAST 111 to the distance (ΔY) represented by the second leg of triangle B.
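The coordinate math above (equations 1-3 and 5-8) can be sketched in a short script. This is a minimal illustration only; the function name, argument names, and sample values are not from the disclosure.

```python
import math

def media_site_location(x1, y1, r, phi_deg, theta1_deg, theta2_deg):
    """Estimate the media site location (X2, Y2) and height (H) from the
    MAST location (X1, Y1), the measured range R, the rangefinder tilt
    angle (phi), the MAST-travel angle (theta1) and the
    rangefinder-MAST-heading angle (theta2)."""
    phi = math.radians(phi_deg)
    theta = math.radians(theta1_deg + theta2_deg)  # Equation 3
    g = r * math.cos(phi)      # Equation 1: MAST-to-media site ground distance
    h = r * math.sin(phi)      # Equation 2: media site height
    dx = g * math.cos(theta)   # Equation 5: first leg of triangle B
    dy = g * math.sin(theta)   # Equation 6: second leg of triangle B
    return x1 + dx, y1 + dy, h # Equations 7 and 8

# Observer at the origin, 100 m range, 30 degree tilt, heading along the
# x-axis, rangefinder panned 60 degrees from the heading.
x2, y2, h = media_site_location(0.0, 0.0, 100.0, 30.0, 0.0, 60.0)
```

With these sample inputs the site resolves to roughly (43.3, 75.0) at a height of 50 m, matching a hand calculation of the same equations.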
  • FIG. 17 is a block diagram of an example processor platform 1700 that may be used and/or programmed to implement any or all of the example MAST 111, the example site data merger 120 and/or the example survey planner 130 of FIGS. 1, 3, 5A and/or 7.
  • the processor platform 1700 can be implemented by one or more general purpose processors, processor cores, microcontrollers, etc.
  • the processor platform 1700 of the example of FIG. 17 includes at least one general purpose programmable processor 1705.
  • the processor 1705 executes coded instructions 1710 and/or 1712 present in main memory of the processor 1705 (e.g., within a RAM 1715 and/or a ROM 1720).
  • the processor 1705 may be any type of processing unit, such as a processor core, a processor and/or a microcontroller.
  • the processor 1705 may execute, among other things, the example processes of FIGS. 11-15 to implement the example MAST 111, the example site data merger 120 and/or the example survey planner 130 described herein.
  • the processor 1705 is in communication with the main memory (including a ROM 1720 and/or the RAM 1715) via a bus 1725.
  • the RAM 1715 may be implemented by DRAM, SDRAM, and/or any other type of RAM device, and ROM may be implemented by flash memory and/or any other desired type of memory device. Access to the memory 1715 and 1720 may be controlled by a memory controller (not shown).
  • the RAM 1715 may be used to store and/or implement, for example, one or more audible messages used by an interactive voice response system and/or one or more user interfaces.
  • the processor platform 1700 also includes an interface circuit 1730.
  • the interface circuit 1730 may be implemented by any type of interface standard, such as a USB interface, a Bluetooth interface, an external memory interface, serial port, general purpose input/output, etc.
  • One or more input devices 1735 and one or more output devices 1740 are connected to the interface circuit 1730.
  • the input devices 1735 and/or output devices 1740 may be used to implement, for example, the example displays 320 and 720 of FIGS. 3 and 7.
  • FIG. 18 is a block diagram of an example auditing system 1800 to audit outdoor or indoor signage.
  • the example auditing system 1800 is a modified version of the MAST 111 described above.
  • the MAST 111 is still mounted on a vehicle and structured as shown in FIG. 5A.
  • the user interface apparatus 505 is modified to perform one or more automatic signage auditing tasks.
  • the apparatus 505 is programmed with the location(s) of one or more outdoor signs to be audited. These location(s) can be identified by, for example, geocodes.
  • the user interface apparatus 505 is further provided with navigation software to provide turn-by-turn directions that guide the driver of the vehicle carrying the modified MAST 111 to the signage to be audited.
  • the user interface apparatus 505 may provide directions to any operator carrying the modified MAST 111 in virtually any environment in which an automobile may be impractical or prohibited, whether indoor or outdoor, such as, for example, in a dense urban setting, a supermarket, a stadium, an amusement park, an amphitheatre, an airport, a train station, a subway, a bus, a shopping mall, or anywhere else signage may appear.
  • the apparatus 505 is programmed with the locations of the signage to be audited. For indoor applications, mapping systems other than GPS-based systems may be utilized.
  • the modified MAST 111 uses the pan/tilt mechanism 525 to point a high resolution camera 515 at the location of the sign.
  • the camera 515 is then actuated to take a first high resolution photograph of the sign being audited.
  • the geocodes provided by the GPS receiver 535 at the time of the first photograph are recorded in association with the first photograph.
  • the pan/tilt angles of the camera and/or the date(s) and time(s) at which the photographs are taken are recorded in association with the first photograph.
  • the geographic location information allows triangulation of the site's geographic location.
  • the user interface apparatus 505 then waits for the MAST 111 to move a distance to a second location wherein the sign is expected to still be within photographic range of the camera 515.
  • the modified MAST 111 again uses the pan/tilt mechanism 525 to adjust the camera 515 and point the high resolution camera 515 at the location of the sign.
  • the camera 515 is then actuated to take a second high resolution photograph of the sign being audited.
  • the geocodes provided by the GPS receiver 535 at the time of the second photograph are recorded in association with the second photograph.
  • the pan/tilt angles of the camera and/or the date(s) and time(s) at which the photographs are taken are recorded in association with the second photograph.
  • the user interface apparatus 505 then directs the operator to the next location/sign to be audited.
  • the collected photographs and associated data are then used to compare the actual signage photographed to the signage identified in a database. For example, billboard owners are often paid a fee to display an advertisement (e.g., an advertisement for Movie A) on particular billboard(s) (e.g., a billboard at intersection B) for particular dates (e.g., the month of June, 2008).
  • a database is created reflecting the identity of the advertisement, the billboard(s) on which it is to be displayed, and the time frame.
  • the photographs and geographic location information collected by the modified MAST 111 can then be automatically and/or manually compared against the data in the database to determine whether the billboard owner is in fact complying with the requirements of the advertisement purchase.
  • the database is preferably maintained by the advertiser and/or by a neutral third party (e.g., The Nielsen Company) who is provided with the relevant data by the advertiser, the advertising agency, the billboard owner and/or any other entity.
  • the information provided to the neutral third party is at least partially based on a contract to provide advertising services via, at least in part, indoor and/or outdoor signage.
  • the neutral third party can develop, provide and/or sell reports verifying that billboard owners are meeting their contractual obligations.
  • the neutral third party can provide reports reflecting the condition(s) of the various signage to ensure that the advertisement is being displayed at a sufficient (e.g., contracted for) quality level.
  • Automatic comparison can be performed using image recognition techniques.
  • optical character recognition can be employed to detect one or more logos appearing in the photographs of the signage. Methods and apparatus to recognize logos are described in U.S. Patent Application Serial No. 60/986,723, filed on November 9, 2007 and entitled "Methods and Apparatus to Measure Brand Exposure in Media Streams," which is hereby incorporated by reference in its entirety.
  • a manual visual comparison of the photographs collected by the modified MAST 111 can be compared to a stored image of the advertisement expected to be carried by the signage to verify compliance.
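As a toy illustration of the automatic comparison step, the sketch below scores a photographed sign against the stored image of the expected advertisement by raw pixel difference. An actual deployment would use the image-recognition and logo-detection techniques referenced above; the tiny images and the compliance threshold here are invented for the example.

```python
def image_similarity(actual, expected):
    """Score two equally sized grayscale images (nested lists of 0-255
    pixel values) on a 0.0-1.0 scale: 1.0 means identical pixels."""
    total = diff = 0
    for row_a, row_e in zip(actual, expected):
        for a, e in zip(row_a, row_e):
            total += 255
            diff += abs(a - e)
    return 1.0 - diff / total

expected_ad = [[0, 255], [255, 0]]    # stored image of the expected advertisement
photographed = [[10, 245], [250, 5]]  # pixels sampled from the MAST photograph
score = image_similarity(photographed, expected_ad)
compliant = score > 0.9  # assumed compliance threshold
```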
  • the modified MAST 111 is used to collect other types of information from indoor or outdoor signage. For instance, in some examples a technique similar to that described above is employed to collect images of movable letter signage at fuel or petroleum (e.g., gas) stations with the modified MAST 111.
  • the gas prices appearing on the signage photographed by the MAST 111 can be recognized using optical character recognition analysis to develop lists, statistics and/or economic indicators about gas prices and/or other economic indicators such as, for example, oil prices, oil supplies, a recession, currency values, consumer confidence, or any other bellwether or index that correlates, even remotely, with the price of fuel.
  • the gas price statistics can be developed in near real time so that the data is extremely current.
  • the gas price statistics including, for example, the best gas prices in a particular region, can be published in any form to assist consumers in their purchase decisions.
  • the gas price numbers can be broadcast via the Internet and downloaded to consumers in their automobiles via, for instance, satellite, wifi, or any other broadcast medium.
  • the downloaded information is integrated into navigation systems in the automobiles to guide or direct consumers to a particular fuel station that has low-priced gas.
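The gas-price aggregation described above (turning OCR'd signage readings into regional lists and statistics) can be sketched as follows. The station names, regions, and prices are hypothetical.

```python
from statistics import mean

# OCR output from movable-letter signage: (station, region, price) tuples.
readings = [
    ("Station A", "North", 3.499),
    ("Station B", "North", 3.459),
    ("Station C", "South", 3.599),
]

def regional_stats(readings):
    """Summarize recognized gas prices per region: the station with the
    best (lowest) price, that price, and the regional mean."""
    by_region = {}
    for station, region, price in readings:
        by_region.setdefault(region, []).append((price, station))
    return {
        region: {"best": min(prices)[1],
                 "low": min(prices)[0],
                 "mean": round(mean(p for p, _ in prices), 3)}
        for region, prices in by_region.items()
    }

stats = regional_stats(readings)
```

A published consumer report could then surface, for each region, the station returned under "best".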
  • the modified MAST 111 is utilized to photograph movie theater signage to collect data concerning motion pictures.
  • the data collected could be used to determine the number of movie theaters running a particular movie (e.g., a geographic area of interest), the length of time a given movie remains in theaters, etc.
  • the data collected by the modified MAST 111 could be used to create reports that enable movie distributors to verify that they are receiving appropriate compensation from movie theaters (e.g., that movie theaters are not running movies before or after the contracted-for time frame).
  • the photographs taken by the MAST 111 are preferably date and time stamped to facilitate comparison to appropriate contract terms and/or payment records.
  • this example method of monitoring the distribution of motion pictures may be used to detect whether royalties were paid for showing the motion picture on a particular date and to generate a report based on the detection.
  • unauthorized distribution of a motion picture may also be detected.
  • the modified MAST 111 is used to audit the condition of signage to enable a signage provider to determine when signage is in need of repair or replacement.
  • for governmental agencies charged with providing road signage (e.g., the U.S. Department of Transportation), the example auditing system is adapted to survey the condition of road signs and identify signs in need of repair or replacement.
  • the process described above (i.e., taking photographs with the modified MAST 111 of the expected position of the road sign(s) and analyzing the photographs) can be used to determine which sign(s), if any, are in need of attention.
  • Damaged signs can be detected by comparing the photograph of the sign to a recorded image of the expected appearance of the sign.
  • the modified MAST 111 may also be used to audit the environment proximate a road sign. For example, as described herein, if a road sign is obscured by fallen debris or overgrown vegetation, the environment proximate the road sign may need to be cleared. Furthermore, the modified MAST 111 may also be used to survey road signs and audit the road signs to determine if the road signs contain current information. When new streets, access ramps and/or traffic patterns, for example, are created, road signs typically need to be updated to reflect the change. If a road sign has been overlooked and left unchanged, the outdated signage may be located by methods and apparatus described herein. Other road signs in need of attention may be surveyed as well. Also, as explained above, these comparisons can be fully automated, partially automated or manual.
  • the modified MAST 111A and/or related methods may be used with a service such as, for example, Google® Earth mapping or "Street View" services.
  • Through such a service, a user could contract, for example via the Internet, for a vehicle or other operator to pass by and automatically or otherwise capture high resolution images of a particular location such as, for example, a house, a stretch of road, a lot of land, etc.
  • the service may provide the user with the images (e.g., still or video) substantially instantaneously (e.g., via the Internet or a wireless communications service) or at a later time and/or date.
  • This implementation may be used for a multitude of purposes including, for example, monitoring a house or other property the user has for sale, monitoring a house or other property the user is considering buying, checking on hotels the user is considering visiting, monitoring the user's own home while away on vacation or business, etc.
  • mortgage companies or other businesses may use the service to monitor or investigate general or specific conditions of their current or prospective holdings.
  • a business may use the modified MAST 111A and/or related methods to monitor its franchises and/or its competition.
  • a business such as McDonalds® could use or contract use of the modified MAST 111A and/or related methods to have images or videos of one or more restaurant location(s) captured to monitor an appearance of the restaurant(s).
  • a business such as, for example, Blockbuster® could use or contract use of the modified MAST 111A and/or related methods to have its competition video/photographed for market research purposes.
  • the modified MAST 111A may be mounted, for example, on taxi cabs, and the dispatching (i.e., implementation of the service) could be a passenger-less "filler" task between paying fares.
  • the example signage auditing system 1800 may be used to audit a media site to verify that information recorded about the media site such as, for example, the information stored in the site database 105 described above, is accurate and/or to verify that the media site is being operated in accordance with contractually agreed terms.
  • the system 1800 may be used to verify that a media site is in the exact location described to an advertiser, is illuminated, contains certain names, trademarks, words, characters, images, prices, or other indicia, is a particular type of signage or other advertisement, etc.
  • the example system 1800 includes a controller 1805 that may include, in whole or in part, the survey planner 130 described above.
  • the controller 1805 determines one or more media location(s) to be surveyed or audited.
  • Data associated with the media sites to be surveyed is stored in a memory, which may include at least some of the data detailed above with respect to the site database 105 and/or the data described above with respect to FIG. 2 including, for example, a recorded location of the media site, a recorded condition of the media site, a sign type and/or recorded indicia (e.g., an expected advertisement), as detailed herein.
  • the example system 1800 further includes a modified MAST 111A.
  • the modified MAST 111A is substantially similar to the MAST 111 described above in connection with FIGS. 5A and 5B.
  • the example modified MAST 111A of FIG. 18 has been modified to include an instructor 1815.
  • the instructor 1815 instructs a person driving a vehicle or a person carrying the modified MAST 111A (e.g., an employee or contractor of an advertiser or a third party auditing company) to a location at or proximate the indoor or outdoor signage to be surveyed.
  • the location(s) to be surveyed are supplied to the modified MAST 111A using any suitable communications media (e.g., a wired or wireless connection, via the Internet, etc.).
  • the instructor 1815 of the illustrated example includes and/or interfaces with a navigation program and a database of street maps.
  • the navigation program utilizes the location information provided by the GPS receiver 535 and the street maps to provide directions to the person moving the modified MAST 111A through any suitable audio or visual signal.
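One simple way the instructor 1815 could choose the next site to direct the operator to is a nearest-geocode search. The haversine helper and the sample coordinates below are an assumption for illustration, not the disclosed navigation program.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two geocodes."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def next_site(current, sites):
    """Pick the media site closest to the operator's current position."""
    return min(sites, key=lambda s: haversine_m(current[0], current[1], s[1], s[2]))

# Hypothetical site records: (site id, latitude, longitude).
sites = [("billboard-12", 41.8800, -87.6300), ("billboard-7", 41.9000, -87.7000)]
target = next_site((41.8781, -87.6298), sites)
```

The navigation program would then generate turn-by-turn directions toward the selected site's geocode.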
  • the pan/tilt mechanism 525 of the modified MAST 111A directs the camera 515 to capture a first image of the media site.
  • the modified MAST 111A includes a memory or storage medium 1820 to record the photograph in association with the position of the camera. In the illustrated example, the modified MAST 111A records the geocode of the precise location from which the first image was captured.
  • the instructor 1815 instructs the driver to move to a second location at or proximate the same media site to be surveyed.
  • the second location is different than the first location.
  • the MAST 111A once again directs the camera to a certain tilt, pan, zoom, etc. to capture a second image of the media site to be surveyed.
  • the MAST 111A also records the position (e.g., the geocode) of the second location.
  • the modified MAST 111A transfers the data it collects to a comparator 1835.
  • the modified MAST 111A can be communicatively coupled to the comparator 1835 and/or the controller 1805 using any past, present or future networking and/or communication technology.
  • the comparator 1835 is implemented by a central facility of an auditing company that collects and analyzes data from a plurality of modified MASTs 111A.
  • the comparator 1835 can process the data collected by the MAST(s) 111A in any manner suitable for the intended application. For example, if the application is to verify the location of a media site such as a billboard, the comparator 1835 reviews the first and second photographs taken by the modified MAST 111A of the location of the media site recorded in the database. In other words, the MAST 111A captures images of the location where the media site is supposed to be. If the media site is at or near the recorded location, the media site will appear in both the first and the second images. If the media site is not shown in both of the images, then the comparator 1835 concludes that the location of the media site is incorrectly recorded in the database. To determine the actual location of the media site, when the media site does appear in both the first and the second images, the comparator 1835 processes the data from both the first and the second images with, for example, the triangulation techniques described above.
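The comparator's location check can be sketched as a distance test between the recorded location and the location triangulated from the two images. The planar (x, y) coordinates in meters and the 25 m tolerance are assumptions for illustration; the disclosure does not specify a tolerance.

```python
def verify_recorded_location(recorded, triangulated, tolerance_m=25.0):
    """Compare a site's recorded location with the location triangulated
    from the two MAST images; flag a discrepancy beyond a tolerance."""
    dx = triangulated[0] - recorded[0]
    dy = triangulated[1] - recorded[1]
    error = (dx * dx + dy * dy) ** 0.5  # planar distance in meters
    return {"error_m": round(error, 2), "matches": error <= tolerance_m}

# Recorded database location vs. location triangulated from two images.
result = verify_recorded_location((100.0, 200.0), (112.0, 209.0))
```

A result with `"matches": False` would indicate the database entry for the site should be corrected.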
  • FIG. 19 is a flowchart representative of example machine readable instructions that may be executed to implement the example system 1800 of FIG. 18.
  • the example process of FIG. 19 may be performed using a processor, a controller and/or any other suitable processing device. For example, the example process of FIG. 19 may be implemented in coded instructions stored on a tangible medium such as a flash memory, a read-only memory (ROM) and/or random-access memory (RAM) associated with a processor (e.g., the example processor 1705 discussed above in connection with FIG. 17).
  • some or all of the example process of FIG. 19 may be implemented using any combination(s) of application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), discrete logic, hardware, firmware, etc.
  • any or all of the example process of FIG. 19 may be performed sequentially and/or in parallel by, for example, separate processing threads, processors, devices, discrete logic, circuits, etc.
  • the example process of FIG. 19 begins with determining one or more media sites to be surveyed or audited (block 1905).
  • the geocodes and other recorded data for each media site in the list of media sites to be surveyed are uploaded (block 1910), for example, into a memory of the modified MAST 111A.
  • the following example assumes the modified MAST 111A is mounted on a roof of a vehicle (as noted above, other examples do not utilize a vehicle).
  • the instructor 1815 directs the driver of a vehicle or other operator carrying the MAST 111A to a location that is within a certain range of the recorded location of the media site (block 1915).
  • the instructions may be in the form of any type of audio and/or visual signals.
  • the instructions may be played through speakers by navigation software associated with the MAST 111A.
  • the instructions may be provided in any level of detail including, for example, turn-by-turn instructions to explicitly guide the operator to the recorded location.
  • the modified MAST 111A adjusts the settings of the image capture device (e.g., tilt, pan, zoom, as detailed above), a first image of the recorded location is captured, and the location (e.g., geocodes) from which the first image was captured is recorded (block 1925).
  • the captured image may be a still image, a video, a sound, a high resolution photograph, etc.
  • other data (e.g., a timestamp) may also be recorded in association with the first image.
  • the instructor 1815 directs the operator to travel a short distance (block 1930) to a second location.
  • the second location is also within photographic range of the recorded location of the media site.
  • the image capture device settings are adjusted, a second image of the recorded location of the media site is captured from the second location, and the location (e.g., geocodes) of the MAST 111A at the time of the photograph is recorded (block 1935).
  • the captured image may be a still image, a video, a sound, a high resolution photograph, etc.
  • other data (e.g., a timestamp) may also be recorded in association with the second image.
  • the example process also determines if another media site is to be audited (block 1940). If so, control returns to block 1915. If no other media sites are to be audited, control advances to block 1945.
  • Characteristics of the recorded location of the media site are determined from the first and/or second images (block 1945). For example, the actual location of the media site may be calculated through one or more triangulation techniques. In addition, any or all of the condition of the media site, the indicia included on the media site, and/or the type of media presented on the media site may be determined from the first and/or second images. The characteristics of the media site are compared to the recorded characteristics (block 1950) by, for example, the comparator 1835. The comparison determines the accuracy of the recorded data (e.g., the data saved in the databases 105, 1810 detailed above) and/or whether the state of the media site matches an expected condition.
  • the recorded data may be updated and/or a report of the discrepancy can be generated and/or sold.
  • the determination of the characteristics of the recorded location of the media site (block 1945) and the comparison (block 1950) may occur at the first location, the second location, or anywhere else and at any time.
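The comparison of recorded versus determined characteristics (blocks 1945 and 1950) can be sketched as a field-by-field diff whose output feeds the discrepancy report described above. The field names and values are illustrative, not from the disclosure.

```python
def audit_report(recorded, observed):
    """Compare recorded characteristics of a media site against the
    characteristics determined from the captured images, and return
    only the fields that disagree."""
    return {field: {"recorded": recorded[field], "observed": observed[field]}
            for field in recorded
            if field in observed and recorded[field] != observed[field]}

recorded = {"advertisement": "Movie A", "illuminated": True, "condition": "good"}
observed = {"advertisement": "Movie A", "illuminated": False, "condition": "good"}
discrepancies = audit_report(recorded, observed)
```

An empty result would mean the site matches its database entry; any non-empty result could trigger a database update or a discrepancy report.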
  • Information about any discrepancies may be particularly important, for example to advertisers.
  • an advertiser who paid for a billboard at a specific location off of a highway that is to be illuminated during certain hours for a specific number of weeks or days would be interested in any data about the actual characteristics of the media site. For example, if the billboard is not illuminated, the billboard may be visible to passing traffic for less than the agreed time every day. Also, if a billboard is obstructed by overgrown vegetation, the billboard may be less valuable to advertisers.
  • the example process may also be used by media outlets, consumer advocacy groups and/or other companies for intelligence gathering such as, for example, to survey the price of gas at one or more gas stations, to determine information about the advertising strategies of one or more competitors, etc.
  • the processor platform 1700 shown and described in connection with FIG. 17 can be used to execute the instructions of FIG. 19 to implement the system of FIG. 18.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • General Physics & Mathematics (AREA)
  • Finance (AREA)
  • Databases & Information Systems (AREA)
  • Accounting & Taxation (AREA)
  • Remote Sensing (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Data Mining & Analysis (AREA)
  • Game Theory and Decision Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to example methods and apparatus for auditing signage. A described example method directs an operator to a signage location and captures an image of signage at the signage location. The example method also includes detecting an actual characteristic of the signage based on the image and comparing the actual characteristic with an expected characteristic.
PCT/US2008/059952 2008-04-10 2008-04-10 Procédés et appareil de vérification de signalisation WO2009126159A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2008/059952 WO2009126159A1 (fr) 2008-04-10 2008-04-10 Procédés et appareil de vérification de signalisation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2008/059952 WO2009126159A1 (fr) 2008-04-10 2008-04-10 Procédés et appareil de vérification de signalisation

Publications (1)

Publication Number Publication Date
WO2009126159A1 true WO2009126159A1 (fr) 2009-10-15

Family

ID=41162134

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/059952 WO2009126159A1 (fr) 2008-04-10 2008-04-10 Procédés et appareil de vérification de signalisation

Country Status (1)

Country Link
WO (1) WO2009126159A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110557687A (zh) * 2019-08-02 2019-12-10 视联动力信息技术股份有限公司 一种组播数据包的处理方法、装置及存储介质
CN111275817A (zh) * 2018-12-04 2020-06-12 赫尔环球有限公司 提供特征三角测量的方法和装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6418372B1 (en) * 1999-12-10 2002-07-09 Siemens Technology-To-Business Center, Llc Electronic visitor guidance system
US20020163444A1 (en) * 1997-05-30 2002-11-07 Budnovitch William F. User assistance system for an interactive facility
US20040064245A1 (en) * 1997-08-19 2004-04-01 Siemens Automotive Corporation, A Delaware Corporation Vehicle information system
US20050149398A1 (en) * 2003-12-15 2005-07-07 Mckay Brent T. Media targeting system and method
US20070090937A1 (en) * 2005-10-21 2007-04-26 Stabler Francis R Method for alerting a vehicle user to refuel prior to exceeding a remaining driving distance


Similar Documents

Publication Publication Date Title
US8649610B2 (en) Methods and apparatus for auditing signage
US20080170755A1 (en) Methods and apparatus for collecting media site data
US7451041B2 (en) Network-based navigation system having virtual drive-thru advertisements integrated with actual imagery from along a physical route
US8907978B2 (en) Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US8260489B2 (en) Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations
US20140132767A1 (en) Parking Information Collection System and Method
KR20110044217A (ko) 3차원으로 내비게이션 데이터를 디스플레이하는 방법
CN116227834A (zh) 一种基于三维点云模型的智能景区数字化平台
WO2014111874A1 (fr) Système et procédé pour évaluer des panneaux d'affichage géomarqués
CN113191841A (zh) 基于增强现实技术的科技创新与文化共享智能平台模式方法
WO2009126159A1 (fr) Procédés et appareil de vérification de signalisation
CN111275823B (zh) 一种目标关联数据展示方法、装置及系统
Verbree et al. Interactive navigation services through value-added CycloMedia panoramic images
Aydin Low-cost geo-data acquisition for the urban built environment
Hummer et al. Choosing an Inventory data Collection system
KR20230007237A (ko) Ar을 이용한 광고판 관리 및 거래 플랫폼
Simeonova et al. Volunteered Citizen Contribution to Smart Cities
Botner et al. Digital Video–GIS Referenced System for spatial data collection and condition assessment to enhance transportation asset management
Yee GPS and video data collection in Los Angeles County-a status report
Yagoub Web-based and Mobile Geographic Information Systems (GIS) in the United Arab Emirates (UAE)
Stephan Innovative technologies for enhanced transportation infrastructure assessment
Kashishian et al. Digital Video Surveying

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08769122

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08769122

Country of ref document: EP

Kind code of ref document: A1