WO2008072429A1 - Image data display system associated with map information - Google Patents


Info

Publication number
WO2008072429A1
WO2008072429A1 (application PCT/JP2007/071583)
Authority
WO
WIPO (PCT)
Prior art keywords
link
information
image
shooting
image data
Prior art date
Application number
PCT/JP2007/071583
Other languages
English (en)
Japanese (ja)
Inventor
Kazuo Oda
Satoshi Nakayama
Hideyuki Kanehara
Nobuyuki Mizutani
Tomoyasu Ota
Hideto Satoh
Jun Yagome
Original Assignee
Locationview Co.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2007106460A external-priority patent/JP4210309B2/ja
Application filed by Locationview Co. filed Critical Locationview Co.
Publication of WO2008072429A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 ... specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 ... of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 Map spot or coordinate position indicators; Map reading aids
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82 ... the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205 ... involving the multiplexing of an additional signal and the colour video signal

Definitions

  • The present invention collects image data for grasping the geographical environment, for example by shooting while driving on roads or flying overhead, and extracts and processes the data according to the required geographical range, image resolution, and amount of image data.
  • The present invention relates to an image data display system associated with geographic information, in which such data are managed spatially and temporally in association with geographic coordinates and displayed in conjunction with a map. Background art
  • The present invention has been made to solve the above problems, and its purpose is to display structured data that associates image data, obtained by photographing the surrounding environment of a route network having a graph structure, with geographic information including a surrounding map.
  • The collected images and the road map are displayed in an interlocked manner, so that the target road can easily be selected on a collected image of an intersection and the direction of the collected image can be shown, in linkage, on the road map.
  • The image data display system with geographic information uses an omnidirectional camera, mounted on a mobile body, that shoots at fixed time intervals, together with a hybrid position sensor including GPS. It replays the all-around images with time information obtained by moving along the links of a route network and displays a rectangular region of the image on the screen, while also displaying route selection polygons for selecting one of the links.
  • An image data display system associated with geographic information comprising: first storage means which stores, for each link in a route network, each all-around image in the link together with attribute information including position information and time information from a position sensor;
  • second storage means which stores, as first link related information, shooting plan information including the ID of the link when it was shot and first actual shooting information including the first actual shooting date on which shooting was actually performed according to the shooting plan, and, as second link related information, the same link ID and shooting plan information as in the first link related information together with second actual shooting information having a second actual shooting date different from the first;
  • third storage means which stores the route network (link information consisting of a link ID, link vector information, and the node IDs at both ends of the link, together with node information including the IDs of the links connected to each node) and a background map;
  • means which displays a map composed of the route network and the background map in an area separate from the rectangular image and, when a link is selected, reads the attribute information having the link ID of the selected link and sequentially reads the all-around images of that link from the first storage means in ascending or descending order;
  • means which, while reading each all-around image and its attribute information from the first storage means, stops the reading when the position information of the read attribute information reaches a node of the route network, and obtains from the route network the directions of all links connected to that node;
  • means which displays, interlocked on the rectangular area of the screen, route selection polygons indicating the directions of all those links; and
  • means which, when one of the route selection polygons is selected, reads the link ID corresponding to the position of that polygon from the route network, judges whether all-around images are associated with the read link ID by checking whether first or second link related information (actual shooting information and shooting plan information) exists for it, and, if it exists, cancels the stop and outputs the read link ID to the all-around image reading means as the selected link ID.
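The route-selection logic claimed above (stop at a node, enumerate the connected links, and offer only those links that actually have shot imagery) can be sketched roughly as follows. All data structures and names here are hypothetical illustrations, not the patent's actual implementation:

```python
# Hypothetical sketch: when playback reaches a node, list every link
# connected to that node and keep only the links for which actual
# shooting information (link related information) exists.

# node_id -> IDs of links that connect to the node (graph structure)
NODE_LINKS = {
    "N1": ["L10", "L11", "L12"],
}

# link ID -> link related information (shooting plan + actual shooting date)
LINK_INFO = {
    "L10": {"plan": True, "actual_shooting_date": "2007-04-01"},
    "L11": {"plan": True, "actual_shooting_date": None},  # planned, never shot
}

def selectable_links(node_id):
    """Return the links at the node that have associated all-around images."""
    choices = []
    for link_id in NODE_LINKS.get(node_id, []):
        info = LINK_INFO.get(link_id)
        if info and info["actual_shooting_date"] is not None:
            choices.append(link_id)
    return choices
```

In this sketch, each surviving link ID would then be drawn as a route selection polygon in the rectangular image area.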
  • The present invention is an image data display system associated with geographic information that can convey the real situation of a local area by using images as geographical attributes representing places, in combination with map information such as roads: for example, property information such as condominiums and stores in real estate transactions, indications of the route environment from town sights and stations to target properties and stores, advertising activities in shopping streets, the location and surrounding situation in emergency calls to fire or police services, and the local situation in road and river management.
  • By using data configured by associating geographic information, including a surrounding map, with multiple image data obtained by photographing the surrounding environment of a road network having a graph structure (as shown in Fig. 17, the nodes are intersections and the network information is a chain of line segments indicating the road center line between intersections), it is possible to display the image corresponding to a given point while interlocking with the map.
  • If images of a target town are obtained over a wide range and the data are provided as a service, the user can freely select any place to be examined. Also, if images taken in different seasons or at different times are prepared, this image data display system with geographic information can show changes in the town and its environment.
  • Providing images in a state desired by the user, for example images for a designated season or time, becomes possible. If images from different shooting years are available, changes over time in a specific area can be investigated. Requests such as "I want to see images of the cherry blossom season" or "I want to see images at the time of autumn leaves" can be met, so the user can see the state of the cityscape or a tourist road as it appears at the desired time.
  • An icon for selecting a road is displayed at an intersection; by selecting this icon, the corresponding link is allocated and the images in this link are displayed in conjunction with the map.
  • The position, range, and direction of the captured image are known, so the operator can easily determine where the image was taken. In addition, since the direction of travel can be selected from the image, operability is excellent.
  • FIG. 1 is a configuration diagram of an image data management system with map information association and a display system.
  • FIG. 2 is a configuration diagram of an image capturing subsystem.
  • FIG. 3 is a block diagram of an image data editing subsystem.
  • FIG. 4 is a map displayed by the map data display means.
  • FIG. 5 is an image displayed on the screen of the image data display means.
  • FIG. 6 is a diagram showing a method of image thinning by the image thinning means.
  • FIG. 7 is a configuration diagram of an image data management subsystem.
  • FIG. 8 is a block diagram of the photographing plan means.
  • FIG. 9 is a configuration diagram of photographing information capturing means.
  • FIG. 10 is a configuration diagram of management means.
  • FIG. 11 is a block diagram of data extraction means.
  • FIG. 12 is a structural diagram of a graph structure database.
  • FIG. 13 is a structural diagram of a camera information database.
  • FIG. 14 is a structural diagram of the plan/actual imaging information database.
  • FIG. 15 is a structural diagram of a link related information database.
  • FIG. 16 is a structural diagram of a physical image database.
  • FIG. 17 is a diagram showing a feature having a graph structure.
  • FIG. 18 is a flowchart illustrating a procedure in the image data management system with a geographic information association according to the embodiment.
  • FIG. 19 is a flowchart for explaining a procedure in the image data management system with a geographic information association according to the embodiment.
  • FIG. 20 is a flowchart for explaining a procedure in the image data management system with geographic information according to the embodiment.
  • FIG. 21 is a flowchart illustrating a procedure in the image data management system with a geographic information association according to the embodiment.
  • FIG. 24 is an example of annotation overlay display on the display screen of the map information association image data display system.
  • FIG. 26 is a flowchart of the note data search of the map information association image data display system.
  • FIG. 27 is a detailed configuration diagram of the image data display system 100 with geographic information.
  • FIG. 28 is an explanatory diagram for simultaneously displaying a photographing position and a corresponding image.
  • FIG. 29 is an explanatory diagram showing a more specific configuration of cut-out image data.
  • FIG. 30 is an explanatory diagram for explaining a more specific configuration of a physical image database.
  • FIG. 31 is an explanatory diagram of the thinning-out of the present embodiment, the display of shooting positions, and the registration of cut-out data.
  • FIG. 32 is an explanatory diagram for explaining a traveling direction mark display means.
  • FIG. 34 is a schematic configuration diagram of a geographic information-related image data management system according to the second embodiment.
  • FIG. 35 is an explanatory diagram of image grouping for links.
  • FIG. 36 is a block diagram for displaying a traveling direction selection polygon.
  • FIG. 38 is a comparison diagram of images corresponding to the same link photographed in different directions.
  • The image data display system 100 with geographic information association includes a geographic display means 101, a photographing position display means 102, a photographed image display means 103, a route search interlocking means 104, an image search means 105, and a storage device 106 that stores map data 108 and cut-out image data f4.
  • the geographic display means 101 reads the map data 108 stored in the storage device 106, and displays the map while scrolling, switching scales, switching display items, switching display colors, and the like.
  • The imaging position display means 102 draws, on the map displayed by the geographic display means 101, the shooting positions (image acquisition coordinates) of the cut-out image data f4 stored in the storage device 106.
  • The cut-out image data f4 consist of an image file (collected image data) and image file additional information (image acquisition coordinates, image acquisition orientation, image acquisition link ID, order in the acquisition link, and image acquisition camera information such as shooting height).
  • The user designates a drawn position with a mouse click or the like, and a photographed image at the desired photographing point is displayed.
  • The captured image display means 103 includes an image display unit and a continuous image display unit.
  • The image display unit displays the image closest to the designated shooting position: it searches using the image acquisition coordinates in the image file additional information of the cut-out image data, then reads and displays the closest image file.
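A minimal sketch of such a nearest-image lookup, with a hypothetical record layout and planar coordinates assumed for simplicity:

```python
import math

# Hypothetical nearest-image lookup: given a clicked map coordinate,
# find the image file whose image acquisition coordinates are closest.
IMAGE_FILES = [
    {"file": "img_001.jpg", "coords": (35.6895, 139.6917)},
    {"file": "img_002.jpg", "coords": (35.6900, 139.7000)},
]

def nearest_image(lat, lon, images=IMAGE_FILES):
    """Return the image record with minimum planar distance to (lat, lon)."""
    return min(images,
               key=lambda rec: math.hypot(rec["coords"][0] - lat,
                                          rec["coords"][1] - lon))
```

A real system would use a spatial index rather than a linear scan, but the selection criterion is the same.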
  • If the stored image is an all-around image (the full 360° circumference is captured and stored in an equirectangular projection, in which the horizontal axis of the image is linear in horizontal orientation), an image seen from the shooting center with arbitrary camera parameters (camera direction, focal length / angle of view) is synthesized and displayed by means of CG processing or the like. In this display method, camera parameters are specified and the image is converted into a rectangular area and displayed using texture mapping means generally used in CG (computer graphics).
  • As shown in FIG. 24, the map data 108 include annotation data (data consisting of a combination of a feature name, a feature type, and position coordinates on the map).
  • The orientation, as seen from the image acquisition coordinates, of the object to which annotation data are attached is calculated from the image acquisition coordinates and image acquisition orientation included in the cut-out image data f4 and the annotation display position included in the map data 108, and is displayed.
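The bearing calculation described above can be sketched as follows, assuming planar map coordinates and an equirectangular all-around image; the function names are illustrative, not the patent's:

```python
import math

def bearing_deg(cam_xy, target_xy):
    """Compass-style bearing (degrees, 0 = +y axis, clockwise) from the
    image acquisition coordinates to the annotation display position.
    Planar map coordinates are assumed for simplicity."""
    dx = target_xy[0] - cam_xy[0]
    dy = target_xy[1] - cam_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def annotation_column(cam_xy, cam_heading_deg, target_xy, image_width):
    """Horizontal pixel column where the annotation should be overlaid
    on an equirectangular all-around image whose centre column
    corresponds to the image acquisition orientation cam_heading_deg."""
    rel = (bearing_deg(cam_xy, target_xy) - cam_heading_deg) % 360.0
    if rel > 180.0:
        rel -= 360.0  # map to (-180, 180], centre of image = 0
    return int(round((rel + 180.0) / 360.0 * image_width)) % image_width
```

Because an equirectangular image is linear in azimuth, the overlay column follows directly from the relative bearing.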
  • The continuous image display unit displays images having the same link ID sequentially, in ascending or reverse order according to the shooting order in the acquisition link, so that they appear as continuous movement, like a video.
  • the continuous display can be controlled with the movement direction designation button and the pause button.
  • the movement direction designation button is displayed as a symbol corresponding to the traveling direction of the image data associated with the same route.
  • the moving speed can be specified in the moving speed specification box.
  • When the end of the link is reached, the continuous display is discontinued and route selection is performed.
  • In route selection, the node ID at the last-displayed end is obtained from the attached information (also called attribute information) of the image file.
  • The graph structure included in the cut-out image data is then examined, and the links connected to that node are found. Furthermore, by checking the image acquisition link IDs included in the cut-out image data, a selection button or the like is drawn on the image for each path for which image data (collected image data) exist, and the user specifies which route's images to display next.
  • The route search interlocking means 104 uses the route network included in the map data 108 to search for a route from specified start point coordinates to end point coordinates. The route is displayed on the map shown by the geographic display means 101, and for the portion of the route where corresponding cut-out image data f4 exist, the shooting positions (image acquisition coordinates) are also drawn on that map.
  • the route search interlocking means 104 draws only the shooting positions of all image acquisition coordinates on the route during the route search, and does not draw the rest. At this time, if a shooting position on the route is selected, the other shooting positions are drawn in a different color.
  • the above-described route network includes, for example, a travel locus pattern, a road shape, a width, and the like.
  • a network having a graph structure includes a road, a railroad, a seaway, a pipeline, an airway, a tunnel, and the like, and may be collectively referred to as a route network.
  • the image search means 105 searches for an image of the object being photographed using the note data including the facility name and the like included in the map data 108.
  • The note data include the note string to be drawn on the map, the note display position, and the note type.
  • the image search is performed according to the procedure shown in the flowchart of FIG.
  • Step S200: Verification of note data and images
  • For each note data item, the image that best captures the noted object is searched for.
  • An image file whose shooting position coordinates are closest to the note display position of the note data is assigned, and the image file attached information of the cut-out image data containing this image file is searched and selected.
  • Images that do not meet defined criteria (e.g. a distance threshold value) are excluded.
  • Step S201: Keyword input
  • The user inputs a keyword including the name and type of the object to search for.
  • The types of features include public facilities, parks, and intersection names.
  • The keyword specifies all or part of the feature name (for example, "Shinjuku" when examining Shinjuku Station).
  • Step S202 Note data search
  • In this step, note data corresponding to the above keyword are searched for. Only notes for which a corresponding image was selected in step S200 are considered. Matching against the keyword is performed on note strings or note types.
  • Step S203 Note candidate selection
  • Step S204 Image display
  • An image corresponding to the note selected by the user is cut out from the image data and displayed on the image display unit of the computer screen.
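Steps S200 to S204 can be sketched roughly like this; the note records and field names are hypothetical illustrations:

```python
# Hypothetical sketch of steps S200-S204: keep only notes that were
# assigned an image in the verification step (S200), then match the
# keyword against note strings and note types (S202).
NOTES = [
    {"string": "Shinjuku Station", "type": "station", "image": "img_101.jpg"},
    {"string": "Central Park",     "type": "park",    "image": None},  # S200 found no image within threshold
    {"string": "Shinjuku Gyoen",   "type": "park",    "image": "img_202.jpg"},
]

def search_notes(keyword, notes=NOTES):
    """Return notes whose string or type contains the keyword and which
    were assigned an image in the verification step (S200)."""
    kw = keyword.lower()
    return [n for n in notes
            if n["image"] is not None
            and (kw in n["string"].lower() or kw in n["type"].lower())]
```

The matching notes would then be presented as candidates (S203), and the image assigned to the chosen note displayed (S204).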
  • the map data 108 includes network data fO such as roads, a background map, note data given to objects existing on the map, and map symbol data.
  • the map data 108 may be general data created for use in a navigation system or the like, rather than data created exclusively.
  • the cut-out image data f4 includes an image file, image file attached information, and graph structure information.
  • The image file attachment information includes, for each image file, the image acquisition coordinates, image acquisition orientation, image acquisition link ID, shooting start node ID, shooting end node ID, the shooting order counted from the shooting start side, and image acquisition camera information (such as shooting height); the graph structure information is associated with the acquired images.
  • The graph structure of the links indicates the connection relations of the links from which images were acquired: for each node included in the data, it lists the node ID and the IDs of the links connected to that node.
  • the image acquisition direction is the direction of the central axis of the camera at the time of shooting.
  • The center of an omnidirectional image, viewed from the shooting point, corresponds to this orientation.
  • The image acquisition link ID corresponds to the ID of the road center line on which the image was taken, when the route network is treated as a graph structure.
  • the node is a coordinate point for specifying, for example, a road intersection or a dead end.
  • the link is a vector connecting each node, and includes data such as the link start point coordinates, end point coordinates, link direction, and link road width.
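The node and link records described above might be modeled as follows; the field names are illustrative, not the patent's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical dataclasses mirroring the node/link structure described
# above: a node identifies e.g. an intersection or dead end, and a link
# is a vector between two nodes carrying road attributes.

@dataclass
class Node:
    node_id: str
    xy: tuple                                      # coordinate point
    link_ids: list = field(default_factory=list)   # links connected here

@dataclass
class Link:
    link_id: str
    start_node: str
    end_node: str
    start_xy: tuple
    end_xy: tuple
    road_width: float = 0.0

# Example: link L10 runs from node N1 to node N2.
n1 = Node("N1", (0.0, 0.0), ["L10"])
l10 = Link("L10", "N1", "N2", (0.0, 0.0), (100.0, 0.0), road_width=6.0)
```

The node's list of connected link IDs is what enables the route-selection lookup when playback reaches an intersection.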
  • the cut-out image data f4 is data created by an image data management system with geographic information association 10 described later.
  • the image data display system 100 associated with geographic information may be a program operating alone on a computer.
  • Alternatively, the cut-out image data f4 and the map data 108 may be held on data servers, and the system may be realized over the Internet with clients and data servers.
  • The image data management system 10 with geographic coordinates consists of an image capturing subsystem 20, an image data editing subsystem 30, an image data management subsystem 40, and a storage device 50 connected to the image data management subsystem 40.
  • The image capturing subsystem 20 comprises a moving means equipped with an image data collection means 24, connected to an image sensor 22, that collects images of the surrounding environment along a link as collected image data dl together with the shooting time, and a position data collection means 25, connected to a hybrid position sensor 23, that collects the shooting position coordinates of the images together with the coordinate measurement time as collected image attribute information d2.
  • the collected image data dl from the above-described image capturing subsystem is a group of images captured at regular intervals within the link.
  • The hybrid position sensor 23 also includes a speed sensor, an acceleration sensor, an angular acceleration sensor, and the like.
  • a link refers to a road between two points (from one intersection to the next intersection) having the graph structure shown in FIG.
  • One endpoint is marked as the start point and the other as the end point, but this does not imply a direction of the link from start point to end point.
  • the moving means refers to an aircraft for flying along a link and shooting from above, or a vehicle 21 traveling on a link road.
  • The image data editing subsystem 30 includes a map data display means 31 for displaying links, an image data display means 32 for displaying the image data collected by the image data collection means 24, a link start and end point passage time registration means 33 that records the passage times of the start and end points of a link from the shooting time and the coordinate measurement time (GPS time), an image thinning means 34 that thins out data from the multiple collected image data dl in a link at regular intervals, and an actual photographing management information registration means 35.
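The image thinning means 34 might, in its simplest form, keep one frame out of every fixed number of frames; a hypothetical sketch:

```python
# Hypothetical sketch of the image thinning means 34: keep one frame
# out of every `step` frames so that the retained images are spaced at
# regular intervals along the link.
def thin_images(frames, step):
    """Return every step-th frame, always keeping the first one."""
    if step < 1:
        raise ValueError("step must be >= 1")
    return frames[::step]
```

Thinning by a fixed distance interval (using the interpolated positions) would be an equally plausible reading; the frame-count version is shown only because it is the simplest.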
  • The image data management subsystem 40 includes a shooting plan means 41 for managing the shooting plan of the image capturing subsystem 20, a shooting information capturing means 42 for capturing the collected image data dl and collected image attribute information d2 edited by the image data editing subsystem 30, a data extraction means 44 for extracting information meeting specified conditions from the information recorded in the storage device 50, and a management means 43 for registering information in the storage device 50, deleting information from the storage device 50, and printing information in the storage device 50.
  • The storage device 50 includes a graph structure database 51 that records link information, a camera information database 52 that records information on the image sensor 22, a plan/actual shooting information database 53 that records shooting plan information, a collected image database 55 and a collected image attribute information database 56 that respectively record the collected image data dl and collected image attribute information d2 captured by the shooting information capturing means 42, and a link related information database 54 that records, as shown in FIG. 15, management information associating the information of each database for each link.
  • Data exchange between the image capturing subsystem 20, the image data editing subsystem 30, and the image data management subsystem 40 is performed via a portable storage medium such as a disk.
  • Alternatively, each subsystem may be provided with data communication means, and the exchange may be performed by wired or wireless data communication.
  • In this embodiment, the vehicle 21 is taken as an example of the moving means, but the moving means is not limited to the vehicle 21.
  • the image capturing subsystem 20 is a subsystem intended to capture a peripheral image along a link used in the embodiment of the present invention.
  • the image data collection means 24 captures an image while traveling on a road as a link using one or a plurality of synchronized cameras attached to the vehicle 21 as the image sensor 22.
  • The time transmitted from the Global Positioning System (GPS) and detected by the hybrid position sensor 23 is added to each time-series frame, and the frames are recorded in the temporary memory 26 as the collected image data dl.
  • The time is given to each image frame in units of 1/1000 second using the detected GPS time and the clock of the computer on which the image capturing subsystem 20 is mounted.
  • the position data collecting means 25 uses a hybrid position sensor 23 combined with GPS and other sensors (gyro, vehicle speed sensors) attached to the same vehicle 21 to obtain the position obtained from the GPS. Information (coordinates), vehicle speed information obtained from the vehicle speed sensor, acceleration information and direction information obtained from the gyro are collected. Further, the GPS time is assigned to these pieces of information and recorded in the temporary memory 26 as the collected image attribute information d2.
  • The position data collecting unit 25 is provided with a process for instructing the image data collecting unit 24 not to collect image data, so that shooting can be stopped while the vehicle is stopped. As a result, when the vehicle is stopped at a traffic light or in congestion, recording of image data can be paused to save data collection capacity.
  • The collected image data dl and the collected image attribute information d2 are assigned time information using a common GPS time, as shown in the figure, so the common GPS time can be used as the medium for associating positions with images.
  • the camera direction can be obtained from the relative direction of the camera relative to the vehicle 21 and the direction of the vehicle 21.
  • The relative orientation is measured in advance. Since the image data collection cycle is normally shorter than the position data cycle, the position, speed, and direction at each image frame are obtained by interpolating between the positions, speeds, and directions at the preceding and following measurement times.
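Because image frames arrive faster than position samples, one plausible reading of this step is linear interpolation between the bracketing position samples on the common GPS time base; a hypothetical sketch:

```python
# Hypothetical sketch: estimate the position at an image frame's GPS
# time by linearly interpolating between the two surrounding
# position-sensor samples (which share the common GPS time base).
def interpolate_position(samples, t):
    """samples: list of (gps_time, x, y) sorted by time; t: frame time.
    Frame times outside the sampled range are clamped to the ends."""
    if t <= samples[0][0]:
        return samples[0][1:]
    if t >= samples[-1][0]:
        return samples[-1][1:]
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
```

Speed and direction at each frame could be interpolated the same way, with direction handled modulo 360°.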
  • The collected image data dl and the collected image attribute information d2 recorded in the temporary memory 26 are output to a portable storage medium such as a disc as the primary shooting information import file fl, and input to the image data editing subsystem 30 that follows.
  • The position data collection unit 25 reads the link shapes and link information dO contained in the shooting information export file f3 (described later), which is created by the shooting plan unit 41 of the image data management subsystem 40 and output to the portable storage medium, and superimposes on them, on the computer screen, the position information obtained from the hybrid position sensor 23. As a result, the operator can confirm for which of the target links data collection has been completed.
  • The image data editing subsystem 30 is a subsystem that edits the collected image data collected by the image capturing subsystem 20 and output as the primary shooting information import file fl so that they can be used in the embodiment of the present invention, and that associates and optimizes them with map information having a graph structure, such as a route network, using the shape information of the link information created by the shooting plan means 41 of the image data management subsystem 40 (described later) and output as the shooting information export file f3.
  • The map data display means 31 has, as input information, the primary shooting information import file fl output from the image capturing subsystem 20, which has the same structure as the physical image database described later.
  • the link shape is displayed from the time-series position information included in the collected image attribute information d2 constituting the primary shooting information import file fl and the map information included in the shooting information export file f3.
  • the time series position information is displayed as a point sequence, and the link shape is displayed as a continuous line segment.
  • the image data display means 32 displays the collected image data dl of the primary imaging information import file fl as shown in FIG.
  • the image collection position is displayed as a symbol in the map displayed by the map data display means 31.
  • the display of image data can be sequentially switched back and forth in time series.
• the link start / end point passage time registration means 33 takes the link selected by mouse on the map of FIG. 4 displayed by the map data display means 31, or a link selected in sequence, and specifies and registers the passage times corresponding to its start and end points. The passage time can be specified in two ways: from the time of the time-series position corresponding to the start or end point, using the time-series position information included in the collected image attribute information d2 of the primary shooting information import file fl, or from the image shooting time given to the collected image data dl of the primary shooting information import file fl.
• the passage times of the link start point and end point are saved in the actual shooting result information d3, in the same format as the shooting plan export file, along with information such as the shooting date and time, the camera information key ID used for shooting, and the physical storage location of the stored image (such as the hard disk volume name).
  • the automatic designation method is achieved by searching for a time-series position closest to the start point or end point of the link.
• You may also use the map matching technology used in car navigation systems, that is, technology that automatically matches positions to the route being traveled by considering not only the position but also the travel direction and speed. In this case, the closest point is not necessarily the one selected.
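The nearest-point designation described above can be sketched as follows. This is an illustrative example only; the data layout (tuples of time and coordinates) and function names are assumptions, not the patent's implementation, and the simple Euclidean nearest-fix search is the basic method before any map-matching refinement:

```python
import math

def nearest_fix_time(track, point):
    """Return the timestamp of the time-series position fix closest to
    a link endpoint (the automatic designation method in the text).

    track: list of (t, x, y) time-series position fixes
    point: (x, y) coordinates of the link start or end node
    """
    px, py = point
    best = min(track, key=lambda f: math.hypot(f[1] - px, f[2] - py))
    return best[0]

# Hypothetical track passing a link whose start node is at (0, 0)
# and whose end node is at (30, 0).
track = [(100, -5.0, 0.4), (101, 2.0, 0.1), (102, 11.0, -0.2),
         (103, 21.0, 0.3), (104, 29.5, 0.0), (105, 38.0, 0.5)]

start_t = nearest_fix_time(track, (0.0, 0.0))   # fix at t=101 is closest
end_t   = nearest_fix_time(track, (30.0, 0.0))  # fix at t=104 is closest
# Start time earlier than end time: traveled start -> end on this link,
# matching the direction judgment described later in the text.
direction = "start->end" if start_t < end_t else "end->start"
```

A map-matching variant would additionally weight candidates by travel direction and speed, so the chosen fix need not be the geometrically closest one.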
  • the collected image data dl corresponding to the start time of the link and the end time is displayed on the image display screen.
  • the previous or next image is displayed in time series by operating the time series screen.
  • the image capture time of the image at the intersection determined by the operator can be corrected and registered as the transit time of the link start point or end point.
• if the start point passage time of the link registered by the link start / end point passage time registration means 33 is earlier than the end point passage time, it is judged that the image was taken while traveling from the start point to the end point on the corresponding link. Conversely, if the start point passage time is later than the end point passage time, it is judged that the image was taken while traveling from the end point toward the start point on the corresponding link.
  • the link start and end passage times registered by the link start / end point passage time registration means 33 are output to the secondary imaging information import file f2.
  • the image thinning means 34 thins the collected image data dl of the primary shooting information import file fl in accordance with the moving distance or the like for the purpose of reducing the data capacity.
• as a thinning method, as shown in FIG. 6, the following can be used: starting from one frame, the distance from the starting frame to the i-th frame is calculated for each frame using the formula [Equation 1].
• by thinning out image frames to an extent that poses no problem for human vision, the amount of data can be reduced, the amount of memory required for storage can be reduced, and the system can be built inexpensively. Furthermore, with less data subject to image display processing, the display processing speed improves and the burden on display devices such as computers is reduced.
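The distance-based thinning can be sketched as below. [Equation 1] itself is not reproduced in this text, so this sketch assumes a plain Euclidean distance criterion between kept frames; the data layout and the interval value are illustrative assumptions:

```python
import math

def thin_by_distance(frames, d):
    """Keep only frames spaced at least d metres apart along the track.

    frames: list of (t, x, y) image frames with position.
    Always keeps the first frame; a rough stand-in for the cumulative
    distance test of [Equation 1].
    """
    kept = [frames[0]]
    for f in frames[1:]:
        last = kept[-1]
        # Distance from the last kept frame to the candidate frame.
        if math.hypot(f[1] - last[1], f[2] - last[2]) >= d:
            kept.append(f)
    return kept

# Hypothetical frames 1 m apart along a straight road.
frames = [(t, float(t), 0.0) for t in range(10)]
thinned = thin_by_distance(frames, 3.0)  # keeps frames at x = 0, 3, 6, 9
```

With a thinning interval chosen so that consecutive views still overlap visually, the kept subset is what gets written onward, reducing storage and display load as the text notes.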
  • the image thinning means 34 is not particularly required.
• the actual shooting management information registration means 35 inputs, as the actual shooting management information d4, the attribute information at the time of actual shooting (shooting area information, operator, etc.) to be registered in the plan / actual shooting information database described later.
• the image data management subsystem 40 is a subsystem for the batch management of the various data groups needed from planning of the equipment to be used and management of progress, through management of the collected image data dl actually captured and edited, to the extraction and provision of collected image data dl that meets user requirements.
• the shooting plan means 41 is composed of a screen display unit 411, a link selection unit 412, a camera information input unit 414, a shooting plan information input unit 413, and a shooting information export file creation unit 415.
  • the screen display unit 411 displays the node and link of the graph structure database 51 recorded in the storage device 50 on the computer screen.
  • the link selection unit 412 selects and determines a link for which shooting is planned from among the links displayed by the screen display unit 411.
• Link selection methods include rectangular area specification by mouse operation on the screen, selection by position such as polygon area specification, selection by attribute information included in the graph structure information such as road width and type, and composite selection combining position and attribute information.
• the purpose is achieved if, for example, only prefectural roads are first extracted by selection based on attribute information, and the municipality boundary is then designated by polygon specification.
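The composite selection just described (attribute filter first, then a polygon area filter) can be sketched as follows. The link records, field names, and the use of a link midpoint for the positional test are illustrative assumptions, not the patent's data model:

```python
def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test. poly: list of (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count crossings of a horizontal ray to the right of the point.
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def select_links(links, road_type, poly):
    """Attribute selection first, then positional selection by polygon."""
    by_attr = [l for l in links if l["type"] == road_type]
    return [l for l in by_attr if point_in_polygon(l["mid"], poly)]

links = [
    {"id": 1, "type": "prefectural", "mid": (1.0, 1.0)},
    {"id": 2, "type": "prefectural", "mid": (9.0, 9.0)},  # outside polygon
    {"id": 3, "type": "municipal",   "mid": (1.5, 1.5)},  # wrong type
]
square = [(0, 0), (4, 0), (4, 4), (0, 4)]  # stand-in municipality boundary
selected = select_links(links, "prefectural", square)
```

A production system would test the whole link shape against the polygon rather than a single midpoint, but the two-stage filter order is the same.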
  • the camera information input unit 414 records the input camera information used for shooting in the storage device 50 as the camera information database 52. At this time, the camera information key ID specified automatically or arbitrarily is assigned so that individual camera information can be identified.
• the shooting plan information input unit 413 records the input shooting plan, such as the shooting area (the link group selected by the link selection unit 412), the scheduled shooting start date, the shooting operator, and the camera information key ID of the camera to be used, in the storage device 50 as the plan / actual shooting information database 53. At this time, an automatically or arbitrarily designated shooting plan key ID is assigned so that individual shooting plans can be identified.
• the shooting information export file creation unit 415 combines the link selected by the link selection unit 412 with the information input by the shooting plan information input unit 413, and records them as link related information, in the format shown in FIG. 5, in the link related information database 54 in the storage device 50. At this stage, the link related information and the link shape recorded in the graph structure database are output to the portable storage medium as the shooting information export file f3.
  • the shooting information capturing means 42 includes an actual shooting management information recording unit 421 and an image data recording unit 422.
• the actual shooting management information recording unit 421 assigns an actual shooting key ID to the actual shooting management information d3, such as the photographer at the time of shooting, input from the image data editing subsystem 30 as the secondary shooting information import file f2, and records it in the plan / actual shooting information database 53.
  • the actual shooting ID should be a combination of the shooting date and the camera information key ID used, etc., so that IDs do not overlap.
• the shooting information related to the shot link, that is, the shooting date and time, the camera information key ID used for shooting, the passage times of the link start point and end point, the physical storage location of the stored image (such as the hard disk volume name), and so on, is written as the actual shooting result information d3 into the link related information database 54, updating the records created and saved for each link key ID by the shooting information export file creation unit 415.
  • the image data recording unit 422 collects the collected image data dl and the collected image attribute information d2 input from the image data editing subsystem 30 as the secondary shooting information import file f2, and the collected image database 55 and the collected image attribute information d2, respectively. Record in the image attribute information database 56.
  • the management unit 43 includes a network registration unit 431, a printing unit 432, and a maintenance unit 433.
  • the network registration unit 431 reads route network data created and distributed externally and registers it in the graph structure database 51.
  • the printing unit 432 prints the information of each database recorded in the storage device 50 using a printer.
• the maintenance unit 433 performs periodic backup of each database recorded in the storage device 50 and deletes unnecessary data (such as shooting plan information for shooting that will no longer take place). In addition, by searching the work progress codes in the link related information database 54, which reflects the shooting plan information dO and the actual shooting management information d3, the shooting progress can be totaled.
• the data extraction means 44 includes an extraction condition input unit 441, a graph structure information extraction unit 442, a link related information extraction unit 443, a collected image data extraction unit 444, and a collected image attribute information extraction unit 445.
  • the extraction condition input unit 441 inputs conditions for extraction from the database.
  • Conditions include positional conditions, link attribute conditions, and shooting attribute conditions.
  • the positional conditions include a rectangular area and a polygon area obtained by operating the mouse on the screen or reading a file.
  • the link attribute condition includes designation of attribute information included in the graph structure information such as road width and type.
  • the shooting attribute condition includes designation of a shooting date and a shooting time zone.
• the graph structure information extraction unit 442 extracts, from the graph structure database 51, the links that match the positional condition and the link attribute condition input by the extraction condition input unit 441.
• the extracted link IDs and the node IDs at both their ends are checked, the link IDs connected to each node are listed, and the result is stored as the graph structure information of the links in the cut-out image data.
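Listing the link IDs connected to each node amounts to building an adjacency map over the extracted links. A minimal sketch, with illustrative IDs and a plain tuple layout that the patent does not prescribe:

```python
def build_adjacency(links):
    """List the link IDs connected to each node, i.e. the graph
    structure information stored with the cut-out image data.

    links: list of (link_id, start_node_id, end_node_id)
    """
    adj = {}
    for link_id, n1, n2 in links:
        # Each link is registered under both of its endpoint nodes.
        adj.setdefault(n1, []).append(link_id)
        adj.setdefault(n2, []).append(link_id)
    return adj

# Hypothetical extracted links: N2 is an intersection of three roads.
links = [("L1", "N1", "N2"), ("L2", "N2", "N3"), ("L3", "N2", "N4")]
adj = build_adjacency(links)  # adj["N2"] lists all three link IDs
```

This is the network structure that later lets the display system follow roads through intersections when switching between cut-out images.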
  • the link related information extraction unit 443 extracts the link corresponding to the ID of the link extracted above from the link related information database 54. Further, link information that matches the photographing attribute condition input by the extraction condition input unit 441 is extracted from the information.
  • the collected image data extraction unit 444 extracts information that meets the conditions input by the extraction condition input unit 441, from the collected image database 55.
• the collected image data extraction unit 444 searches the collected image database for images photographed with the camera information key ID stored in each piece of link related information extracted above. Furthermore, the images photographed between the start point passage time and the end point passage time are extracted, cut out as image files, and stored as image files in the cut-out image data.
• the image acquisition link ID, the node ID on the shooting start side, the node ID on the shooting end side, and the shooting order are stored as image file attribute information in the cut-out image data.
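The clipping of frames between the two passage times can be sketched as below. The dictionary field names are illustrative assumptions; the patent only specifies that camera key ID and the start/end passage times select the frames, and that the travel direction follows from which passage time is earlier:

```python
def cut_out_images(frames, link):
    """Clip the frames shot during this link's pass, using its link
    related information (camera key ID, start/end passage times)."""
    t0, t1 = sorted((link["start_time"], link["end_time"]))
    clip = [f for f in frames
            if f["camera"] == link["camera"] and t0 <= f["t"] <= t1]
    # If the link was driven end -> start, reverse so the stored order
    # always runs from the link's start node toward its end node.
    if link["start_time"] > link["end_time"]:
        clip.reverse()
    return clip

# Hypothetical frames at one-second intervals from one camera.
frames = [{"t": t, "camera": "CAM1", "name": f"img{t}.jpg"}
          for t in range(100, 110)]
# Passage times say the car entered at the END node (t=103 < 107).
link = {"camera": "CAM1", "start_time": 107, "end_time": 103}
clip = cut_out_images(frames, link)  # frames t=107 down to t=103
```

Storing frames in start-to-end order regardless of driving direction is one way to keep the per-link image sequences consistent; the patent leaves this detail open.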
  • the collected image attribute information extraction unit 445 extracts information that meets the conditions input by the extraction condition input unit 441 from the collected image attribute information database 56.
  • information such as image acquisition coordinates and image acquisition orientation corresponding to the image file extracted above is extracted and stored as image file attribute information in the cut-out image data.
• the cut-out image data f4 includes image files, image file attribute information, link related information, link graph structure information, and the like.
• Specifying the positional condition: cutting out by geographic mesh or administrative area. The links to cut out can be selected by clicking on the computer screen interface with the mouse or by specifying an area with the mouse.
• alternatively, outer polygons such as administrative areas may be registered in advance, and the links inside the outer polygon corresponding to the area designated by the user may be extracted.
• Shooting attribute condition: by referring to the shooting date / time information included in the link related information database 54, images can be designated and extracted by their shooting time.
• the image corresponding to each link is extracted based on the information described in the link related information database 54. That is, from the collected image database 55 at the storage location in the storage device 50 described in the record corresponding to the link key ID of the link to be extracted, it suffices to extract the image frames that were captured by the camera identified by the camera information key ID and whose times fall between the start point passage time and the end point passage time.
• the cut out image data can include the following:
• Image file
• Image acquisition link ID: the ID of the road center line along which the image was captured
• the databases used in the embodiment of the present invention are roughly divided into two groups: an image information database 50a that manages the relationship between feature information having a graph structure and captured images, and a physical image database 50b that stores the actually captured images and their attached information.
• the image information database 50a includes a graph structure database 51 that stores the graph structure, a camera information database 52 that stores information attached to the shooting such as the camera used, a plan / actual shooting information database 53 that stores the shooting plan and actual shooting information corresponding to each link, and a link related information database 54 that associates the above information and the captured images for each link.
  • the graph structure database 51 stores feature information having a graph structure as shown in FIG.
  • a route network is used as geographic information
• the intersections correspond to the nodes of the graph structure, and the road center lines connecting the intersections correspond to the links.
  • each node is defined by geographic coordinates and identified by a node key ID.
• Each link is defined by the geographical coordinates (or node key IDs) of both ends and the link shape, and is identified by a link key ID.
• both ends of each link (road center line) are connected to nodes (intersections), and the node key IDs indicating the intersections at the start and end points of the link are given to each link. This reveals the connection relationship between nodes and links, that is, the network structure.
  • attached information can be attached to nodes and links.
• in the case of a route network, information such as the intersection name can be given to a node, and information such as the road width and regulatory information (such as one-way traffic) can be given to a link.
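The node/link structure just described can be sketched as two small record types. Field names and values here are illustrative, not the patent's schema; they only mirror the stated contents (coordinates and key IDs for nodes, endpoint node key IDs plus shape and attached information for links):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    node_key_id: str
    x: float
    y: float
    name: str = ""          # attached information, e.g. intersection name

@dataclass
class Link:
    link_key_id: str
    start_node: str          # node key ID of the start intersection
    end_node: str            # node key ID of the end intersection
    shape: list = field(default_factory=list)  # intermediate (x, y) points
    width_m: float = 0.0     # attached information, e.g. road width
    one_way: bool = False    # regulatory information

# Hypothetical network fragment: one road center line between two nodes.
n1 = Node("N1", 0.0, 0.0, name="Sakura Crossing")
n2 = Node("N2", 120.0, 35.0)
l1 = Link("L1", n1.node_key_id, n2.node_key_id,
          shape=[(60.0, 20.0)], width_m=9.0)
```

Because each link carries both endpoint node key IDs, the network structure is recoverable by grouping links on their node fields, as the surrounding text explains.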
• device information such as the camera used for shooting, the car on which it is mounted, and the camera height is stored in the camera information database 52 and identified by a camera information key ID.
• the plan / actual shooting information database 53 records the shooting plan information dO and the actual shooting management information corresponding to each link, as shown in FIG., namely the operator, the camera information key ID of the camera used for the work, and the image shooting date. Each shooting plan and each set of actual shooting management information are identified by a shooting plan key ID and an actual shooting key ID, respectively.
• the link related information database 54 has a table structure containing, for each link key ID, a series of information such as the shooting plan key ID, the actual shooting key ID, a work progress code indicating the shooting status (planned, shot, edited), the shooting date, the camera information key ID used for shooting, the passage times of the link start and end points, and the physical storage location of the stored image (such as the hard disk volume name).
  • the link related information database 54 is a table that can have a plurality of records having the same link key ID, as indicated by the hatched portion in FIG. For this reason, it is possible to store multiple shooting results for one link key ID.
  • the collected image database 55 and the collected image attribute information database 56 belong to the physical image database.
• the collected image database 55 contains a series of images actually taken in time series; the shooting camera information (camera information key ID) used to collect each image, the shooting date, and the shooting time are also recorded.
• the collected image attribute information database 56 stores information such as the position at the time of shooting, the speed at the time of shooting, the shooting direction, the shooting date, the shooting time (GPS time), and the shooting camera.
• since the information in the collected image database 55 and the information in the collected image attribute information database 56 both carry time information, they are associated with each other by that time information.
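The time-based association between the two databases can be sketched as a nearest-time join. The record layouts and the tolerance value are illustrative assumptions; the patent only states that both sides carry time information and are matched through it:

```python
def join_by_time(images, fixes, tol=0.5):
    """Associate each image record with the attribute record whose GPS
    time is nearest, within tol seconds."""
    out = []
    for img in images:
        best = min(fixes, key=lambda f: abs(f["t"] - img["t"]))
        if abs(best["t"] - img["t"]) <= tol:
            # Attach position and heading from the attribute record.
            out.append({**img, "pos": best["pos"], "heading": best["heading"]})
    return out

# Hypothetical records: image times and GPS fix times differ slightly.
images = [{"t": 10.0, "file": "a.jpg"}, {"t": 11.0, "file": "b.jpg"}]
fixes = [{"t": 10.1,  "pos": (0, 0), "heading": 90.0},
         {"t": 11.05, "pos": (5, 0), "heading": 92.0}]
joined = join_by_time(images, fixes)
```

A tolerance is needed in practice because the camera clock and GPS time are sampled independently; images with no sufficiently close fix are simply left unjoined here.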
  • the storage device 50 is not limited to one.
• the physical image database 50b may be stored in a storage medium physically different from that of the image information database 50a. In practice, it is divided and stored on large-capacity hard disks of 100 GB or more.
  • images of a plurality of periods can be simultaneously stored for one link.
  • Step S100 The link information from the external route network data f0 is registered in the graph structure database 51 by the management means 43 of the image data management subsystem 40.
• Step S101 The screen display unit 411 of the photographing plan means 41 of the image data management subsystem 40 displays the graph structure of the links from the link information of the graph structure database 51.
• Step S102 The link selection unit 412 of the photographing plan means 41 of the image data management subsystem 40 selects the link to be photographed from the displayed link graph structure.
  • Step S103 The camera information input unit 414 of the shooting plan means 41 of the image data management subsystem 40 registers information about the camera used for shooting in the camera information database 52.
• Step S104 The shooting plan information input unit 413 of the shooting plan means 41 of the image data management subsystem 40 prepares a shooting plan for the extracted shooting target links.
• Step S105 The shooting information export file creation unit 415 of the shooting plan means 41 of the image data management subsystem 40 outputs the shooting plan information dO to the shooting information export file f3.
  • Step S106 The link shape in the information of the shooting information export file f3 is read into the image shooting subsystem 20 and the link to be shot is displayed.
• Step S107 Collected image data dl is obtained by capturing images while actually traveling on the road of the link by the moving means (vehicle 21), with the shooting time given by the image data collection means 24 of the image capturing subsystem 20.
• Step S108 Likewise, the position data collection means 25 of the image capturing subsystem 20 collects the collected image attribute information d2.
  • Step S109 The image capturing subsystem 20 outputs the collected image data dl and the collected image attribute information d2 together with the actual shooting result information to the primary shooting information import file fl.
• Step S110 The image data editing subsystem 30 reads the primary shooting information import file fl and the shooting information export file f3.
  • Step S111 Register actual photographing management information.
  • Step S112 The map data display means 31 of the image data editing subsystem 30 displays the map data.
  • Step S113 The image data display means 32 of the image data editing subsystem 30 displays the images collected for the link designated on the map data.
  • Step S114 Time information is obtained from the displayed collected image data dl and the collected image attribute information d2, and the link start / end time registration means 33 of the image data editing subsystem 30 determines the link information. Record the passage time of the start and end points.
  • Step S115 Furthermore, the image thinning means 34 of the image data editing subsystem 30 thins out the image from the collected image data dl as necessary to reduce the data amount.
• Step S116 Output the edited information to the secondary shooting information import file f2.
  • Step S117 The information of the secondary shooting information import file f2 is input to the image data management subsystem 40.
  • Step S118 The actual photographing information data is recorded in the plan / actual photographing information database 53 by the photographing information capturing means 42 of the image data management subsystem 40.
  • Step S119 The photographing information capturing means 42 of the image data management subsystem 40 updates the information in the link related information database 54.
  • Step S120 The data extraction means 44 of the image data management subsystem 40 extracts data from the collected image database 55 based on the information in the link related information database 54 according to the user's requirements, and provides link information (network information). Etc. and output as cut-out image data f4.
  • Step S121 The geographic data-related image data display system 100 of the present invention displays a map and a surrounding image of a designated link (road) by using the cut-out image data f4 and commercially available map data.
• the following effects can be obtained by the image data management system with geographic information described above; by reading the created cut-out image data with the image data display system with geographic information 100, the local image corresponding to the user's request can be displayed.
• Camera information used for shooting (the shooting height of the camera, the identification number of the camera itself, and the camera calibration information for each identification number) is centrally managed as the camera information database 52, so measurements can be made later using the captured images.
• the actually captured images can be associated with, and managed against, map information having a graph structure such as a complex route network containing three-way junctions and intersections where five or more roads meet.
  • the primary shooting information import file fl in Figs. 1, 2, 3, and 7 includes image collection data dl with time and collection image attribute information d2.
• the collected image attribute information d2 includes position information (coordinates) obtained from GPS, vehicle speed information obtained from a vehicle speed sensor, acceleration information and direction information obtained from a gyro, GPS time, and the like.
  • the shooting position display means 102 reads the collected image attribute information d2 of the cutout data f4.
• This time-series information (image acquisition coordinates) can be displayed as symbols (for example, circles; hereinafter called shooting positions) as shown in FIG.
• FIG. 31 (a) shows the shooting positions when images were shot at regular time intervals while the car moved at varying speeds.
• Fig. 31 (b) shows the collected image data dl (with shooting time) captured at regular intervals and stored in the primary shooting information import file fl, arranged along the distance axis (coordinates).
• the map data display means 31 in Fig. 1 reads, for example, the image acquisition coordinates of the collected image attribute information d2 associated with the times of the collected image data dl, and assigns the image acquisition coordinates to a memory defined in the distance-axis (X-Y) coordinate system.
• the image thinning means 34 sequentially sets the thinning interval d (a distance small enough to pose no problem for human vision) along the distance axis from the shooting start point, and checks whether image acquisition coordinates (shooting positions) of the collected image data dl exist there. The image acquisition coordinates (shooting positions) found are assigned to the distance axis as representative shooting points (simply called shooting positions below), and only the collected image data dl and the collected image attribute information d2 related to the assigned image acquisition coordinates (shooting positions) are extracted and stored in the secondary shooting information import file f2.
• time is used as the extraction key. That is, only the collected image data dl and the collected image attribute information d2 at the times corresponding to the bold-line locations (representative shooting points) in the figure are extracted.
• the link start / end point passage time registration means 33 extracts from the shooting information export file f3 the actual shooting result information d3 having the times of the collected image attribute data d2 that includes the position information of these representative shooting points, and stores it in the secondary shooting information import file f2.
• the actual photographing management information registration means 35 extracts from the shooting information export file f3 the actual photographing management information d4 (shooting area information, operator information, etc.) having the times of the collected image attribute data d2 that has the image acquisition coordinates (shooting positions) of these representative shooting points, and stores it in the secondary shooting information import file f2.
• the secondary shooting information import file f2 thus stores the collected image data dl, the collected image attribute information d2, the actual shooting result information d3, the actual shooting management information d4, and so on, for each representative shooting point.
• the actual shooting management information recording unit 421 of the shooting information capturing means 42 of the image data management subsystem 40 assigns an actual shooting key ID to the actual shooting management information d3 of the secondary shooting information import file f2 and records it in the plan / actual shooting information database 53.
  • the image data recording unit 422 records the collected image data dl and the collected image attribute information d2 of the secondary imaging information import file f 2 in the collected image database 55 and the collected image attribute information database 56, respectively.
  • the actual shooting management information d4 is stored in the plan / actual shooting information database 53.
• the data extraction means 44 consists of an extraction condition input unit 441, a graph structure information extraction unit 442, a link related information extraction unit 443, a collected image data extraction unit 444, and a collected image attribute information extraction unit 445.
  • the extraction condition input unit 441 inputs conditions for extraction from the database.
  • Conditions include positional conditions, link attribute conditions, and shooting attribute conditions.
• Positional conditions, also called spatial clipping conditions, include rectangular areas and polygonal areas obtained from mouse operations on the screen or from reading files. The various information (image files, attribute information, etc.) clipped out within these areas is called the cut-out image data.
  • the link attribute condition includes designation of attribute information included in the graph structure information such as road width and type.
  • the shooting attribute condition includes designation of a shooting date and a shooting time zone.
• the graph structure information extraction unit 442 extracts, from the graph structure database 51, the links that match the positional condition and the link attribute condition input by the extraction condition input unit 441.
  • the link related information extraction unit 443 extracts the link corresponding to the link ID extracted above from the link related information database 54. Further, link information that matches the photographing attribute condition input by the extraction condition input unit 441 is extracted from the information.
  • the collected image data extraction unit 444 extracts information that meets the conditions input by the extraction condition input unit 441, from the collected image database 55.
• the collected image data extraction unit 444 searches the collected image database for images taken on the shooting date with the camera information key ID stored in each piece of link related information extracted above. Furthermore, the images taken between the start point passage time and the end point passage time are extracted, cut out as image files, and stored as image files in the cut-out image data.
• the image acquisition link ID, the node ID on the shooting start side, the node ID on the shooting end side, and the shooting order are stored as image file attribute information in the cut-out image data.
  • the collected image attribute information extraction unit 445 extracts information that meets the conditions input by the extraction condition input unit 441 from the collected image attribute information database 56.
  • information such as image acquisition coordinates and image acquisition orientation corresponding to the image file extracted above is extracted and stored as image file attribute information in the cut-out image data.
• the data extraction means 44 stores in the storage device 106 all the cut-out image data f4 (graph structure information, link related information, image files, image file attribute information, and so on) corresponding to the representative shooting points within the cut-out condition area shown in Fig. 31 (d) (see FIG. 31 (f)).
  • the cut-out image data f4 is used to display a map and an image file by the image data display system 100 with geographic information association.
  • FIG. 27 is a more specific configuration diagram of the image data display system 100 with relation to geographic information.
• the image data display system 100 with geographic information association includes the storage device 106, which stores the annotation data related information f6 and the like.
• This annotation data related information f6 is preferably registered in advance by the annotation data registration display means 109.
  • the aforementioned map data 108 preferably has a hierarchical structure.
• For example, it has a hierarchical structure of layers such as the background map, network information, annotation data, and map symbol data.
• the annotation data is the name of a feature, and this name is associated with the coordinates of the annotation data.
  • the storage device 106 stores a plurality of pieces of cut-out image data f4 in the cut-out condition area. These cut-out image data f4 are preferably given an ID code. More specifically, the data structure is as shown in FIG.
  • the physical image database has a data structure shown in FIG. 30 more specifically.
  • the geographic display means 101 displays the background map, network information, note data, and map symbol in this region in an overlapping manner. At this time, only the note data in the vicinity of the road of the network data may be extracted and displayed.
  • the note data registration/display means 109 has a function of displaying note data on an image on the computer screen.
  • the note data registration/display means 109 searches for image acquisition coordinates that exist within a certain distance from the coordinates of the note data in the map data (for example, the coordinates of the character at the center of the note data), and among these image acquisition coordinates, finds the one closest to the note data.
  • the ID of the cut-out image data having the image acquisition coordinates is registered in advance in the storage device 106 as the annotation data related information f6 together with the annotation data and the coordinates of the annotation data.
  • the image of the image file linked to this ID is a 360-degree panoramic image, and the center position of the image corresponds to the camera center (0 degrees).
  • the coordinates of the annotation data on the image are calculated at the eye-level position in the direction of θk from the camera center.
  • the means has a memory that defines a field for displaying note data (an X-Y coordinate system corresponding to the coordinate system of the panoramic image), and the coordinates of the note data on this memory define the note data display position on the image.
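The nearest-point search and the on-panorama placement described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: all function and variable names are assumptions, a planar X-Y coordinate system is assumed for the map, and the panorama is assumed equirectangular with the camera center at the image center.

```python
import math

def nearest_shooting_point(note_xy, shooting_points, max_dist):
    """Return (index, point) of the image acquisition coordinate closest to
    the note data coordinates, or None if none lies within max_dist."""
    best = None
    for i, (x, y) in enumerate(shooting_points):
        d = math.hypot(x - note_xy[0], y - note_xy[1])
        if d <= max_dist and (best is None or d < best[0]):
            best = (d, i, (x, y))
    return None if best is None else (best[1], best[2])

def note_pixel_x(theta_k_deg, pano_width):
    """Map a direction theta_k (degrees from the camera center, 0 = image
    center) to an x pixel column of a 360-degree equirectangular panorama."""
    return int(((theta_k_deg % 360.0) / 360.0) * pano_width + pano_width / 2) % pano_width
```

The y coordinate would be fixed at the eye-level row of the panorama, as the text describes.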
  • the note data related information f6 includes the ID of the adjacent cut-out image data, the note data display position and direction (θk), the characters of the note data, and the like.
  • the geographic display means 101 reads out the map data corresponding to the input cut-out condition area from the storage device 106 and displays it on the computer screen.
  • This map data consists of background maps, note data, map symbol data, and the like.
  • the shooting position display means 102 extracts the image acquisition coordinates of all the cut-out image data f4 stored in the storage device 106, and positions on the map on the screen corresponding to these coordinates (for example, on the road) A symbol (photographing position symbol and label) indicating the image acquisition coordinates (photographing position) is displayed at the photographing point (see Figs. 23 and 24).
  • the photographing position display means 102 displays the designated photographing position symbol with a different symbol (for example, a cross or a double circle) or a different color.
  • the map traveling direction mark display means 107 determines whether the shooting order number of the image having the coordinates corresponding to the shooting position symbol is the start number or the end number within the same link.
  • the shot image display means 103 extracts the image file of the cut-out image data having the image acquisition coordinates of the shooting position symbol, and displays on the screen the image of the display range that the operator inputs for this image file.
  • the image file of the cut-out image data described above is a 360-degree panoramic image. For this reason, the operator designates a display range of about 50 degrees to the left and right, for example, centering on the shooting direction.
  • the on-map image display direction display means 110 displays the display range (from the shooting direction) based on the shooting direction from the image acquisition coordinates (shooting position) on the map on the computer screen.
  • a polygon is generated in the angle direction of the current display range (in the above case, 50 degrees left and right), and this polygon is overlaid on the map on the computer screen (refer to the fan of the map display range in Fig. 28).
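The display-range fan overlaid on the map can be sketched as a simple polygon: the shooting position plus points along an arc spanning the viewing angle. This is an illustrative assumption, not the patent's code; the function name, the radius, and the clockwise-from-north angle convention are all choices made here.

```python
import math

def view_fan_polygon(x, y, heading_deg, half_angle_deg=50.0, radius=30.0, steps=8):
    """Approximate the on-map display-range fan as a polygon: the shooting
    position followed by arc points spanning heading +/- half_angle."""
    pts = [(x, y)]
    for i in range(steps + 1):
        a = math.radians(heading_deg - half_angle_deg
                         + (2.0 * half_angle_deg) * i / steps)
        # angle measured clockwise from north (map convention assumed here)
        pts.append((x + radius * math.sin(a), y + radius * math.cos(a)))
    return pts
```

The resulting vertex list could then be drawn as an overlay polygon on the map, as in the fan of the map display range of Fig. 28.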
  • the on-image traveling direction mark display means 107 obtains the direction of the connection link (road) included in the shooting range in the shooting direction from the intersection.
  • the direction of the connection link is defined by an angle. For example, as shown in Fig. 32(a), when the shooting direction is the camera center (0 degrees), the direction of the center line of road ka from the camera center can be defined by θp, and the direction of the center line of road kb by θi.
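Such a link angle relative to the camera center might be computed as below. This is a sketch under assumptions (planar coordinates, bearings measured clockwise from north); the function name and conventions are not from the patent.

```python
import math

def relative_link_angle(cam_xy, node_xy, camera_heading_deg):
    """Angle of a connection link (from the camera position toward the far
    node of the link) relative to the camera center (0 degrees = shooting
    direction), normalised to (-180, 180]."""
    dx, dy = node_xy[0] - cam_xy[0], node_xy[1] - cam_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))  # clockwise from north (assumed)
    rel = (bearing - camera_heading_deg) % 360.0
    return rel - 360.0 if rel > 180.0 else rel
```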
  • the captured image display means 103 sequentially extracts the image file of the shooting position of the road in the selected direction and displays it according to the display range.
  • the note data registration/display means 109 reads the designated notes (00 apartment, 00 hospital, 00 office), and when the ID of the image being displayed matches an ID of image data registered in the note data related information f6, the nearby note data is extracted from the map data. Next, the extracted note data is displayed on the image on the computer screen.
  • This annotation data is displayed by allocating the annotation data related information f6 having the image acquisition coordinates (shooting position) when the shooting position symbol on the road on the computer screen has moved to that shooting position. Then, the character string of the note data is displayed on the computer screen according to the note data display position and direction of the allocated note data related information f6 (see Fig. 24).
  • When the cut-out image data fi is generated, images related to privacy, such as faces and license plates, may also be present.
  • Such privacy-related images may be subjected to a blurring process with a Gaussian filter or the like, and stored in the storage device 106 as the cut-out image data fi.
  • privacy-related images are displayed in a blurred manner on the display unit.
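The Gaussian blurring of a privacy region can be illustrated with a tiny pure-Python sketch. This is only a 1-D demonstration of the principle (a 2-D blur applies the same kernel along rows and then columns); in practice an image library's Gaussian filter would be used, and all names here are assumptions.

```python
import math

def gaussian_kernel(radius, sigma):
    """Normalised 1-D Gaussian kernel of length 2*radius + 1."""
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur_region_1d(row, x0, x1, radius=2, sigma=1.0):
    """Blur only the pixels row[x0:x1] (e.g. a face or license-plate span)
    with a 1-D Gaussian kernel, clamping samples at the region edges."""
    kern = gaussian_kernel(radius, sigma)
    out = list(row)
    for x in range(x0, x1):
        acc = 0.0
        for j, w in enumerate(kern):
            xx = min(max(x + j - radius, x0), x1 - 1)
            acc += w * row[xx]
        out[x] = acc
    return out
```

Pixels outside the privacy region are left untouched, matching the selective blurring the text describes.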
  • the road network and the background map of the shooting plan area included in the shooting plan information dO are displayed on the screen; these links are shot, and the acquired collected image data dl (all-around images: image files) and collected image attribute data d2 are managed according to the shooting time, and images of a predetermined rectangular area of these all-around images (also referred to as camera parameter designation display images) are displayed on the screen.
  • a path selection polygon for selecting a path is displayed.
  • a number of camera parameter designation display images in the course (link) corresponding to the selected course selection polygon are reproduced.
  • a map including a link is displayed on the screen, and the configuration and operation for displaying the position of the camera parameter designation display image with a symbol are supplemented.
  • FIG. 34 is a schematic configuration diagram of the image data management system with geographic information association according to the second embodiment.
  • description of components having the same reference numerals as those in FIG. 1 is omitted.
  • the image capturing subsystem 20 of the image data management system with geographical information association includes an image sensor 22, a hybrid type position sensor 23, an image data collecting unit 24, and an actual image capturing management information input unit 72.
  • the actual shooting management information input means 72 displays a screen for inputting the actual shooting management information before the start of shooting; the operator name, actual shooting camera information key ID, and actual shooting date (time) are input on this input screen and stored in a memory (not shown).
  • the actual shooting management information input means 72 opens the above-described input screen, and a shooting result (shooting completed, shooting not possible, etc.) is input.
  • the actual photographing management information input means 72 stores the actual worker name, actual photographing date, photographing result, and the like input on the input screen in the temporary memory 26 as actual photographing input information d3a.
  • an actual shooting key ID combining the actual shooting date and the actual shooting camera information key ID is generated and included in the actual shooting input information d3a.
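Combining the actual shooting date and the camera information key ID into a single key could look like the following. The exact key format is not given in the text, so the `YYYYMMDDHHMM_<camera key>` layout and the function name are assumptions for illustration.

```python
from datetime import datetime

def make_actual_shooting_key_id(shoot_dt, camera_key_id):
    """Combine the actual shooting date/time and the actual shooting camera
    information key ID into one actual-shooting key ID (format assumed)."""
    return f"{shoot_dt:%Y%m%d%H%M}_{camera_key_id}"
```

The same scheme would apply to the shooting plan key ID, which the text likewise describes as a combination of the scheduled shooting date and the camera key ID.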
  • the actual shooting plan key ID may be entered manually.
  • the collected image data dl, the collected image attribute data d2, and the actual shooting input information d3a are stored as a set in the temporary memory 26 and output to the primary shooting information import file fl.
  • the image data editing subsystem 30 has the same configuration as that shown in FIG. 1, and includes a large number of collected image data dl, a large number of collected image attribute data d2, and actual shooting for each link in the secondary shooting information import file f2. Result information d3b and actual photographing management information d4 are stored. This actual shooting result information d3b is the same as the actual shooting result information d3 shown in Fig. 1.
  • the image data management subsystem 40 includes a photographing plan means 41, a photographing information capturing means 42 (see FIG. 34), and a management means 43, which are the same as those in FIG. 1, and further includes a data extraction means 73.
  • the above-described data extraction means 73 inputs the extraction area (link ID or area) and the shooting time (year, month, day, time, forward or reverse direction); the link related information having the link ID of the input extraction condition is allocated from the link related information database 54, and this link ID, shooting date/time, collected image data dl, and collected image attribute data d2 are output to the import file f4.
  • the data extraction means 73 reads the link ID, and if a link ID of the same number exists in the link related information database 54, and the shooting plan key ID and actual shooting key ID are written in the records of these link IDs so that the actual shooting information (actual shooting result information) is associated, the collected image data dl, collected image attribute information data d2, link related information, map data, etc. of all the link related information having the same link ID are extracted as cut-out data and output (stored) to the import file f4.
  • Figure 34 is divided into multiple sections to emphasize that it is cut-out data that includes information with different shooting times.
  • this cut-out image data is extracted from the screen shot at different times on the same link.
  • Figure 37 shows images extracted by shooting on the same link at different dates.
  • 37-A1 shows an image taken in the morning (part of the all-around image).
  • 37-A2 below 37-A1 shows a map with the shooting position.
  • 37-B1 shows an image taken at night (part of the all-around image), and 37-B2 below 37-B1 shows a map with the shooting position.
  • Fig. 38 shows images extracted by shooting on the same link in different movement directions.
  • 38-A1 shows an image (part of the all-around image) taken in the north lane (front), and 38-A2 below 38-A1 shows a map with the shooting position.
  • 38-B1 shows an image (part of the all-around image) taken when driving in the south lane (reverse direction), and 38-B2 below 38-B1 shows a map with the shooting position.
  • the shooting plan means 41 of the image data management subsystem 40 displays a road network including the links to be shot on the screen (with a background map), and a link is selected. At this time, an input screen (not shown) for the shooting plan is displayed, and the link ID of the selected link is automatically written into this shooting plan input screen. Also, the camera information of the camera information database 52 is displayed on the screen, and the camera key ID of the selected camera information (the camera scheduled to shoot) is written to the shooting plan input screen.
  • the shooting plan means 41 displays a shooting date (including time) on the shooting plan input screen.
  • a shooting plan key ID is generated by combining the scheduled shooting date and the camera key ID scheduled to be shot.
  • the shooting plan means 41 saves the shooting plan information dO (shooting plan key ID, link ID, scheduled worker, scheduled camera information key ID, scheduled shooting date, etc.) in the plan/actual shooting information database 53. At this time, it is preferable that the hard disk volume name and the work progress code (not photographed) are input and included in the shooting plan information dO.
  • the shooting plan means 41 reads the shooting plan information dO from the plan/actual shooting information database and writes the link ID, shooting plan key ID, scheduled shooting date, and scheduled shooting camera information key ID into the record of the link related information database 54. At this time, the hard disk volume name and work progress code are also written.
  • the record in the link related information database 54 described above has areas in which the link ID, shooting plan key ID, actual shooting key ID, work progress code, shooting result, shooting date, start point passing time, end point passing time, and storage location are written.
  • the imaging plan means 41 reads the work plan information dO having the same link ID and stores it in the link related information database 54. That is, link related information having the same link shown in FIG. 15 is generated in advance.
  • the imaging plan means 41 records such link related information in the link related information database 54 as fe.
  • the shooting plan means 41 sequentially reads these link related information records and, for each record read, outputs the shooting plan information dO (the record's link ID, shooting plan key ID, scheduled shooting date, scheduled worker, etc.) to the shooting information export file f3.
  • The shooting plan information dO includes the road network and the background map.
  • the position data collection means 25 of the image capturing subsystem 20 displays the road network and background map of the shooting plan information dO in the shooting information export file f3 to confirm the links to be shot.
  • a plurality of collected image data dl and collected image attribute data d2 in these links are output to the primary imaging import file fl.
  • the actual shooting management information input means 72 opens the above-mentioned input screen, and the shooting result (shooting completed, shooting not possible, etc.), actual worker name, actual shooting date and time, and actual shooting camera information key ID are input.
  • the actual shooting management information input means 72 outputs the shooting plan key ID, actual shooting key ID, shooting result (taken), work progress code (taken), actual worker name, actual shooting date, storage location, etc. (the actual shooting input information d3a) to the primary shooting information import file fl.
  • the image data editing subsystem 30 reads the actual shooting input information d3a stored in the primary shooting information import file fl, links the actual shooting key ID of this actual shooting input information d3a with the shooting result (photographed), work progress code (photographed), and actual worker name, and saves them, together with the link ID of the shooting plan information dO, as the actual shooting management information d4 in the secondary shooting information import file f2.
  • the map data display means 31 of the image data editing subsystem 30 displays the map data, and the image data display means 32 displays the collected images of the link specified on the map data.
  • the operator displays the collected image data dl corresponding to the time of the end point of the selected link on the screen.
  • the operator looks at the roads and intersections photographed in the image, and if it is determined that the position is shifted at the intersection, etc., the operator operates the screen in chronological order. Display the previous or next image. Then, the photographing time of the image at the intersection determined by the operator is corrected and registered as the passage time of the start point or end point of the link.
  • This correction registration is needed because the image sensor 22 (camera) takes pictures at fixed time intervals while the vehicle moves at varying speeds (5 km/h, ..., 60 km/h, 80 km/h). For this reason, each link has multiple collected images and pieces of attribute information. In addition, since the vehicle stops or slows at intersections, there are a large number of collected images and attribute information at intersections, that is, in the vicinity of nodes. Therefore, it is necessary to determine the collected images and attribute information at the start point (node) and end point (node) of each link.
  • the one at the specified passing time is the collected image at the intersection, and the one between this start point and the end point is within the link.
  • Specifying the passage times of the start point and the end point thus assigns the most appropriate ones of the collected image data dl and attribute information data d2 to the link. This is also called image grouping (see Fig. 35).
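The image-grouping step just described, assigning to a link the collected images whose shooting times fall between the registered start-point and end-point passage times, can be sketched as a simple time-range filter. The function and record layout are assumptions for illustration.

```python
def group_images_to_link(records, start_time, end_time):
    """Assign to a link the collected-image records whose shooting time falls
    between the registered start-point and end-point passage times.
    `records` is a list of (shooting_time, image_id); times are comparable
    values (e.g. timestamps), and the link may have been shot in reverse."""
    lo, hi = (start_time, end_time) if start_time <= end_time else (end_time, start_time)
    return [img for t, img in records if lo <= t <= hi]
```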
  • when the shooting time (a) of a collected image and the shooting time (b) of another collected image in the link selected with the link start/end point passage time registration means 33 are designated, the shooting time (a) is assigned to the start point (Ea) of the selected link, and the shooting time (b) is assigned to the end point (Eb).
  • the passage time of the start point and the end point is added to the actual shooting input information d3a having this link ID, and stored as the actual shooting result information d3b in the secondary shooting information import file f2.
  • the image thinning means 34 of the image data editing subsystem 30 thins out the collected image data dl of the primary shooting information import file fl according to the moving distance, for the purpose of reducing the data capacity.
  • As for the thinning method, there is the following method, as shown in the figure: using one frame (collected image data, also called an image frame) as the starting point, the distance from the starting frame to the i-th frame is calculated by Equation 1 for each frame.
  • the image thinning means 34 uses the speed information and time information of the time-series information associated with the link ID for each shooting time (fixed interval), and obtains each distance Li (L1, L2, ...: each distance up to the end point) from a designated start point (for example, the start point of the link). For each distance that is an integral multiple of a predetermined fixed distance d, the attribute information (d2) having the position information whose distance Li is closest to that integral-multiple distance, together with the collected image (dl) of that attribute information, is extracted as extracted data from the many collected images and pieces of attribute information. That is, in the secondary import file f2, a large number of collected images dl and attribute information d2 after thinning within the link are stored for each link.
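The distance-based thinning described above, keeping the frame whose cumulative distance is nearest each integral multiple of the fixed distance, can be sketched as follows. The function name and the `(cumulative_distance, frame_id)` record layout are assumptions made for this illustration.

```python
def thin_by_distance(frames, step):
    """Distance-based thinning: `frames` is a list of (cumulative_distance, frame_id)
    pairs in shooting order; for every integral multiple of `step`, keep the frame
    whose cumulative distance Li is closest to that multiple (the first frame, at
    the start point, is always kept)."""
    if not frames:
        return []
    kept = [frames[0][1]]
    target = step
    last_distance = frames[-1][0]
    while target <= last_distance:
        # frame whose cumulative distance Li is nearest to this multiple of step
        best = min(frames, key=lambda f: abs(f[0] - target))
        if best[1] != kept[-1]:
            kept.append(best[1])
        target += step
    return kept
```

For example, with frames at cumulative distances 0, 4, 9, 14, and 21 and a fixed distance of 10, the frames at 0, 9, and 21 would be kept.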
  • the secondary shooting import file f2 contains the actual shooting result information d3b including the link ID, shooting plan key ID, actual shooting key ID, and actual shooting date for each link.
  • the actual shooting management information d4, a large number of collected image data dl thinned out within the link, and these collected image attribute data d2 are stored as a set.
  • the shooting information capturing means 42 of the image data management subsystem 40 stores the actual shooting management information d4 for each link of the secondary shooting import file f2 in the plan/actual shooting information database 53 of the storage device 50, and reads the actual shooting result information d3b for each link.
  • it writes the actual shooting key ID of the actual shooting result information d3b, the actual shooting date, the actual shooting camera information key ID, the start time and end time, and the work progress code (edited). In other words, the contents of the record at the time of shooting planning are updated (the scheduled shooting date, camera information key ID, work progress code, and shooting result, where nothing had yet been written, and the storage location are newly written).
  • the collected image data dl of the links of the secondary shooting import file f2 are stored in the collected image database 55, and the large number of collected image attribute data d2 are stored in the collected image attribute information database 56.
  • the shooting information capturing means 42 re-reads the secondary shooting import file f2, and collects image data dl, collected image attribute data d2, and actual shooting result information d3b for each link (may be the same link). If actual shooting management information d4 exists, it is newly read.
  • the actual shooting key ID, actual shooting date, and actual shooting camera information key ID of the actual shooting result information d3b, the start time, the end time, and the work progress code (edited) are written in the record having the link ID and shooting plan key ID of this shooting result information d3b in the link related information database 54. In other words, the contents of the record at the time of shooting planning are updated (the scheduled shooting date, camera information key ID, work progress code, and shooting result, where nothing had yet been written, and the storage location are newly written).
  • the collected image data dl of the link of the secondary shooting import file f2 is stored in the collected image database 55, and the collected image attribute data d2 is stored in the collected image attribute information database 56.
  • the data extraction means 73 inputs the extraction area (link ID or area) and shooting time (year/month/day, time, forward/reverse direction), and the link related information having the link ID and shooting time of the input extraction condition is allocated from the link related information database 54.
  • a large number of collected image data dl and collected image attribute data d2 between the start point passage time and the end point passage time of the link related information related to this link ID are also read out and output to the import file f4.
  • link related information having a link ID corresponding to the extraction condition is read from the link related information database 54 and output.
  • the data extraction means 73 reads the link IDs, and if a link ID of the same number exists in the link related information database 54, and the shooting plan key ID and actual shooting key ID are written in the records of these link IDs so that the actual shooting information (actual shooting result information) is associated, the large number of collected image data dl and collected image attribute information data d2 of all the links having the same link ID are extracted and output (stored) to the import file f4.
  • the data extraction means 73 reads out the link related information having the link ID of the same number from the link related information database 54 and outputs it.
  • 40-C1 in Fig. 37 shows an example of link-related information.
  • This 40-C1 link-related information is a more specific example of Figure 15.
  • reference numeral 41-D1 in Fig. 38 shows an example of link related information.
  • This 41-D1 link related information is a more specific example of Fig. 15.
  • the data extraction means 73 outputs network information corresponding to the link type or area, as well as note data, map symbol data, and the like to the storage device 106.
  • the image data display system 100 with relation to geographic information includes a traveling direction arrow link determination unit 107 in addition to the photographed image display unit 103.
  • the captured image display means 103 includes a camera parameter designation image display unit 103a and an all-around image extraction unit 103b. Further, the traveling direction arrow link determination means 107 includes a selection arrow reading unit 107a, a connection point determination unit 107b, a traveling direction arrow creation unit 107c, and a selected link determination unit 107d. The collected image data dl of the extracted image data (also referred to as extracted data) are associated with the links.
  • the continuous display of the camera parameter designation display image is discontinued, and route selection is performed. In route selection, the node ID on the last displayed side is first checked from the attached information (also called attribute information) of the image file.
  • the graph structure included in the cut-out image data is examined, and the links connected to the node are searched. Furthermore, by examining the image acquisition link IDs included in the cut-out image data, if there is image data (collected image data) that captured a link, an arrow or click button (traveling direction selection polygon Pi) is displayed on each path displayed on the image. Next, the user designates which route's image to display.
  • the camera parameter designation image display unit 103a of the captured image display means 103 converts the image file (collected image data dl: all-around image) from the all-around image extraction unit 103b into a rectangular area using a method called texture mapping, used in CG (computer graphics) production, and displays it (the camera parameter designation display image).
  • the traveling direction selection polygons Pi (P1, ..., P4) from the traveling direction arrow creation unit 107c of the traveling direction arrow link determination means 107 are displayed so as to overlap the collected image data dl.
  • the all-around image extraction unit 103b reads the link related information corresponding to the shooting time from the storage device 10b, allocates the many collected image data dl having time information (shooting date and time) between the start point passage time and end point passage time of the link related information, reads them out in ascending or reverse order, and outputs them to the camera parameter designation image display unit 103a.
  • the all-around image extraction unit 103b allocates the attribute information d2 having this position, and outputs the collected image data dl related to this attribute information to the camera parameter designation image display unit 103a.
  • as a search method, the omnidirectional image extraction unit 103b can use the shooting time as the extraction condition and read the link related information having the shooting time of this extraction condition (search by same-link time). Then, when the link related information includes an actual shooting key ID, the all-around image extraction unit 103b extracts from the storage device 10b, in ascending or reverse order, the multiple collected image data dl having time information within the start point and end point passage times of the link related information, and outputs them to the camera parameter designation image display unit 103a.
  • the connection point determination unit 107b of the traveling direction arrow link determination means 107 determines, with reference to the storage device 10b, whether the position specified on the road map is a connection point (for example, an intersection). Further, the connection point determination unit 107b determines that a connection point has been reached when the all-around image extraction unit 103b extracts the start-point or end-point collected image.
  • At that time, a stop instruction is output to the all-around image extraction unit 103b to stop reading the collected image data dl (image files); that is, the continuous display of the collected image data dl (image files) is stopped.
  • When a connection point is determined, the traveling direction arrow creation unit 107c reads the connection point from the network information, reads the direction of each connected link from the node information of the links at this connection point, and outputs the direction information corresponding to these link directions, the preset display positions, and the traveling direction selection polygons Pi to the camera parameter designation image display unit 103a.
  • the traveling direction selection polygons P1, P2, P3, and P4 correspond to the four links at the connection point in the image handled by the camera parameter designation image display unit 103a.
  • the camera parameter designation display image (rectangular area) of this all-around image is displayed. Since this camera parameter designation display image covers only a predetermined field-of-view range (rectangular region) in the traveling direction, only two traveling direction selection polygons are displayed.
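Which of the connected links' selection polygons actually appear can be illustrated by filtering link directions against the displayed field of view. This is a sketch under assumptions (angles relative to the camera center, a symmetric field of view); the names are not from the patent.

```python
def polygons_in_view(link_angles_deg, half_fov_deg=50.0):
    """Given the directions of the connected links relative to the camera
    center (0 = current viewing direction), return the links whose traveling
    direction selection polygon falls inside the displayed rectangular field
    of view of +/- half_fov_deg."""
    def norm(a):
        a = a % 360.0
        return a - 360.0 if a > 180.0 else a
    return [a for a in link_angles_deg if abs(norm(a)) <= half_fov_deg]
```

With four links at an intersection, typically only the two whose directions lie within the displayed range would get visible polygons, matching the behaviour described above.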
  • the selection arrow reading unit 107a reads the selected position and notifies the selected link determination unit 107d when any of the traveling direction selection polygons P1 and P2 is selected.
  • the selected link determination unit 107d obtains the selected position in the all-around image (image file: collected image data dl) present in the camera parameter designation image display unit 103a, obtains the link having the obtained position coordinates from the network information of the storage device 10b, and outputs the link ID of the obtained link to the all-around image extraction unit 103b.
  • the omnidirectional image extraction unit 103b displays the image data (collected image data: omnidirectional image) obtained by capturing the link related to the link ID included in the cut-out image data in ascending or reverse order. .
  • When the start point time of the link is earlier and the end point time is later (forward travel), smaller numbers are assigned toward the start point and larger numbers toward the end point, and the images are read out in ascending order.
  • Conversely, when the start point time is later and the end point time is earlier (reverse travel), smaller numbers are assigned toward the end point and larger numbers toward the start point, and the images are read out from the smaller numbers, i.e., in reverse order.
  • 40-C1 in Figure 37 is a specific example of link related information; the top record of 40-C1 is the link related information for the morning shooting, and the next record is for the night shooting.
  • the top record of 41—D2 in FIG. 38 is link related information when taken in the morning, and the next record is link related information when taken in the afternoon. This link related information indicates that the picture was taken from the end point to the start point.
  • In the above description, the route has been described as a road, but it may also be a pipeline, a park, a mountainous area, a seaway, or the like.

Abstract

The invention relates to a system for displaying image data associated with map information, which comprises a storage for storing map data and cut-out image data, map display means for displaying a map generated from the map data, shooting position display means for displaying a shooting position on the map using the cut-out image data, shot image display means for displaying an image of the cut-out image data, route search processing means for searching for a route from a specified start point to a specified destination point on the basis of the map data, and image search means for searching for a specified image on the basis of the cut-out image data.
PCT/JP2007/071583 2006-12-12 2007-11-06 System for displaying image data associated with map information WO2008072429A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2006-334926 2006-12-12
JP2006334926 2006-12-12
JP2007106460A JP4210309B2 (ja) Map-information-associated image data display system and program for displaying map-information-associated image data
JP2007-106460 2007-04-13

Publications (1)

Publication Number Publication Date
WO2008072429A1 true WO2008072429A1 (fr) 2008-06-19

Family

ID=39511457

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/071583 WO2008072429A1 (fr) 2006-12-12 2007-11-06 System for displaying image data associated with map information

Country Status (1)

Country Link
WO (1) WO2008072429A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002269590A (ja) * 2001-03-07 2002-09-20 Mixed Reality Systems Laboratory Inc Image reproduction apparatus and method
JP2002269591A (ja) * 2001-03-07 2002-09-20 Mixed Reality Systems Laboratory Inc Image reproduction apparatus, image processing apparatus, and method
US20030142115A1 (en) * 2002-01-15 2003-07-31 Takaaki Endo Information processing apparatus and method
JP2005006081A (ja) * 2003-06-12 2005-01-06 Denso Corp Image server, image collection device, and image display terminal

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103148861B (zh) * 2011-12-07 2017-09-19 Hyundai Motor Company Road guidance display method and system using geotagged images
WO2013121471A1 (fr) * 2012-02-16 2013-08-22 Panasonic Corporation Image generation device
JP2015509239A (ja) * 2012-05-28 2015-03-26 Tencent Technology (Shenzhen) Company Limited Position search method and apparatus based on electronic map
US9489766B2 (en) 2012-05-28 2016-11-08 Tencent Technology (Shenzhen) Company Limited Position searching method and apparatus based on electronic map
US9646406B2 (en) 2012-05-28 2017-05-09 Tencent Technology (Shenzhen) Company Limited Position searching method and apparatus based on electronic map
JP2016110639A (ja) * 2014-12-05 2016-06-20 Ricoh Co., Ltd. Service system, information processing apparatus, and service providing method
JP2015165320A (ja) * 2015-05-01 2015-09-17 Nintendo Co., Ltd. Display system, display control device, information processing program, and display method

Similar Documents

Publication Publication Date Title
JP4210309B2 (ja) Map-information-associated image data display system and program for displaying map-information-associated image data
US20210049412A1 (en) Machine learning a feature detector using synthetic training data
JP3432212B2 (ja) Image processing apparatus and method
US11501104B2 (en) Method, apparatus, and system for providing image labeling for cross view alignment
CN103971589B (zh) Processing method and device for adding point-of-interest information of a map to street view images
JP5290493B2 (ja) Method of collecting geographic database information for use in a navigation system
US11590989B2 (en) Training data generation for dynamic objects using high definition map data
CN109891195A (zh) System and method for using visual landmarks in initial navigation
US20140132767A1 (en) Parking Information Collection System and Method
US20140301645A1 (en) Method and apparatus for mapping a point of interest based on user-captured images
CN105973236A (zh) Indoor positioning or navigation method and apparatus, and map database generation method
JP3972541B2 (ja) Map display method and map display apparatus
JP2006017739A (ja) Method of operating a navigation system using video
CN106294458A (zh) Map point-of-interest updating method and apparatus
KR20050094780A (ko) Method for producing a simplified status map using aerial photographs
CN103593450B (zh) System and method for building a street view spatial database
JP2003287434A (ja) Image information retrieval system
JP2007122247A (ja) Automatic landmark information creation method and system
WO2008072429A1 (fr) System for displaying image data associated with map information
CN107341213A (zh) Street view production method and system
JP2015032176A (ja) Method of searching for local tourist information based on the user's position
JP5883723B2 (ja) Three-dimensional image display system
JPH031300A (ja) System for creating and storing up-to-date data files for road traffic
JP2008243185A (ja) Image data management system with associated geographic coordinates
JP4068656B1 (ja) Geographic-information-associated image data management system and geographic-information-associated image data management program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07831315

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: COMMUNICATION NOT DELIVERED. NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112 EPC (EPO FORM 1205A DATED 09.09.2009)

122 Ep: pct application non-entry in european phase

Ref document number: 07831315

Country of ref document: EP

Kind code of ref document: A1