WO2012014837A1 - Image providing system, image display device, image display program, and image display method - Google Patents


Info

Publication number
WO2012014837A1
WO2012014837A1 · PCT/JP2011/066833 · JP2011066833W
Authority
WO
WIPO (PCT)
Prior art keywords
image
data
image data
acquisition
reliability
Prior art date
Application number
PCT/JP2011/066833
Other languages
French (fr)
Japanese (ja)
Inventor
知裕 佐藤
Original Assignee
ブラザー工業株式会社
Priority date
Filing date
Publication date
Application filed by ブラザー工業株式会社
Publication of WO2012014837A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84 Generation or processing of descriptive data, e.g. content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223 Cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202 End-user interface for requesting content, additional data or services; End-user interface for interacting with content for requesting content on demand, e.g. video on demand
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/65 Transmission of management data between client and server
    • H04N21/658 Transmission by the client directed to the server
    • H04N21/6581 Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application

Definitions

  • the present disclosure relates to an image providing system that provides images acquired in various places, an image display device that displays images acquired in various places, an image display program, and an image display method.
  • Conventionally, a system that can obtain images from various locations is known (see, for example, Patent Document 1).
  • In the image display device of this system, when a location is specified and a current video acquisition instruction is input, imaging data of the current video is acquired from an external video acquisition device existing at the specified location.
  • When a past video acquisition instruction is input, imaging data of the past video is acquired from the imaging data storage unit or the center station. The image display device can thereby display the current video or past video of the designated place.
  • When the current video or past video of the specified location is displayed, the newer the video, the higher the reliability of the information, and the older the video, the lower the reliability.
  • That is, the acquisition date and time of the video data greatly affects how well the video matches the current cityscape, landscape, state, and the like of the designated location (hereinafter, this degree of matching is referred to as video reliability).
  • The present disclosure has been made to solve the above-described problem, and an object thereof is to provide an image providing system, an image display device, and an image display program that allow a user to easily grasp the reliability of image data obtained by photographing each point on a map image.
  • In one aspect of the present disclosure, an image providing system is provided in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network.
  • The image acquisition device includes: image data acquisition means for acquiring the image data from an imaging unit that images the periphery of the image acquisition device; position information acquisition means for acquiring, from position acquisition means that acquires the current position of the image acquisition device, position information indicating the acquisition position of the image data; reliability information generation means for generating, based on the timing at which the image data acquisition means acquired the image data, reliability information indicating the temporal reliability of the image data; and information transmission means for transmitting the image data acquired by the image data acquisition means, the position information acquired by the position information acquisition means, and the reliability information generated by the reliability information generation means to the server.
  • The server associates the image data, the position information, and the reliability information received from the image acquisition device with each other and stores them in a first storage device. In response to a request from the image display device, the server returns the map data stored in a second storage device, together with the reliability information associated with the position information included in the display range of the map data, to the image display device.
  • The image display device includes: information acquisition means for acquiring the map data and the reliability information from the server by transmitting the request to the server; composite image generation means for generating, based on the map data and the reliability information acquired by the information acquisition means, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and composite image display means for displaying the composite image generated by the composite image generation means on a first display device.
  • Here, "imaging the periphery of the image acquisition device" is not limited to the immediate vicinity of the image acquisition device in terms of distance. Imaging performed in a predetermined direction from the image acquisition device (where the captured image may include objects at any distance, up to infinity) is also included in the scope of "imaging the periphery of the image acquisition device".
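  • As an illustrative sketch (an editorial addition, not part of the patent disclosure), the acquisition-side flow described above, bundling image data with its acquisition position and a timing-based reliability value before transmission to the server, might look as follows. The function name, field names, and data layout are assumptions for illustration only:

```python
import time

def build_upload_payload(image_bytes, latitude, longitude, direction_deg):
    """Bundle image data with position information and reliability information.

    The timestamp recorded at acquisition time serves as the reliability
    information: newer data is treated as more reliable. All names here are
    illustrative; the patent does not specify a wire format.
    """
    acquired_at = time.time()  # timing of acquisition -> reliability info
    return {
        "image": image_bytes,            # image data from the imaging unit
        "position": {                    # acquisition position of the image data
            "lat": latitude,
            "lon": longitude,
            "direction": direction_deg,  # camera direction at shooting time
        },
        "reliability": {"timestamp": acquired_at},
    }
```

The information transmission means would then send such a payload to the server, which stores the three parts in association with each other.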
  • the image data acquired by the image acquisition device is stored in the first storage device together with the position information and the reliability information.
  • the map data stored in the second storage device is returned to the image display device in response to a request from the image display device.
  • the reliability information associated with the position information included in the display range of the map data is also returned to the image display device.
  • a composite image is generated based on the map data and reliability information acquired from the server and displayed on the first display device.
  • the composite image is an image in which a special display indicating temporal reliability of the image data is added to the map data. Therefore, the user of the image display device can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the composite image displayed on the first display device.
  • In another aspect of the image providing system, an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network.
  • The image acquisition device includes: image data acquisition means for acquiring the image data from an imaging unit that images the periphery of the image acquisition device; position information acquisition means for acquiring, from position acquisition means that acquires the current position of the image acquisition device, position information indicating the acquisition position of the image data; reliability information generation means for generating, based on the timing at which the image data acquisition means acquired the image data, reliability information indicating the temporal reliability of the image data; and information transmission means for transmitting the image data, the position information, and the reliability information to the server.
  • The server associates the image data, the position information, and the reliability information received from the image acquisition device with each other and stores them in the first storage device. The server includes composite image generation means for generating, in response to a request from the image display device, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data stored in the second storage device, and information transmission means for returning the composite image generated by the composite image generation means to the image display device.
  • The image display device includes information acquisition means for acquiring the composite image from the server by transmitting the request to the server, and composite image display means for displaying the composite image acquired by the information acquisition means on the first display device.
  • the image data acquired by the image acquisition device is stored in the first storage device together with the position information and the reliability information.
  • The server generates a composite image from the map data stored in the second storage device and the reliability information associated with the position information included in the display range of the map data.
  • the composite image is an image in which a special display indicating temporal reliability of the image data is added to the map data.
  • the composite image generated by the server is returned to the image display device and displayed on the first display device. Therefore, the user of the image display device can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the composite image displayed on the first display device.
  • The reliability information generation means may generate, as the reliability information, a time stamp indicating the date and time when the image data was acquired, and the composite image generation means may generate the composite image by adding to the map data the special display that makes image data with a newer acquisition date and time, as indicated by the time stamp, visually distinguishable from image data with an older acquisition date and time on the map data.
  • the user of the image display device can visually discriminate between a point where the acquisition date / time of the image data is new and a point where the acquisition date / time of the image data is old with reference to the composite image.
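  • One way to realize such a special display, sketched here as an editorial assumption rather than the patent's concrete method, is to map the age of each point's time stamp to a marker color before drawing it on the map data. The thresholds and color names below are illustrative:

```python
from datetime import datetime, timezone

# Illustrative age limits (hours) and colors; the patent does not fix these.
AGE_COLORS = [
    (1.0, "red"),            # acquired within the last hour: most reliable
    (24.0, "orange"),        # within a day
    (24.0 * 7, "yellow"),    # within a week
    (float("inf"), "gray"),  # older than a week: least reliable
]

def marker_color(timestamp, now=None):
    """Return a color that visually distinguishes new image data from old."""
    now = now or datetime.now(timezone.utc)
    age_hours = (now - timestamp).total_seconds() / 3600.0
    for limit, color in AGE_COLORS:
        if age_hours <= limit:
            return color
    return "gray"
```

Drawing each shooting point with its `marker_color` would yield a heat-map-like view in which newer and older image data are visually distinguishable.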
  • The reliability information generation means may generate, as the reliability information, update frequency data indicating the frequency at which the image data is updated, and the composite image generation means may generate the composite image by adding to the map data the special display that makes image data with a high update frequency, as indicated by the update frequency data, visually distinguishable from image data with a low update frequency on the map data.
  • the user of the image display device can visually determine a point where the update frequency of the image data is high and a point where the update frequency of the image data is low with reference to the composite image.
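  • A hypothetical sketch (editorial, not from the patent) of deriving such update-frequency data from the stored acquisition times of one shooting point; the window length is an assumption:

```python
def update_frequency(timestamps, window_hours=24.0):
    """Updates per hour at one shooting point over the most recent window.

    `timestamps` are acquisition times in hours (any consistent unit works);
    the result lets the composite image distinguish frequently updated points
    from rarely updated ones.
    """
    if not timestamps:
        return 0.0
    latest = max(timestamps)
    recent = [t for t in timestamps if latest - t <= window_hours]
    return len(recent) / window_hours
```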
  • The image display device may further include: position designation means for a user to designate a target position; target data acquisition means for acquiring, from the first storage device, target data that is the image data corresponding to the target position designated by the position designation means; and target data display means for displaying the target data acquired by the target data acquisition means on a second display device.
  • the user of the image display device can view the image data on the second display device after confirming the temporal reliability of the image data at each point on the composite image.
  • the image display device may include warning display means for displaying a warning on the second display device when the temporal reliability indicated by the reliability information corresponding to the target data is lower than a predetermined value. In this case, the user of the image display device can surely grasp that the temporal reliability of the image data at the designated point is low.
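  • The warning condition can be sketched as a simple threshold check on the temporal reliability of the target data. This is an editorial illustration: the patent leaves the "predetermined value" unspecified, so the one-week threshold and the age-based reliability scale below are assumptions:

```python
def check_reliability_warning(age_hours, threshold_hours=168.0):
    """Return a warning message when temporal reliability is below the limit.

    Reliability is assumed to fall as image data ages, so target data older
    than `threshold_hours` (one week, by assumption) triggers the warning
    shown on the second display device.
    """
    if age_hours > threshold_hours:
        return "Warning: the image data for this point may be out of date."
    return None
```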
  • the image display device includes target list display means for displaying the plurality of target data in a list or thumbnail form on the first display device when a plurality of the target data is acquired by the target data acquisition means.
  • the target data display means may display the target data selected by the user among the plurality of target data displayed by the target list display means on the second display device. In this case, when there are a plurality of image data at the designated point, the user of the image display device can freely select the image data to be displayed on the second display device.
  • An image display device according to the present disclosure includes: map data acquisition means for acquiring map data by referring to a storage device that stores the map data; reliability information acquisition means for referring to the storage device, which stores image data, position information indicating the acquisition position of the image data, and reliability information indicating the temporal reliability of the image data in association with each other, and acquiring the reliability information associated with the position information included in the display range of the map data; composite image generation means for generating, based on the map data acquired by the map data acquisition means and the reliability information acquired by the reliability information acquisition means, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and composite image display means for displaying the composite image generated by the composite image generation means on a display device.
  • the map data and the reliability information of the image data associated with the position information included in the display range of the map data are acquired from the storage device.
  • a composite image is generated based on the acquired map data and reliability information and displayed on the display device.
  • the composite image is an image in which a special display indicating temporal reliability of the image data is added to the map data. Therefore, the user can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the composite image displayed on the display device.
  • An image display program according to the present disclosure causes a computer to execute: a map data acquisition step of acquiring map data by referring to a storage device that stores the map data; a reliability information acquisition step of referring to the storage device, which stores image data, position information indicating the acquisition position of the image data, and reliability information indicating the temporal reliability of the image data in association with each other, and acquiring the reliability information associated with the position information included in the display range of the map data; a composite image generation step of generating, based on the map data acquired in the map data acquisition step and the reliability information acquired in the reliability information acquisition step, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and a composite image display step of displaying the composite image generated in the composite image generation step on a display device.
  • the map data and the reliability information of the image data associated with the position information included in the display range of the map data are acquired from the storage device.
  • a composite image is generated based on the acquired map data and reliability information and displayed on the display device.
  • the composite image is an image in which a special display indicating temporal reliability of the image data is added to the map data. Therefore, the user can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the composite image displayed on the display device.
  • An image display method according to the present disclosure is executed in an image providing system in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network.
  • The method includes: an image data acquisition step of acquiring the image data; a position information acquisition step of acquiring position information indicating the acquisition position of the image data; a reliability information generation step of generating, based on the timing at which the image data was acquired, reliability information indicating the temporal reliability of the image data; and a composite image generation step of generating, based on map data and the reliability information associated with the position information included in the display range of the map data, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data.
  • FIG. 1 is an overall configuration diagram of the image providing system 1.
  • A block diagram showing the electrical configuration of the image management server 2.
  • A diagram illustrating the data configuration of the image management database 70.
  • A diagram showing the data configuration of the position management database 80.
  • A perspective view of the user terminal 5 as seen obliquely from above.
  • A block diagram showing the electrical configuration of the HMD 200.
  • A block diagram showing the electrical configuration of the HMD control terminal 100.
  • A flowchart showing the upload process of the fixed point camera 10.
  • A flowchart showing the upload process of the user terminal 5.
  • A flowchart showing the DB update process of the image management server 2.
  • A flowchart showing the DB update process of the position management server 3.
  • A flowchart showing the image providing process of the image management server 2.
  • A flowchart showing the image display process of the user terminal 5.
  • A diagram showing a specific example of the map data.
  • A diagram showing a specific example of the heat map.
  • A flowchart showing the data display process.
  • The image providing system 1 is a system that provides a user, via a network, with image data taken at a point designated on a map image (hereinafter referred to as a designated point).
  • In the image providing system 1, an image management server 2, a position management server 3, a plurality of user terminals 5, and a plurality of fixed point cameras 10 are connected via a network 4.
  • the image management server 2 is a computer that manages image data taken in various places, and is connected to the network 4 by wire.
  • The position management server 3 is a computer that manages the current position of the user terminal 5, and is connected to the network 4 by wire.
  • The network 4 is the Internet, which can be accessed via a public network.
  • the user terminal 5 is a small and lightweight terminal device carried by a user who uses the image providing system 1, and is connected to the network 4 wirelessly.
  • the user terminal 5 acquires the image data of the point designated by the user on the map image from the image management server 2 or the fixed point camera 10 and displays it.
  • The user terminal 5 captures images at its current position at predetermined time intervals and transmits the captured image data to the image management server 2. The image data captured by the user terminal 5 therefore shows the cityscape, landscape, and the like in the vicinity of the current position of the user terminal 5.
  • the fixed point camera 10 is a camera installed in various places, and is connected to the network 4 by wire.
  • the fixed point camera 10 transmits the captured image data to the image management server 2 and the user terminal 5.
  • the fixed point camera 10 of the present embodiment is provided at main intersections at street corners, and always shoots the four directions of east, west, south, and north of the intersection. Therefore, the image data photographed by the fixed point camera 10 shows the road traffic situation near the intersection where the fixed point camera 10 is set, the flow of pedestrians, and the like.
  • the image management server 2 is a general-purpose server, and includes a CPU 20, a ROM 21, a RAM 22, an HDD 23, a communication device 24, and an I / O interface 29.
  • the ROM 21, RAM 22, and I / O interface 29 are each connected to the CPU 20.
  • the HDD 23 and the communication device 24 are connected to the I / O interface 29.
  • the communication device 24 is a controller that controls data communication via the network 4.
  • the HDD 23 is a large-capacity hard disk drive, and is provided with a program storage area 231, a map data storage area 232, an image data storage area 233, and the like.
  • The program storage area 231 stores various programs for operating the image management server 2, including programs for executing the DB update process (see FIG. 11) and the image providing process (see FIG. 13) described later.
  • the map data storage area 232 stores map data indicating a map image for the user to specify a point.
  • a plurality of map data are prepared in advance so that at least the entire map of Japan and various places can be displayed at a plurality of scales.
  • The CPU 20 can specify the position information (latitude and longitude) of the actual point corresponding to a display position on the map image displayed based on the map data.
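  • For a map image rendered in a simple equirectangular projection, the correspondence between a display position and latitude/longitude could be sketched as follows. This is an editorial illustration: the patent does not state the projection, and the bounding-box parameter is an assumption:

```python
def pixel_to_latlon(x, y, width, height, bounds):
    """Convert a display position on the map image to latitude/longitude.

    `bounds` is (lat_north, lon_west, lat_south, lon_east) of the displayed
    map data; a linear (equirectangular) mapping is assumed.
    """
    lat_n, lon_w, lat_s, lon_e = bounds
    lon = lon_w + (x / width) * (lon_e - lon_w)   # left edge -> west bound
    lat = lat_n - (y / height) * (lat_n - lat_s)  # top edge -> north bound
    return lat, lon
```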
  • In the image data storage area 233, image data obtained by shooting various locations is stored.
  • the image data storage area 233 is provided with an image management database 70 for managing image data.
  • The image management database 70 stores, for each piece of image data, a record containing the image data, position information indicating the acquisition position (that is, the shooting point) of the image data, and a time stamp indicating the acquisition date and time of the image data.
  • the position information indicates the latitude and longitude of the shooting point and the camera direction at the time of shooting.
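  • The record layout described for the image management database 70 could be sketched with SQLite as follows; the table and column names are editorial assumptions, chosen to mirror the fields named above:

```python
import sqlite3

def create_image_db(conn):
    """One record per piece of image data: the data itself, its acquisition
    position (latitude, longitude, camera direction), and its time stamp."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS image_management (
            image_id   INTEGER PRIMARY KEY,
            image      BLOB NOT NULL,  -- image data
            latitude   REAL NOT NULL,  -- shooting point
            longitude  REAL NOT NULL,
            direction  REAL NOT NULL,  -- camera direction at shooting time
            timestamp  TEXT NOT NULL   -- acquisition date and time
        )
    """)
    return conn
```

Queries over `latitude`/`longitude` would then supply the reliability information associated with the position information in a given map display range.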
  • the position management server 3 is a general-purpose server, and includes a CPU, a ROM, a RAM, an HDD, a communication device, and an I / O interface (not shown), like the image management server 2 (FIG. 3).
  • The HDD of the position management server 3 is provided with a current position storage area (not shown) in addition to a program storage area. The program storage area stores various programs for operating the position management server 3.
  • In the current position storage area, position information indicating the current position of each user terminal 5 (hereinafter referred to as current position data) is stored.
  • a position management database 80 for managing current position data is provided in the current position data storage area. As shown in FIG. 5, in the position management database 80, a record including a terminal ID unique to the user terminal 5 and the current position of the user terminal 5 is provided for each user terminal 5. The current position is the latest position information (latitude, longitude, and direction) acquired from the user terminal 5.
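  • Since the position management database 80 keeps one record per user terminal, keyed by terminal ID and holding only the latest position information, each update amounts to an overwrite on the terminal ID. A minimal editorial sketch, with assumed names:

```python
def update_current_position(positions, terminal_id, lat, lon, direction):
    """Keep only the latest position information per terminal.

    `positions` stands in for the position management database 80; each
    update received from a user terminal overwrites that terminal's
    previous record, keyed by its unique terminal ID.
    """
    positions[terminal_id] = {"lat": lat, "lon": lon, "direction": direction}
    return positions
```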
  • the fixed-point camera 10 includes a camera unit that captures an installation point, and a controller that controls the camera unit and the like.
  • the controller includes a CPU, a ROM, a RAM, and the like, and a control program is stored in the ROM. This control program causes the CPU to execute an image upload process (see FIG. 9) described later.
  • the ROM stores a terminal ID unique to the fixed point camera 10, position information of the fixed point camera 10, and the like.
  • the user terminal 5 of the present embodiment includes a head mounted display (hereinafter referred to as HMD) 200 and an HMD control terminal 100.
  • the HMD 200 is a glasses-type display device that a user can wear on the head and view an image.
  • the HMD control terminal 100 is a small and lightweight electronic device connected to the HMD 200 via a cable 190, and controls display of the HMD 200 and the like.
  • The HMD 200 of this embodiment is a so-called retinal scanning display. That is, the HMD 200 scans a laser beam (hereinafter referred to as video light A) modulated in accordance with an image data signal (hereinafter referred to as an image signal) and emits it onto the retina of at least one eye of the user. The HMD 200 can thereby project an image directly onto the user's retina and allow the user to visually recognize the image.
  • the HMD 200 includes at least an emission device 210, a half mirror 250, and a head mounting portion 220.
  • the head mounting unit 220 supports the emission device 210 and fixes the HMD 200 to the user's head.
  • the half mirror 250 is disposed at the light emission port of the emission device 210.
  • the emission device 210 emits video light A corresponding to the image signal to the half mirror 250.
  • the half mirror 250 is in a fixed position with respect to the emission device 210.
  • the half mirror 250 reflects the image light A emitted from the emission device 210 toward the eyes of the user.
  • the half mirror 250 is formed by evaporating a metal thin film on a transparent resin plate so as to have a predetermined reflectance (for example, 50%). Therefore, the half mirror 250 transmits part of the external light B from the outside and guides it to the user's eyes.
  • the half mirror 250 causes the image light A incident from the side of the user and the external light B from the outside to enter the user's eyes. As a result, the user can visually recognize the actual field of view and the image based on the video light A.
  • the electrical configuration of the HMD 200 will be described with reference to FIG.
  • the HMD 200 includes a display unit 240, a device connection interface 243, a flash memory 249, a control unit 246, a camera 207, and a power supply unit 247.
  • the control unit 246 includes at least a CPU 261, a ROM 262, and a RAM 248, and controls the entire HMD 200.
  • the control unit 246 executes various processes described below when the CPU 261 reads out various programs stored in the ROM 262.
  • the display unit 240 allows the user to visually recognize the image.
  • the display unit 240 includes an image signal processing unit 270, a laser group 272, and a laser driver group 271.
  • the image signal processing unit 270 receives an image signal from the control unit 246.
  • the image signal processing unit 270 converts the received image signal into signals necessary for direct projection onto the user's retina.
  • the laser group 272 includes a blue output laser (B laser) 721, a green output laser (G laser) 722, and a red output laser (R laser) 723.
  • the laser group 272 outputs blue, green, and red laser beams.
  • the laser driver group 271 performs control for outputting laser light from the laser group 272.
  • the image signal processing unit 270 is electrically connected to the laser driver group 271.
  • the B laser 721, the G laser 722, and the R laser 723 are collectively referred to as lasers.
  • the display unit 240 includes a vertical scanning mirror 812, a vertical scanning control circuit 811, a horizontal scanning mirror 792, and a horizontal scanning control circuit 791.
  • the vertical scanning mirror 812 performs scanning by reflecting the laser beam output from the laser in the vertical direction.
  • the vertical scanning control circuit 811 performs drive control of the vertical scanning mirror 812.
  • the horizontal scanning mirror 792 performs scanning by reflecting the laser beam output from the laser in the horizontal direction.
  • the horizontal scanning control circuit 791 performs drive control of the horizontal scanning mirror 792.
  • the image signal processing unit 270 is electrically connected to the vertical scanning control circuit 811 and the horizontal scanning control circuit 791.
  • the image signal processing unit 270 is electrically connected to the control unit 246 via a bus. Therefore, the image signal processing unit 270 can cause each laser to output laser light with a color and timing according to the image signal received from the control unit 246. Further, the image signal processing unit 270 can reflect the laser light in a direction corresponding to the image signal received from the control unit 246. As a result, the HMD 200 can scan a light beam corresponding to the image signal in a two-dimensional direction, and guide the scanned light to the user's eyes to form a display image on the retina.
  • the camera control unit 299 includes a camera 207 and a camera control circuit 208.
  • the camera 207 captures the same direction as the user's line of sight (see FIG. 6).
  • a camera control circuit 208 controls the camera 207.
  • the camera control circuit 208 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can cause the camera 207 to perform shooting or acquire image data shot by the camera 207.
  • the GPS control unit 289 includes a GPS receiver 288 and a GPS control circuit 287.
  • the GPS receiver 288 acquires the latitude and longitude indicating the current position of the HMD 200.
  • the GPS control circuit 287 controls the GPS receiver 288.
  • the GPS control circuit 287 is electrically connected to the control unit 246 through a bus. Therefore, the control unit 246 can acquire the latitude and longitude of the current position from the GPS receiver 288.
  • the accelerometer control unit 239 includes an accelerometer 238 and an accelerometer control circuit 237.
  • the accelerometer 238 acquires the direction in which the user wearing the HMD 200 is traveling (that is, the shooting direction of the camera 207).
  • the accelerometer control circuit 237 controls the accelerometer 238.
  • the accelerometer control circuit 237 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can acquire the direction of the HMD 200 (that is, the shooting direction of the camera 207) from the accelerometer 238.
  • the device connection interface 243 is a controller that controls data communication via the cable 190.
  • the device connection interface 243 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can receive image data from the HMD control terminal 100 and allow the user to visually recognize the image data. In addition, the control unit 246 can transmit image data captured by the camera 207 and position information at the time of capturing to the HMD control terminal 100.
  • the power supply unit 247 includes a battery 259 and a charge control circuit 260.
  • the battery 259 is a power source that drives the HMD 200.
  • the charge control circuit 260 supplies the power of the battery 259 to the HMD 200.
  • the flash memory 249, the video RAM 244, and the font ROM 245 are electrically connected to the control unit 246 via a bus.
  • the control unit 246 can appropriately refer to information stored in the flash memory 249, the video RAM 244, and the font ROM 245.
  • the HMD control terminal 100 is an external device that supplies image data to the HMD 200.
  • the HMD control terminal 100 includes a CPU 120, ROM 121, RAM 122, HDD 123, communication device 124, operation key 125, device connection interface 126, and I / O interface 129.
  • the ROM 121, RAM 122, and I / O interface 129 are each connected to the CPU 120.
  • the HDD 123, the communication device 124, the operation keys 125, and the device connection interface 126 are connected to the I / O interface 129.
  • the communication device 124 is a controller that controls data communication via the network 4.
  • the device connection interface 126 is a controller that controls data communication via the cable 190.
  • the HDD 123 is a large-capacity hard disk drive, and stores various programs for operating the HMD control terminal 100 and image data to be displayed on the HMD 200.
  • the HDD 123 also stores programs for executing an upload process (see FIG. 10) and an image display process (see FIG. 14) described later.
  • Processing executed in the image providing system 1 will be described with reference to FIGS. More specifically, the processes executed by the image management server 2, the position management server 3, the user terminal 5, and the fixed point camera 10 will be described.
  • the upload process executed by the fixed point camera 10 will be described.
  • when the fixed point camera 10 is powered on, its CPU repeatedly executes the upload process (FIG. 9) based on the control program stored in the ROM of the fixed point camera 10.
  • when it is determined to be a predetermined timing (S1: YES), image data indicating the video currently being shot is acquired from the camera unit that is always shooting the installation point (S3).
  • the current date and time indicated by a timer (not shown) built in the fixed point camera 10 is acquired as a time stamp (S5).
  • the positional information (that is, the latitude, longitude, and direction of the fixed point camera 10) stored in the ROM of the fixed point camera 10 is acquired (S7).
  • after step S7, data upload to the image management server 2 is executed (S9). Specifically, a data file including the image data acquired in step S3, the time stamp acquired in step S5, and the position information acquired in step S7 is transmitted to the image management server 2 via the network 4. After the execution of step S9, or when it is not a predetermined timing (S1: NO), the process returns to step S1. With the above processing, the fixed point camera 10 uploads image data showing the current state of the installation point to the image management server 2 at 10-minute intervals.
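The upload cycle just described (steps S1 to S9: acquire image data, attach a time stamp and position information, transmit the bundle every 10 minutes) can be sketched in Python. The function and field names below are illustrative assumptions; only the 10-minute interval and the yymmddHHMMSS time-stamp format (e.g. "100707120230") come from the text.

```python
from datetime import datetime

UPLOAD_INTERVAL_SEC = 10 * 60  # the text specifies a 10-minute interval

def make_upload_payload(image_data, latitude, longitude, direction):
    """Bundle image data with a time stamp and position information,
    mirroring the data file sent to the image management server in S9."""
    return {
        "image": image_data,
        # time-stamp strings in the text look like "100707120230" (yymmddHHMMSS)
        "timestamp": datetime.now().strftime("%y%m%d%H%M%S"),
        "position": {"lat": latitude, "lng": longitude, "dir": direction},
    }
```

A real implementation would loop, sleeping `UPLOAD_INTERVAL_SEC` between uploads, and send the payload over the network 4.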
  • the upload process executed on the user terminal 5 will be described.
  • when the user terminal 5 (that is, the HMD control terminal 100 and the HMD 200) is powered on, the CPU 120 repeatedly executes this process based on a program stored in the HDD 123.
  • in the upload process of the user terminal 5, processes similar to steps S1 to S9 in FIG. 9 are executed (S11 to S19). Specifically, every time 10 minutes have passed since the previous data upload (step S19 described later), it is determined that the predetermined timing has arrived (S11: YES). In step S11, it may also be determined that the predetermined timing has arrived when a condition other than elapsed time is satisfied (for example, when the user performs imaging with the camera 207 via the operation keys 125). In this case, the camera 207 of the HMD 200 captures an image of the current position, and the image data captured by the camera 207 is acquired from the HMD 200 (S13). The current date and time indicated by a timer (not shown) built into the HMD control terminal 100 is acquired as a time stamp (S15).
  • the GPS receiver 288 is caused to measure the latitude and longitude of the current position.
  • the accelerometer 238 is caused to measure the direction of the user terminal 5.
  • the latitude, longitude, and direction measured in this way are acquired from the HMD 200 as position information of the current position (S17).
  • a data file including the information acquired in steps S13, S15, and S17 is transmitted to the image management server 2 via the network 4 (S19).
  • a current position notification is transmitted to the position management server 3 via the network 4 (S21).
  • the current position notification includes the terminal ID unique to the user terminal 5 and the position information acquired in step S17.
  • the process returns to step S11.
  • the user terminal 5 uploads the image data showing the current position of the user terminal 5 to the image management server 2 at intervals of 10 minutes. Further, the current position of the user terminal 5 is notified to the position management server 3 at intervals of 10 minutes.
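A minimal sketch of the current position notification of step S21, assuming a simple dictionary payload (the actual wire format is not specified in the text):

```python
def make_position_notification(terminal_id, position):
    """Build a current position notification: the terminal ID unique to
    the user terminal 5 plus the position information from step S17."""
    return {"terminal_id": terminal_id, "position": position}
```

For the example terminal worn by user A, this would be called as `make_position_notification("hmd001", {"lat": ..., "lng": ..., "dir": ...})`.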
  • the DB update process executed by the image management server 2 will be described.
  • when the image management server 2 is powered on, the CPU 20 repeatedly executes this process based on a program stored in the HDD 33.
  • through the processing starting at step S31, the image management server 2 can collect image data of various locations from the fixed point camera 10 and the user terminal 5.
  • a record including the image data “camera_a.jpg” is registered in the image management database 70 based on the data file uploaded from the fixed point camera 10 “camera_a”.
  • in this record, position information indicating the installation position of the fixed point camera 10 “camera_a” and a time stamp indicating the acquisition date and time of the image data “camera_a.jpg” are set.
  • a record including the image data “hmd001_a.jpg” is registered in the image management database 70 based on the data file uploaded from the user terminal 5 “hmd001” worn by the user A (see FIG. 2).
  • in this record, position information indicating the position of the user terminal 5 “hmd001” at the time of acquisition of the image data “hmd001_a.jpg” and a time stamp indicating the acquisition date and time are set.
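The two records just described could be represented as follows; the field names and coordinate values are hypothetical, since the text only states that image data, position information, and a time stamp are stored together:

```python
# In-memory stand-in for the image management database 70 (FIG. 4).
image_management_db = [
    {"image": "camera_a.jpg",    # from fixed point camera "camera_a"
     "position": {"lat": 35.00, "lng": 139.00, "dir": 90.0},   # hypothetical values
     "timestamp": "100707120230"},   # acquisition date and time (yymmddHHMMSS)
    {"image": "hmd001_a.jpg",    # from user terminal "hmd001"
     "position": {"lat": 35.10, "lng": 139.10, "dir": 180.0},  # hypothetical values
     "timestamp": "100707080513"},
]
```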
  • the DB update process executed by the location management server 3 will be described.
  • when the position management server 3 is powered on, its CPU repeatedly executes this process based on a program stored in its HDD.
  • the position management server 3 determines whether there is a current position notification (S41). Specifically, when a current position notification is received from the user terminal 5, it is determined that there is a current position notification (S41: YES). In this case, the location management database 80 is updated based on the received notification (S43). After execution of step S43, or when there is no current position notification (S41: NO), the process returns to step S41 and waits for reception of the next notification. With the above processing, the position management server 3 can always grasp the installation position of each fixed point camera 10 and the current position of each user terminal 5, which can move from place to place.
  • a record including the terminal ID “hmd001” is registered in the location management database 80 based on the current position notification received from the user terminal 5 “hmd001” worn by the user A (see FIG. 2).
  • position information indicating the current position of the user terminal 5 “hmd001” is set.
  • the image providing process executed by the image management server 2 will be described.
  • when the image management server 2 is powered on, the CPU 20 repeatedly executes this process based on a program stored in the HDD 33.
  • the map request is a signal by which the user terminal 5 requests map data from the image management server 2, and includes the display range of the map image designated by the user (hereinafter referred to as the map display range).
  • map data indicating the map display range included in the map request is acquired from the map data storage area 232 (S53).
  • the time stamp corresponding to the map display range of the map data acquired in step S53 is acquired from the image management database 70 (S55). Specifically, all image data having latitude and longitude included in the map display range of the map data is searched from the image management database 70. A time stamp associated with the image data hit in the search is acquired from the image management database 70.
  • map drawing information is transmitted to the requesting user terminal 5 via the network 4 (S57). The map drawing information is information used to display a map image on the user terminal 5, and includes the map data acquired in step S53 and the time stamp acquired in step S55.
  • time stamps are acquired from all records including position information matching that latitude and longitude (S55).
  • the time stamp “100707120230” is acquired.
  • the map data corresponding to the map display range is transmitted to the requesting user terminal 5 together with the time stamp indicating the image acquisition date and time at each point in the map display range.
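Step S55 (searching for all image data whose latitude and longitude fall in the map display range and collecting the associated time stamps) might look like the sketch below; the flat record layout and sample values are assumptions for brevity:

```python
# Hypothetical records with a flattened lat/lng/timestamp layout.
records = [
    {"image": "camera_a.jpg", "lat": 35.00, "lng": 139.00, "timestamp": "100707120230"},
    {"image": "hmd001_a.jpg", "lat": 36.50, "lng": 140.20, "timestamp": "100707080513"},
]

def timestamps_in_range(db, lat_min, lat_max, lng_min, lng_max):
    """Collect the time stamps of all records whose position lies
    inside the requested map display range (cf. step S55)."""
    return [r["timestamp"] for r in db
            if lat_min <= r["lat"] <= lat_max
            and lng_min <= r["lng"] <= lng_max]
```

The returned time stamps would then be bundled with the map data into the map drawing information sent in step S57.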
  • after execution of step S57, it is determined whether an image request has been received (S59).
  • the image request is a signal for the user terminal 5 to request image data from the image management server 2 and includes position information of a point designated by the user (hereinafter referred to as a designated point).
  • the image data associated with the position information of the designated point is specified with reference to the image management database 70 (S61).
  • an image data list, which lists the image data specified in step S61, is transmitted to the requesting user terminal 5 via the network 4 (S63).
  • image data is identified from all records including position information that matches the latitude, longitude, and direction (S61).
  • the image data “camera_a.jpg” is specified.
  • in step S63, an image data list containing thumbnail images (reduced versions of each piece of image data) and their acquisition dates and times is transmitted to the requesting user terminal 5.
  • after execution of step S63, it is determined whether an image selection instruction has been received (S65).
  • the image selection instruction includes identification information of image data selected by the user from the image data list. Based on the identification information included in the image selection instruction, the CPU 20 can identify the image data selected by the user from among the image data identified in step S61.
  • when an image selection instruction is received (S65: YES), the time stamp of the image data selected by the user is specified with reference to the image management database 70. If the elapsed time from the specified time stamp to the current date and time is within a predetermined value (in this embodiment, less than 10 minutes), it is determined that the image data is a current image (S67: YES).
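The freshness test of step S67 can be sketched directly: parse the yymmddHHMMSS time stamp and compare the elapsed time against the 10-minute threshold stated above. The function name is an assumption.

```python
from datetime import datetime, timedelta

CURRENT_THRESHOLD = timedelta(minutes=10)  # "less than 10 minutes" per step S67

def is_current_image(timestamp, now):
    """Return True when the elapsed time since the time stamp is below
    the threshold, i.e. the image counts as a current image (S67: YES)."""
    acquired = datetime.strptime(timestamp, "%y%m%d%H%M%S")
    return now - acquired < CURRENT_THRESHOLD
```

For the time stamps appearing in the examples, "100707120230" evaluated a few minutes later passes the test, while "100707080513" evaluated hours later fails it (S67: NO).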
  • an image provision instruction is transmitted to the designated point via the network 4 (S69).
  • the image provision instruction is a signal that instructs the image acquisition device existing at the designated point to provide image data currently being captured (in this embodiment, video data, that is, a moving image that can be played back in real time) to the requesting user terminal 5.
  • the CPU 20 inquires of the position management server 3 about the image acquisition device existing at the designated point. In response to an inquiry from the image management server 2, the CPU of the position management server 3 refers to the position management database 80 to identify the image acquisition device corresponding to the designated point and notifies the image management server 2 of the image acquisition device.
  • the CPU 20 transmits a request for transmitting video data in real time to the requesting user terminal 5 to the image acquisition apparatus notified from the position management server 3.
  • the image acquisition device is identified from the record including this position information (S67: YES, S69).
  • the fixed point camera 10 “camera_a” is specified as the image acquisition device.
  • a request is transmitted to the fixed point camera 10 “camera_a”, thereby providing the requesting user terminal 5 with the video data of the fixed point camera 10 “camera_a” in real time.
  • the image data selected by the user is read from the image management database 70 and transmitted to the requesting user terminal 5 via the network 4 (S71).
  • when the image data indicated by the image selection instruction is “hmd_001.jpg”, the time difference between its time stamp “100707080513” and the current date and time is 10 minutes or more (S67: NO). Therefore, in step S71, the image data “hmd_001.jpg” is read from the image management database 70 and transmitted to the requesting user terminal 5.
  • after execution of step S69 or step S71, the process returns to step S51.
  • when a map request is not received (S51: NO), an image request is not received (S59: NO), or an image selection instruction is not received (S65: NO), the process returns to step S51.
  • the image management server 2 can provide map data and image data of a designated point in response to a request from the user terminal 5.
  • the video data can be provided to the user terminal 5 in real time from the image acquisition device at the designated point.
  • the HDD 123 stores an application program that displays image data, map data, and the like provided from the image management server 2.
  • the image display process is repeatedly executed by the CPU 120 when the application program is activated.
  • first, it is determined whether there is a map display operation (S101).
  • the map display operation is an operation in which the user wearing the HMD 200 uses the operation keys 125 (see FIG. 8) to move, enlarge, or reduce the map image (in detail, the heat map described later) projected onto the eyes.
  • when there is a map display operation (S101: YES), a map request is transmitted to the image management server 2 via the network 4 (S103).
  • when map drawing information is received (S105: YES), a heat map is created (S107).
  • the heat map is an image in which a special display indicating the temporal reliability of the image data is added to the map data. Temporal reliability indicates whether the elapsed time since acquisition of the image data is small (that is, whether the image data is new).
  • a map request including this map display range is transmitted (S103).
  • the map drawing information including the map data and the time stamp is returned from the image management server 2.
  • the CPU 120 creates a heat map based on the map drawing information received from the image management server 2 (S107).
  • the map data 300 included in the map drawing information is a map image showing a map display range.
  • the display area for each place included in the map display range is color-coded according to the time stamp of each point included in the map drawing information.
  • in this way, a heat map is created that color-codes the newness of the update date and time of the image data at each point included in the map display range. In this heat map, points with new time stamps are colored in warm colors, while points with old time stamps are colored in cool colors.
  • the display area of the point corresponding to the time stamp is colored in “red”.
  • the display area of the point corresponding to the time stamp is colored “pink”.
  • the display area of the point corresponding to the time stamp is colored “yellow”.
  • the display area of the point corresponding to the time stamp is colored “colorless”. For a point with no time stamp, no image data obtained by photographing that point exists, so its display area is colored “black”.
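The color-coding rule can be sketched as a function from time-stamp age to color. The ordering (red for the newest through colorless for the oldest, black for points with no image data) comes from the text; the specific 10-minute/1-hour/1-day boundaries below are illustrative assumptions, since the excerpt does not state them.

```python
from datetime import datetime, timedelta

def heat_color(timestamp, now):
    """Map one point's time stamp to a heat-map color (cf. S107)."""
    if timestamp is None:
        return "black"        # no image data exists for this point
    age = now - datetime.strptime(timestamp, "%y%m%d%H%M%S")
    if age < timedelta(minutes=10):   # assumed boundary
        return "red"
    if age < timedelta(hours=1):      # assumed boundary
        return "pink"
    if age < timedelta(days=1):       # assumed boundary
        return "yellow"
    return "colorless"
```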
  • the user can easily visually determine whether or not the image data of each place included in the map display range is new with reference to the heat map 310. In other words, the user can visually discriminate between a point where the image data acquisition date is new and a point where the image data acquisition date is old with reference to the heat map 310.
  • the heat map created in step S107 is displayed on the HMD 200 (S109). Specifically, the heat map is displayed on the eyes of the user wearing the HMD 200.
  • after execution of step S109, it is determined whether there is an input of a designated point and direction (S111). For example, when the user performs a cursor operation with the operation keys 125 and designates a coordinate position and a shooting direction on the heat map, it is determined that there is an input of a designated point and direction (S111: YES).
  • an image request for requesting image data at the designated point is transmitted to the image management server 2 via the network 4 (S113).
  • after execution of step S113, it is determined whether an image data list has been received (S115). When the image data list is received (S115: YES), the image data list is displayed on the HMD 200 (S117).
  • An image request including the position information is transmitted (S113).
  • the image management server 2 returns a list of image data including the thumbnail image of the image data of the designated point (image data “camera_a.jpg” in the example of FIG. 4) and the acquisition date and time.
  • the CPU 120 displays the image data of the designated point in the form of thumbnails on the eyes of the user wearing the HMD 200 (S117).
  • after execution of step S117, it is determined whether image data has been selected (S119). For example, when the user performs a cursor operation with the operation keys 125 and selects a thumbnail image from the image data list, it is determined that image data has been selected (S119: YES). In this case, an image selection instruction including identification information of the selected image data is transmitted to the image management server 2 via the network 4 (S121). After execution of step S121, a data display process described later is executed (S123), and the process returns to step S101.
  • when there is no map display operation (S101: NO), when map drawing information is not received (S105: NO), when there is no input of a designated point and direction (S111: NO), when an image data list is not received (S115: NO), or when no image data is selected (S119: NO), the process returns to step S101.
  • the user terminal 5 can display a heat map indicating the temporal reliability of image data obtained by photographing each point on the map image in response to a user request.
  • as shown in FIG. 17, in the data display process (S123), it is determined whether image data has been received (S151).
  • when image data is received from the image management server 2 (S151: YES) and the image data is older than a predetermined time (S153: YES), a predetermined warning is displayed on the HMD 200 (S155). An example of the warning is “This image was acquired more than a month ago and may differ from the current landscape.”
  • after execution of step S155, or when the image data is not older than the predetermined time (S153: NO), the received image data is displayed on the HMD 200 (S157). If a warning was displayed in step S155, the image data is displayed together with the warning. On the other hand, if no image data has been received (S151: NO), it is determined whether video data has been received (S159). When video data is received from the fixed point camera 10 (S159: YES), the received video data is streamed and played back on the HMD 200 (S161).
  • an image selection instruction including the identification information is transmitted (S121).
  • the image data “camera_a.jpg” is returned from the image management server 2.
  • the CPU 120 displays the image data “camera_a.jpg” received from the image management server 2 on the eyes of the user wearing the HMD 200 (S157).
  • an image selection instruction including the identification information is transmitted (S121).
  • real-time video data is transmitted from the user terminal 5 “hmd_001”, the image acquisition device at the designated point.
  • the CPU 120 displays the video data received from the user terminal 5 “hmd_001” on the eyes of the user wearing the HMD 200 (S161).
  • after execution of step S157 or step S161, or when no video data is received (S159: NO), the data display process ends, and the process returns to the image display process (FIG. 14).
  • the user terminal 5 can acquire and display image data or video data of a designated point designated from the heat map from the image management server 2 or the fixed point camera 10.
  • the image data acquired by the fixed point camera 10 or the user terminal 5 is stored in the image management database 70 together with the position information and time stamp at the time of acquisition.
  • the map data stored in the map data storage area 232 is returned to the user terminal 5 in response to a request from the user terminal 5.
  • the time stamp associated with the position information included in the display range of the map data is also returned to the user terminal 5.
  • a heat map is generated based on the map data and time stamp acquired from the image management server 2, and displayed on the HMD 200.
  • a heat map is an image in which a special display indicating temporal reliability of image data is added to the map data. Therefore, the user of the user terminal 5 can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the heat map displayed on the HMD 200.
  • the user of the user terminal 5 displays the image data of the designated point selected on the heat map on the HMD 200. Therefore, the user can browse the image data with the HMD 200 after confirming the temporal reliability of the image data at each point on the heat map. Further, when the image data of the designated point selected on the heat map is older than a predetermined time, a warning is displayed on the HMD 200. Therefore, the user can surely grasp that the temporal reliability of the image data at the designated point is low. In addition, when there are a plurality of image data at a designated point, the user can freely select image data to be displayed on the HMD 200 from the image data list.
  • the fixed point camera 10 and the user terminal 5 correspond to the “image acquisition device” of the present invention.
  • the image management server 2 corresponds to the “server” of the present invention.
  • the user terminal 5 corresponds to the “image display device” of the present invention.
  • the CPU of the fixed point camera 10 that executes step S3 and the CPU 120 that executes step S13 correspond to the “image data acquisition unit” of the present invention.
  • the CPU of the fixed point camera 10 that executes step S7 and the CPU 120 that executes step S17 correspond to the “positional information acquisition unit” of the present invention.
  • the CPU of the fixed point camera 10 that executes step S5 and the CPU 120 that executes step S15 correspond to the “reliability information generation unit” of the present invention.
  • the CPU of the fixed-point camera 10 that executes step S9 and the CPU 120 that executes step S19 correspond to the “information transmitting unit” of the present invention.
  • the CPU 20 executing step S35 corresponds to the “storage control means” of the present invention.
  • the CPU 20 that executes step S57 corresponds to the “information transmitting unit” of the present invention.
  • the CPU 120 executing steps S103 and S105 corresponds to the “information acquisition unit” of the present invention.
  • the CPU 120 that executes step S107 corresponds to the “composite image generation unit” of the present invention.
  • the CPU 120 executing step S109 corresponds to the “composite image display unit” of the present invention.
  • the CPU 120 that executes step S111 corresponds to the “position specifying means” of the present invention.
  • the CPU 120 that executes steps S151 and S159 corresponds to the “target data acquisition unit” of the present invention.
  • the CPU 120 that executes steps S157 and S161 corresponds to the “target data display unit” of the present invention.
  • the CPU 120 executing step S155 corresponds to the “warning display means” of the present invention.
  • the CPU 120 executing step S117 corresponds to the “target list display unit” of the present invention.
  • the CPU 120 that executes steps S103 and S105 corresponds to the “map data acquisition means” and “reliability information acquisition means” of the present invention.
  • Steps S103 and S105 correspond to the “map data acquisition step” and “reliability information acquisition step” of the present invention.
  • Step S107 corresponds to the “composite image generation step” of the present invention.
  • Step S109 corresponds to the “composite image display step” of the present invention.
  • the retinal scanning display is exemplified as the HMD 200, but the display method can be changed.
  • it may be a head mounted display of another display method such as a liquid crystal display or an organic EL (ElectroLuminescence) display.
  • a display device that can be carried by the user, such as a mobile phone, a notebook computer, or a PDA, may be used.
  • the time stamp indicating the acquisition date and time of the image data is exemplified as the reliability information indicating the temporal reliability of the image data.
  • the update frequency of the image data may be used as the reliability information.
  • the update frequency of the image data (for example, 10 minute interval, 1 hour interval, 1 day interval, etc.) may be managed instead of the time stamp.
  • a heat map is created in which the update frequency of the image data at each point included in the map display range (that is, the number of updates per unit time) is shown stepwise by color-coding. In this heat map, warm colors are used for points with a high update frequency, while cool colors are used for points with a low update frequency.
  • the user can easily visually determine whether or not the image data of each place included in the map display range is new with reference to the heat map. In other words, the user can visually discriminate between a point where the image data update frequency is high and a point where the image data update frequency is low with reference to the heat map.
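The update-frequency variant replaces time-stamp age with updates per unit time. A sketch follows, with interval boundaries derived from the update-frequency examples given in the text (10-minute, 1-hour, 1-day intervals) but otherwise assumed:

```python
def frequency_color(updates_per_day):
    """Map an update frequency (updates per day) to a heat-map color:
    warm colors for frequently updated points, cool for rarely updated."""
    if updates_per_day >= 144:   # ~10-minute interval
        return "red"
    if updates_per_day >= 24:    # ~1-hour interval
        return "pink"
    if updates_per_day >= 1:     # ~1-day interval
        return "yellow"
    return "colorless"
```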
  • a part of the image display process (FIG. 14) of the user terminal 5 may be executed by another computer.
  • for example, instead of transmitting the map drawing information (S57), the image management server 2 may execute the heat map creation (S107). The heat map created by the image management server 2 may then be transmitted to the user terminal 5. In this case, the heat map can be displayed on the HMD 200 while the heat map creation process in the user terminal 5 is omitted.

Abstract

Disclosed is an image providing system wherein an image acquisition device for acquiring image data, a server for managing the image data, and an image display device for displaying the image data are connected through a network. On the basis of the timing that the image data is acquired, reliability information indicating the time-related reliability of the image data is generated. The image data, positional information, and the reliability information are stored in the server. The image display device displays a composite image wherein a special image indicating the time-related reliability of the image data is added to map data, on the basis of the map data and the reliability information.

Description

Image providing system, image display device, image display program, and image display method
The present disclosure relates to an image providing system that provides images acquired at various locations, and to an image display device, an image display program, and an image display method for displaying images acquired at various locations.
Conventionally, a system capable of obtaining video of various locations is known (for example, see Patent Document 1). Specifically, when a location is designated on the image display device and an instruction to obtain current video is input, imaging data of the current video is acquired from an external video acquisition device existing at the designated location. When a location is designated and an instruction to obtain past video is input, imaging data of the past video is acquired from an imaging data storage unit or a center station. The image display device can thereby display the current video or past video of the designated location.
JP 2003-299156 A
 In the above system, when the current or past video of a specified location is displayed, newer video is more reliable as information, while older video is less reliable. That is, the acquisition date and time of the video data greatly affects how well the video matches the current townscape, scenery, conditions, and so on at the specified location (hereinafter referred to as the reliability of the video). However, it has been difficult for the user to grasp whether the video of the specified location is new or old, that is, to grasp the reliability of the video.
 The present disclosure has been made to solve the above problem, and aims to provide an image providing system, an image display device, and an image display program that allow a user to easily grasp the reliability of image data captured at each point on a map image.
 An image providing system according to a first aspect of the present disclosure is a system in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network. The image acquisition device includes: image data acquisition means that acquires the image data from imaging means that images the surroundings of the image acquisition device; position information acquisition means that acquires position information indicating the acquisition position of the image data from position acquisition means that acquires the current position of the image acquisition device; reliability information generation means that generates reliability information indicating the temporal reliability of the image data on the basis of the timing at which the image data acquisition means acquires the image data; and information transmission means that transmits to the server the image data acquired by the image data acquisition means, the position information acquired by the position information acquisition means, and the reliability information generated by the reliability information generation means. The server includes: storage control means that stores the image data, the position information, and the reliability information received from the image acquisition device in a first storage device in association with one another; and information transmission means that, in response to a request from the image display device, returns to the image display device map data stored in a second storage device together with the reliability information associated with the position information included in the display range of the map data. The image display device includes: information acquisition means that acquires the map data and the reliability information from the server by transmitting the request to the server; composite image generation means that, on the basis of the map data and the reliability information acquired by the information acquisition means, generates a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and composite image display means that displays the composite image generated by the composite image generation means on a first display device. Note that "imaging the surroundings of the image acquisition device" is not limited to the immediate vicinity of the image acquisition device in terms of distance; imaging performed in a given direction around the image acquisition device (a captured image may include subjects up to an infinite distance) also falls within the scope of "imaging the surroundings of the image acquisition device."
 According to the first aspect, the image data acquired by the image acquisition device is stored in the first storage device together with the position information and the reliability information. In response to a request from the image display device, the server returns the map data stored in the second storage device to the image display device, together with the reliability information associated with the position information included in the display range of the map data. The image display device generates a composite image on the basis of the map data and the reliability information acquired from the server, and displays it on the first display device. The composite image is an image in which a special display indicating the temporal reliability of the image data is added to the map data. Therefore, by referring to the composite image displayed on the first display device, the user of the image display device can easily grasp the reliability of the image data captured at each point on the map image.
 An image providing system according to a second aspect of the present disclosure is a system in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network. The image acquisition device includes: image data acquisition means that acquires the image data from imaging means that images the surroundings of the image acquisition device; position information acquisition means that acquires position information indicating the acquisition position of the image data from position acquisition means that acquires the current position of the image acquisition device; reliability information generation means that generates reliability information indicating the temporal reliability of the image data on the basis of the timing at which the image data acquisition means acquires the image data; and information transmission means that transmits to the server the image data acquired by the image data acquisition means, the position information acquired by the position information acquisition means, and the reliability information generated by the reliability information generation means. The server includes: storage control means that stores the image data, the position information, and the reliability information received from the image acquisition device in a first storage device in association with one another; composite image generation means that, in response to a request from the image display device, generates a composite image in which a special display indicating the temporal reliability of the image data is added to the map data, on the basis of map data stored in a second storage device and the reliability information associated with the position information included in the display range of the map data; and information transmission means that returns the composite image generated by the composite image generation means to the image display device. The image display device includes: information acquisition means that acquires the composite image from the server by transmitting the request to the server; and composite image display means that displays the composite image acquired by the information acquisition means on a first display device.
 According to the second aspect, the image data acquired by the image acquisition device is stored in the first storage device together with the position information and the reliability information. In response to a request from the image display device, the server generates a composite image on the basis of the map data stored in the second storage device and the reliability information associated with the position information included in the display range of the map data. The composite image is an image in which a special display indicating the temporal reliability of the image data is added to the map data. The composite image generated by the server is returned to the image display device and displayed on the first display device. Therefore, by referring to the composite image displayed on the first display device, the user of the image display device can easily grasp the reliability of the image data captured at each point on the map image.
 The reliability information generation means may generate, as the reliability information, a time stamp indicating the date and time at which the image data was acquired, and the composite image generation means may generate the composite image by adding to the map data a special display that makes image data whose time stamp indicates a recent acquisition date and time visually distinguishable on the map data from image data whose time stamp indicates an old acquisition date and time. In this case, by referring to the composite image, the user of the image display device can visually distinguish points where the acquisition date and time of the image data is recent from points where it is old.
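As a rough illustration (not taken from the disclosure), a special display distinguishing recent image data from old image data could map each time stamp's age to an overlay color; the thresholds and color names below are hypothetical.

```python
from datetime import datetime, timedelta

def age_to_color(timestamp: datetime, now: datetime) -> str:
    """Map the age of an image's time stamp to a hypothetical overlay color.

    Points with recently acquired image data get "warmer" colors so they
    are visually distinguishable from points with stale data. The
    thresholds and colors are illustrative assumptions.
    """
    age = now - timestamp
    if age <= timedelta(hours=1):
        return "red"      # very recent acquisition: high temporal reliability
    if age <= timedelta(days=1):
        return "orange"   # acquired within a day
    if age <= timedelta(days=30):
        return "yellow"   # acquired within a month
    return "gray"         # old data: low temporal reliability

now = datetime(2011, 7, 25, 12, 0)
print(age_to_color(datetime(2011, 7, 25, 11, 30), now))  # red
```

A renderer could then draw each acquisition position on the map data in the returned color, which is one simple way to realize the heat-map-style display.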
 The generation means may generate, as the reliability information, update frequency data indicating the frequency with which the image data is updated, and the composite image generation means may generate the composite image by adding to the map data a special display that makes image data with a high update frequency visually distinguishable on the map data from image data with a low update frequency. In this case, by referring to the composite image, the user of the image display device can visually distinguish points where the image data is updated frequently from points where it is updated infrequently.
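One hypothetical way to derive such update frequency data (the disclosure does not specify the computation) is to count how many stored image records fall in the same map cell; the 0.01-degree grid resolution below is an assumption for illustration.

```python
from collections import Counter

def update_frequency(records):
    """Count image records per (rounded) acquisition position.

    records: iterable of (latitude, longitude) acquisition positions.
    Rounding to two decimal places groups positions into roughly
    kilometer-scale cells; the resolution is an illustrative choice.
    """
    return Counter(
        (round(lat, 2), round(lon, 2)) for lat, lon in records
    )

freq = update_frequency([
    (35.6812, 139.7671),
    (35.6813, 139.7668),  # falls in the same 0.01-degree cell as above
    (34.7025, 135.4959),
])
print(freq[(35.68, 139.77)])  # 2
```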
 The image display device may include: position designation means for the user to designate a target position in the composite image displayed on the first display device; target data acquisition means that acquires, from the first storage device, target data that is the image data corresponding to the target position designated by the position designation means; and target data display means that displays the target data acquired by the target data acquisition means on a second display device. In this case, the user of the image display device can view the image data on the second display device after checking the temporal reliability of the image data at each point on the composite image.
 The image display device may include warning display means that displays a warning on the second display device when the temporal reliability indicated by the reliability information corresponding to the target data is lower than a predetermined value. In this case, the user of the image display device can reliably recognize that the temporal reliability of the image data at the designated point is low.
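A minimal sketch of such a warning check, assuming the reliability information is a time stamp and the predetermined value is an age threshold (the 90-day figure is hypothetical, not from the disclosure):

```python
from datetime import datetime, timedelta

# Hypothetical predetermined value: images older than 90 days are
# treated as having temporal reliability below the threshold.
RELIABILITY_THRESHOLD = timedelta(days=90)

def needs_warning(timestamp: datetime, now: datetime) -> bool:
    """Return True when the target data's temporal reliability is below
    the predetermined value, i.e. the image is older than the threshold,
    so a warning should be shown on the second display device."""
    return (now - timestamp) > RELIABILITY_THRESHOLD

now = datetime(2011, 7, 25)
print(needs_warning(datetime(2011, 1, 1), now))  # True: over 90 days old
```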
 The image display device may include target list display means that, when a plurality of pieces of target data are acquired by the target data acquisition means, displays the plurality of pieces of target data as a list or as thumbnails on the first display device, and the target data display means may display, on the second display device, the target data selected by the user from among the plurality of pieces of target data displayed by the target list display means. In this case, when a plurality of pieces of image data exist for the designated point, the user of the image display device can freely choose which image data to display on the second display device.
 An image display device according to a third aspect of the present disclosure includes: map data acquisition means that acquires map data by referring to a storage device that stores the map data; reliability information acquisition means that acquires the reliability information associated with the position information included in the display range of the map data by referring to a storage device that stores image data, position information indicating the acquisition position of the image data, and reliability information indicating the temporal reliability of the image data in association with one another; composite image generation means that, on the basis of the map data acquired by the map data acquisition means and the reliability information acquired by the reliability information acquisition means, generates a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and composite image display means that displays the composite image generated by the composite image generation means on a display device.
 According to the third aspect, the map data and the reliability information of the image data associated with the position information included in the display range of the map data are acquired from the storage devices. A composite image is generated on the basis of the acquired map data and reliability information and is displayed on the display device. The composite image is an image in which a special display indicating the temporal reliability of the image data is added to the map data. Therefore, by referring to the composite image displayed on the display device, the user can easily grasp the reliability of the image data captured at each point on the map image.
 An image display program according to a fourth aspect of the present disclosure causes a computer to execute: a map data acquisition step of acquiring map data by referring to a storage device that stores the map data; a reliability information acquisition step of acquiring the reliability information associated with the position information included in the display range of the map data by referring to a storage device that stores image data, position information indicating the acquisition position of the image data, and reliability information indicating the temporal reliability of the image data in association with one another; a composite image generation step of generating, on the basis of the map data acquired in the map data acquisition step and the reliability information acquired in the reliability information acquisition step, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and a composite image display step of displaying the composite image generated in the composite image generation step on a display device.
 According to the fourth aspect, the map data and the reliability information of the image data associated with the position information included in the display range of the map data are acquired from the storage devices. A composite image is generated on the basis of the acquired map data and reliability information and is displayed on the display device. The composite image is an image in which a special display indicating the temporal reliability of the image data is added to the map data. Therefore, by referring to the composite image displayed on the display device, the user can easily grasp the reliability of the image data captured at each point on the map image.
 An image display method according to a fifth aspect of the present disclosure is executed in an image providing system in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network, and includes: an image data acquisition step of acquiring the image data; a position information acquisition step of acquiring position information indicating the acquisition position of the image data; a reliability information generation step of generating reliability information indicating the temporal reliability of the image data on the basis of the timing at which the image data is acquired; a composite image generation step of generating, on the basis of map data and the reliability information associated with the position information included in the display range of the map data, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and a composite image display step of displaying the composite image generated in the composite image generation step on the image display device.
FIG. 1 shows a specific example of a street corner where a fixed point camera 10 is installed. FIG. 2 is an overall configuration diagram of an image providing system 1. FIG. 3 is a block diagram showing the electrical configuration of an image management server 2. FIG. 4 shows the data configuration of an image management database 70. FIG. 5 shows the data configuration of a position management database 80. FIG. 6 is a perspective view of a user terminal 5 as seen obliquely from above. FIG. 7 is a block diagram showing the electrical configuration of an HMD 200. FIG. 8 is a block diagram showing the electrical configuration of an HMD control terminal 100. FIG. 9 is a flowchart showing upload processing of the fixed point camera 10. FIG. 10 is a flowchart showing upload processing of the user terminal 5. FIG. 11 is a flowchart showing DB update processing of the image management server 2. FIG. 12 is a flowchart showing DB update processing of a position management server 3. FIG. 13 is a flowchart showing image providing processing of the image management server 2. FIG. 14 is a flowchart showing image display processing of the user terminal 5. FIG. 15 shows a specific example of map data 300. FIG. 16 shows a specific example of a heat map 310. FIG. 17 is a flowchart showing data display processing.
 Embodiments of the present disclosure will be described with reference to the drawings. The drawings referred to are used to explain technical features that the present disclosure can adopt, and are merely illustrative examples.
 The overall configuration of an image providing system 1 according to the present embodiment will be described with reference to FIGS. 1 and 2. The image providing system 1 provides a user, via a network, with image data captured at a point designated on a map image (hereinafter referred to as a designated point). In the image providing system 1, an image management server 2, a position management server 3, a plurality of user terminals 5, and a plurality of fixed point cameras 10 are connected via a network 4.
 The image management server 2 is a computer that manages image data captured in various locations, and is connected to the network 4 by wire. The position management server 3 is a computer that manages the current positions of the user terminals 5, and is connected to the network 4 by wire. The network 4 is the Internet, which can be connected to via a public network.
 The user terminal 5 is a small, lightweight terminal device carried by a user of the image providing system 1, and is connected to the network 4 wirelessly. The user terminal 5 acquires the image data of a point designated by the user on a map image from the image management server 2 or a fixed point camera 10 and displays it. The user terminal 5 also captures images at its current position at predetermined time intervals and transmits the captured image data to the image management server 2. The image data captured by the user terminal 5 therefore shows the townscape, scenery, and so on in the vicinity of the current position of the user terminal 5.
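The periodic capture-and-upload behavior of the user terminal 5 could be sketched as follows; the record layout, function names, and interval are illustrative assumptions, and network transmission is stood in for by a plain callback so the sketch is self-contained.

```python
import time
from datetime import datetime

CAPTURE_INTERVAL_SEC = 0.01  # the actual interval used by the system is unspecified

def run_upload_loop(capture, get_position, send, cycles):
    """Sketch of the user terminal's upload processing: at each interval,
    capture an image, attach the current position and a time stamp (the
    reliability information), and hand the record to the sender."""
    for _ in range(cycles):
        record = {
            "image": capture(),
            "position": get_position(),  # (latitude, longitude, direction)
            "timestamp": datetime.now().isoformat(),
        }
        send(record)  # in the real system this would go to the image management server
        time.sleep(CAPTURE_INTERVAL_SEC)

# Exercise the loop with stand-ins for the camera, GPS, and network.
sent = []
run_upload_loop(
    capture=lambda: b"jpeg-bytes",
    get_position=lambda: (35.6812, 139.7671, "N"),
    send=sent.append,
    cycles=3,
)
print(len(sent))  # 3
```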
 The fixed point cameras 10 are cameras installed in various locations, and are connected to the network 4 by wire. A fixed point camera 10 transmits captured image data to the image management server 2 and the user terminals 5. The fixed point camera 10 of the present embodiment is installed at a major street intersection and constantly captures images in the four directions (north, south, east, and west) of the intersection. The image data captured by the fixed point camera 10 therefore shows the road traffic conditions, pedestrian flow, and so on near the intersection where the fixed point camera 10 is installed.
 The electrical configuration of the image management server 2 will be described with reference to FIG. 3. The image management server 2 is a general-purpose server and includes a CPU 20, a ROM 21, a RAM 22, an HDD 23, a communication device 24, and an I/O interface 29. The ROM 21, the RAM 22, and the I/O interface 29 are each connected to the CPU 20. The HDD 23 and the communication device 24 are connected to the I/O interface 29. The communication device 24 is a controller that controls data communication via the network 4.
 The HDD 23 is a large-capacity hard disk drive provided with a program storage area 231, a map data storage area 232, an image data storage area 233, and the like. The program storage area 231 stores various programs for operating the image management server 2, including programs for executing the DB update processing (see FIG. 11) and the image providing processing (see FIG. 13) described later.
 The map data storage area 232 stores map data representing the map images on which the user designates points. In the present embodiment, a plurality of sets of map data are prepared in advance so that at least a map of the whole of Japan and maps of individual regions can be displayed at a plurality of scales. The CPU 20 can identify the position information (latitude and longitude) of an actual point corresponding to a display position on a map image displayed on the basis of the map data.
 The image data storage area 233 stores image data captured at various locations, and is provided with an image management database 70 for managing the image data. As shown in FIG. 4, the image management database 70 stores, for each piece of image data, a record containing the image data, position information indicating the acquisition position (that is, the shooting point) of the image data, and a time stamp indicating the acquisition date and time of the image data. The position information indicates the latitude and longitude of the shooting point and the camera direction at the time of shooting.
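A minimal sketch of such a record store, using an in-memory SQLite table; the schema and column names are assumptions for illustration, not the actual layout of the image management database 70.

```python
import sqlite3

# One record per image: the image data, its acquisition position
# (latitude, longitude, camera direction), and the acquisition time stamp.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE image_management (
        image      BLOB,
        latitude   REAL,
        longitude  REAL,
        direction  TEXT,
        timestamp  TEXT
    )
""")
conn.execute(
    "INSERT INTO image_management VALUES (?, ?, ?, ?, ?)",
    (b"jpeg-bytes", 35.6812, 139.7671, "E", "2011-07-25T12:00:00"),
)

# For a given map display range, the server can return the reliability
# information (time stamps) of all images whose position falls inside it.
rows = conn.execute(
    "SELECT latitude, longitude, timestamp FROM image_management "
    "WHERE latitude BETWEEN ? AND ? AND longitude BETWEEN ? AND ?",
    (35.0, 36.0, 139.0, 140.0),
).fetchall()
print(rows)
```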
 The electrical configuration of the position management server 3 will be described. The position management server 3 is a general-purpose server and, like the image management server 2 (FIG. 3), includes a CPU, a ROM, a RAM, an HDD, a communication device, and an I/O interface (not shown). However, the HDD of the position management server 3 is provided with a current position storage area (not shown) in addition to a program storage area. The program storage area stores various programs for operating the position management server 3.
 The current position storage area stores position information indicating the current position of each user terminal 5 (hereinafter referred to as current position data), and is provided with a position management database 80 for managing the current position data. As shown in FIG. 5, the position management database 80 holds, for each user terminal 5, a record containing a terminal ID unique to the user terminal 5 and the current position of the user terminal 5. The current position is the latest position information (latitude, longitude, and direction) acquired from the user terminal 5.
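Since only the latest position per terminal is kept, this database behaves like a map keyed by terminal ID; the following sketch (field layout and IDs are hypothetical) shows that an update simply overwrites the previous record.

```python
# One record per user terminal, keyed by its unique terminal ID and
# overwritten with the latest position information acquired from it.
position_db = {}

def update_current_position(terminal_id, latitude, longitude, direction):
    """Store only the most recent position for each terminal."""
    position_db[terminal_id] = (latitude, longitude, direction)

update_current_position("terminal-001", 35.0, 139.0, "N")
update_current_position("terminal-001", 35.1, 139.2, "E")  # overwrites the first record
print(position_db["terminal-001"])  # (35.1, 139.2, 'E')
```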
 The electrical configuration of the fixed point camera 10 will be described. Although not shown, the fixed point camera 10 includes a camera unit that captures images of the installation point and a controller that controls the camera unit and the like. The controller includes a CPU, a ROM, a RAM, and so on, and a control program is stored in the ROM. This control program causes the CPU to execute the image upload processing described later (see FIG. 9). The ROM also stores a terminal ID unique to the fixed point camera 10, the position information of the fixed point camera 10, and the like.
 The physical configuration of the user terminal 5 will be described with reference to FIG. 6. The user terminal 5 of the present embodiment includes a head mounted display (hereinafter referred to as HMD) 200 and an HMD control terminal 100. The HMD 200 is a glasses-type display device that a user wears on the head to view images. The HMD control terminal 100 is a small, lightweight electronic device connected to the HMD 200 via a cable 190, and handles display control of the HMD 200 and the like.
 本実施形態のHMD200は、所謂網膜走査型ディスプレイである。すなわち、HMD200は、ユーザに視認させる画像データの信号(以下、画像信号という。)に応じて変調されたレーザ光(以下、映像光Aという。)を走査して、ユーザの少なくとも一方の眼の網膜に出射する。これによりHMD200は、ユーザの網膜に画像を直接投影し、画像を視認させることができる。HMD200は、出射装置210と、ハーフミラー250と、頭部装着部220とを少なくとも備えている。頭部装着部220は、出射装置210を支持し、且つ、HMD200をユーザの頭部に固定する。ハーフミラー250は、出射装置210の光出射口に配置される。 The HMD 200 of this embodiment is a so-called retinal scanning display. That is, the HMD 200 scans laser light (hereinafter referred to as video light A) modulated in accordance with a signal of the image data to be visually recognized by the user (hereinafter referred to as an image signal), and emits it onto the retina of at least one of the user's eyes. The HMD 200 can thereby project an image directly onto the user's retina and allow the user to visually recognize the image. The HMD 200 includes at least an emission device 210, a half mirror 250, and a head mounting portion 220. The head mounting portion 220 supports the emission device 210 and fixes the HMD 200 to the user's head. The half mirror 250 is disposed at the light emission port of the emission device 210.
 出射装置210は、画像信号に応じた映像光Aを、ハーフミラー250に対し出射する。ハーフミラー250は、出射装置210に対して固定的な位置にある。ハーフミラー250は、出射装置210から出射した映像光Aを、ユーザの眼に向かって反射させる。ハーフミラー250は、例えば所定の反射率(例えば50%)となるように、透明樹脂板に対して金属薄膜を蒸着することで形成される。そのため、ハーフミラー250は、外界からの外光Bの一部を透過させてユーザの眼に導く。つまり、ハーフミラー250は、ユーザの側方から入射した映像光Aと、外界からの外光Bとを、ユーザの眼に入射させる。これによりユーザは、実際の視界と映像光Aに基づく画像とを視認可能となる。 The emission device 210 emits video light A corresponding to the image signal to the half mirror 250. The half mirror 250 is in a fixed position with respect to the emission device 210. The half mirror 250 reflects the image light A emitted from the emission device 210 toward the eyes of the user. The half mirror 250 is formed by evaporating a metal thin film on a transparent resin plate so as to have a predetermined reflectance (for example, 50%). Therefore, the half mirror 250 transmits part of the external light B from the outside and guides it to the user's eyes. In other words, the half mirror 250 causes the image light A incident from the side of the user and the external light B from the outside to enter the user's eyes. As a result, the user can visually recognize the actual field of view and the image based on the video light A.
 図7を参照して、HMD200の電気的構成について説明する。HMD200は、表示部240、機器接続インタフェイス243、フラッシュメモリ249、制御部246、カメラ207、および電源部247を備えている。制御部246は、CPU261、ROM262、およびRAM248を少なくとも備え、HMD200全体を制御する。制御部246は、ROM262に格納された各種プログラムをCPU261が読み出すことにより、後述の各処理を実行する。 The electrical configuration of the HMD 200 will be described with reference to FIG. The HMD 200 includes a display unit 240, a device connection interface 243, a flash memory 249, a control unit 246, a camera 207, and a power supply unit 247. The control unit 246 includes at least a CPU 261, a ROM 262, and a RAM 248, and controls the entire HMD 200. The control unit 246 executes various processes described below when the CPU 261 reads out various programs stored in the ROM 262.
 表示部240は、ユーザに画像を視認させる。表示部240は、画像信号処理部270、レーザ群272、およびレーザドライバ群271を備えている。画像信号処理部270は、画像信号を制御部246から受信する。画像信号処理部270は、受信した画像信号を、ユーザの網膜に直接投影するために必要な各信号に変換する。レーザ群272は、青色出力レーザ(Bレーザ)721、緑色出力レーザ(Gレーザ)722、赤色出力レーザ(Rレーザ)723を含む。レーザ群272は、青色、緑色および赤色のレーザ光を出力する。レーザドライバ群271は、レーザ群272からレーザ光を出力させるための制御を行う。画像信号処理部270は、レーザドライバ群271と電気的に接続している。以下では、Bレーザ721、Gレーザ722、およびRレーザ723を、レーザと総称する。 The display unit 240 allows the user to visually recognize the image. The display unit 240 includes an image signal processing unit 270, a laser group 272, and a laser driver group 271. The image signal processing unit 270 receives an image signal from the control unit 246. The image signal processing unit 270 converts the received image signal into signals necessary for direct projection onto the user's retina. The laser group 272 includes a blue output laser (B laser) 721, a green output laser (G laser) 722, and a red output laser (R laser) 723. The laser group 272 outputs blue, green, and red laser beams. The laser driver group 271 performs control for outputting laser light from the laser group 272. The image signal processing unit 270 is electrically connected to the laser driver group 271. Hereinafter, the B laser 721, the G laser 722, and the R laser 723 are collectively referred to as lasers.
 表示部240は、垂直走査ミラー812、垂直走査制御回路811、水平走査ミラー792、および水平走査制御回路791を備えている。垂直走査ミラー812は、レーザより出力されたレーザ光を垂直方向に反射させることによって走査を行う。垂直走査制御回路811は、垂直走査ミラー812の駆動制御を行う。水平走査ミラー792は、レーザより出力されたレーザ光を水平方向に反射させることによって走査を行う。水平走査制御回路791は、水平走査ミラー792の駆動制御を行う。画像信号処理部270は、垂直走査制御回路811および水平走査制御回路791と電気的に接続している。 The display unit 240 includes a vertical scanning mirror 812, a vertical scanning control circuit 811, a horizontal scanning mirror 792, and a horizontal scanning control circuit 791. The vertical scanning mirror 812 performs scanning by reflecting the laser beam output from the laser in the vertical direction. The vertical scanning control circuit 811 performs drive control of the vertical scanning mirror 812. The horizontal scanning mirror 792 performs scanning by reflecting the laser beam output from the laser in the horizontal direction. The horizontal scanning control circuit 791 performs drive control of the horizontal scanning mirror 792. The image signal processing unit 270 is electrically connected to the vertical scanning control circuit 811 and the horizontal scanning control circuit 791.
 画像信号処理部270は、制御部246とバスを介して電気的に接続している。従って、画像信号処理部270は、制御部246から受信した画像信号に応じた色およびタイミングで、各レーザよりレーザ光を出力させることができる。さらに、画像信号処理部270は、制御部246から受信した画像信号に応じた方向へ、レーザ光を反射させることができる。これにより、HMD200は、画像信号に応じた光束を2次元方向に走査し、走査した光をユーザの眼に導いて網膜上に表示画像を形成することができる。 The image signal processing unit 270 is electrically connected to the control unit 246 via a bus. Therefore, the image signal processing unit 270 can cause each laser to output laser light with a color and timing according to the image signal received from the control unit 246. Further, the image signal processing unit 270 can reflect the laser light in a direction corresponding to the image signal received from the control unit 246. As a result, the HMD 200 can scan a light beam corresponding to the image signal in a two-dimensional direction, and guide the scanned light to the user's eyes to form a display image on the retina.
 カメラ制御部299は、カメラ207およびカメラ制御回路208を備えている。カメラ207は、ユーザの視線と同一方向を撮影する(図6参照)。カメラ制御回路208は、カメラ207を制御する。カメラ制御回路208は、バスを介して制御部246と電気的に接続している。従って、制御部246は、カメラ207に撮影を実行させたり、カメラ207で撮影された画像データを取得したりすることができる。 The camera control unit 299 includes a camera 207 and a camera control circuit 208. The camera 207 captures the same direction as the user's line of sight (see FIG. 6). A camera control circuit 208 controls the camera 207. The camera control circuit 208 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can cause the camera 207 to perform shooting or acquire image data shot by the camera 207.
 GPS制御部289は、GPS受信機288およびGPS制御回路287を備えている。GPS受信機288は、HMD200の現在位置を示す緯度および経度を取得する。GPS制御回路287は、GPS受信機288を制御する。GPS制御回路287は、バスを介して制御部246と電気的に接続している。従って、制御部246は、GPS受信機288から現在位置の緯度および経度を取得することができる。 The GPS control unit 289 includes a GPS receiver 288 and a GPS control circuit 287. The GPS receiver 288 acquires the latitude and longitude indicating the current position of the HMD 200. The GPS control circuit 287 controls the GPS receiver 288. The GPS control circuit 287 is electrically connected to the control unit 246 through a bus. Therefore, the control unit 246 can acquire the latitude and longitude of the current position from the GPS receiver 288.
 加速度計制御部239は、加速度計238および加速度計制御回路237を備えている。加速度計238は、HMD200を装着したユーザが進行している方向(つまり、カメラ207の撮影方向)を取得する。加速度計制御回路237は、加速度計238を制御する。加速度計制御回路237は、バスを介して制御部246と電気的に接続している。従って、制御部246は、加速度計238からHMD200の向き(つまり、カメラ207の撮影方向)を取得することができる。 The accelerometer control unit 239 includes an accelerometer 238 and an accelerometer control circuit 237. The accelerometer 238 acquires the direction in which the user wearing the HMD 200 is traveling (that is, the shooting direction of the camera 207). The accelerometer control circuit 237 controls the accelerometer 238. The accelerometer control circuit 237 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can acquire the direction of the HMD 200 (that is, the shooting direction of the camera 207) from the accelerometer 238.
 機器接続インタフェイス243は、ケーブル190を介したデータ通信を制御するコントローラである。機器接続インタフェイス243は、バスを介して制御部246と電気的に接続している。従って、制御部246は、HMD制御端末100から画像データを受信して、その画像データをユーザに視認させることができる。また、制御部246は、カメラ207で撮影された画像データ、および撮影時の位置情報を、HMD制御端末100に送信することができる。 The device connection interface 243 is a controller that controls data communication via the cable 190. The device connection interface 243 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can receive image data from the HMD control terminal 100 and allow the user to visually recognize the image data. In addition, the control unit 246 can transmit image data captured by the camera 207 and position information at the time of capturing to the HMD control terminal 100.
 電源部247は、電池259および充電制御回路260を備えている。電池259は、HMD200を駆動する電源である。充電制御回路260は、電池259の電力をHMD200に供給する。フラッシュメモリ249、ビデオRAM244、およびフォントROM245は、制御部246とバスを介して電気的に接続されている。制御部246は、フラッシュメモリ249、ビデオRAM244、およびフォントROM245に記憶された情報を適宜参照可能である。 The power supply unit 247 includes a battery 259 and a charge control circuit 260. The battery 259 is a power source that drives the HMD 200. The charge control circuit 260 supplies the power of the battery 259 to the HMD 200. The flash memory 249, the video RAM 244, and the font ROM 245 are electrically connected to the control unit 246 via a bus. The control unit 246 can appropriately refer to information stored in the flash memory 249, the video RAM 244, and the font ROM 245.
 図8を参照して、HMD制御端末100の電気的構成について説明する。HMD制御端末100は、HMD200に画像データを供給する外部機器である。HMD制御端末100は、CPU120、ROM121、RAM122、HDD123、通信装置124、操作キー125、機器接続インタフェイス126、およびI/Oインタフェイス129を備える。ROM121、RAM122、およびI/Oインタフェイス129は、それぞれCPU120に接続されている。HDD123、通信装置124、操作キー125、および機器接続インタフェイス126は、I/Oインタフェイス129に接続されている。 The electrical configuration of the HMD control terminal 100 will be described with reference to FIG. The HMD control terminal 100 is an external device that supplies image data to the HMD 200. The HMD control terminal 100 includes a CPU 120, ROM 121, RAM 122, HDD 123, communication device 124, operation key 125, device connection interface 126, and I / O interface 129. The ROM 121, RAM 122, and I / O interface 129 are each connected to the CPU 120. The HDD 123, the communication device 124, the operation keys 125, and the device connection interface 126 are connected to the I / O interface 129.
 通信装置124は、ネットワーク4を介したデータ通信を制御するコントローラである。機器接続インタフェイス126は、ケーブル190を介したデータ通信を制御するコントローラである。HDD123は、大容量のハードディスクドライブであって、HMD制御端末100を動作させる各種プログラムや、HMD200に表示させる画像データが記憶されている。また、HDD123には、後述のアップロード処理(図10参照)や画像表示処理(図14参照)を実行するためのプログラムも記憶されている。 The communication device 124 is a controller that controls data communication via the network 4. The device connection interface 126 is a controller that controls data communication via the cable 190. The HDD 123 is a large-capacity hard disk drive that stores various programs for operating the HMD control terminal 100 and image data to be displayed on the HMD 200. The HDD 123 also stores programs for executing an upload process (see FIG. 10) and an image display process (see FIG. 14) described later.
 図9~図17を参照して、画像提供システム1で実行される処理について説明する。より詳細には、画像管理サーバ2、位置管理サーバ3、ユーザ端末5、および定点カメラ10でそれぞれ実行される処理を説明する。 Processing executed in the image providing system 1 will be described with reference to FIGS. More specifically, the processes executed by the image management server 2, the position management server 3, the user terminal 5, and the fixed point camera 10 will be described.
 図9を参照して、定点カメラ10で実行されるアップロード処理について説明する。定点カメラ10に電源が投入されると、定点カメラ10のROMに記憶されている制御プログラムに基づいて、定点カメラ10のCPUがアップロード処理(図9)を繰り返し実行する。 Referring to FIG. 9, the upload process executed by the fixed point camera 10 will be described. When the fixed point camera 10 is powered on, the CPU of the fixed point camera 10 repeatedly executes the upload process (FIG. 9) based on the control program stored in the ROM of the fixed point camera 10.
 図9に示すように、定点カメラ10のアップロード処理では、画像データのアップロードを行う所定タイミングであるか否かが判断される(S1)。具体的には、前回のデータアップロード(後述のステップS9)から所定の時間(例えば10分)経過するごとに、所定タイミングであると判断される(ステップS1:YES)。この場合、設置地点を常時撮影しているカメラ部から、現在撮影している映像を示す画像データが取得される(S3)。定点カメラ10に内蔵されているタイマ(図示外)の示す現在日時が、タイムスタンプとして取得される(S5)。定点カメラ10のROMに記憶されている位置情報(つまり、定点カメラ10の緯度、経度、および方向)が取得される(S7)。 As shown in FIG. 9, in the upload process of the fixed point camera 10, it is determined whether or not it is a predetermined timing for uploading the image data (S1). Specifically, every time a predetermined time (for example, 10 minutes) elapses from the previous data upload (step S9 described later), it is determined that the predetermined timing is reached (step S1: YES). In this case, image data indicating the video currently being shot is acquired from the camera unit that is always shooting the installation point (S3). The current date and time indicated by a timer (not shown) built in the fixed point camera 10 is acquired as a time stamp (S5). The positional information (that is, the latitude, longitude, and direction of the fixed point camera 10) stored in the ROM of the fixed point camera 10 is acquired (S7).
 ステップS7の実行後、画像管理サーバ2へのデータアップロードが実行される(S9)。具体的には、ステップS3で取得された画像データと、ステップS5で取得されたタイムスタンプと、ステップS7で取得された位置情報とを含むデータファイルが、ネットワーク4を介して画像管理サーバ2へ送信される。ステップS9の実行後、または所定タイミングでない場合(S1:NO)、処理はステップS1に戻る。以上の処理により、定点カメラ10では、設置地点の現在の様子を映し出した画像データが、10分間隔で画像管理サーバ2にアップロードされる。 After the execution of step S7, data upload to the image management server 2 is executed (S9). Specifically, a data file including the image data acquired in step S3, the time stamp acquired in step S5, and the position information acquired in step S7 is transmitted to the image management server 2 via the network 4. After the execution of step S9, or when it is not the predetermined timing (S1: NO), the process returns to step S1. Through the above processing, the fixed point camera 10 uploads image data showing the current state of its installation point to the image management server 2 at 10-minute intervals.
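As an illustration only (the embodiment does not prescribe an implementation), the data file assembled in steps S3 to S7 can be sketched in Python as follows; the dict keys, and the time stamp format YYMMDDhhmmss inferred from the example value "100707120230" used later, are assumptions:

```python
from datetime import datetime

def build_data_file(capture_frame, position):
    """Bundle the data uploaded in step S9: the current image (S3),
    a time stamp (S5), and the camera's fixed position from ROM (S7).
    Key names and the YYMMDDhhmmss stamp format are assumptions."""
    return {
        "image": capture_frame(),                              # S3
        "timestamp": datetime.now().strftime("%y%m%d%H%M%S"),  # S5
        "position": position,                                  # S7: (lat, lon, direction)
    }

# S1/S9 sketch: every 10 minutes, build a data file and upload it.
# while True:
#     if seconds_since_last_upload() >= 10 * 60:                 # S1
#         upload(build_data_file(camera.read, CAMERA_POSITION))  # S9
```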
 図10を参照して、ユーザ端末5で実行されるアップロード処理について説明する。ユーザ端末5(つまり、HMD制御端末100およびHMD200)に電源が投入されると、HDD123に記憶されているプログラムに基づいて、CPU120が本処理を繰り返し実行する。 The upload process executed by the user terminal 5 will be described with reference to FIG. 10. When the user terminal 5 (that is, the HMD control terminal 100 and the HMD 200) is powered on, the CPU 120 repeatedly executes this process based on a program stored in the HDD 123.
 図10に示すように、ユーザ端末5のアップロード処理では、図9のステップS1~9と同様の処理が実行される(S11~S19)。具体的には、前回のデータアップロード(後述のステップS19)から10分経過するごとに、所定タイミングであると判断される(ステップS11:YES)。なお、ステップS11では、経過時間以外の条件が満たされた場合に(例えば、ユーザが操作キー125を介して、カメラ207を用いて撮像を行った場合)、所定タイミングであると判断されてもよい。この場合、HMD200のカメラ207に現在位置を撮影させ、カメラ207で撮影された画像データがHMD200から取得される(S13)。HMD制御端末100に内蔵されているタイマ(図示外)の示す現在日時が、タイムスタンプとして取得される(S15)。 As shown in FIG. 10, in the upload process of the user terminal 5, processes similar to steps S1 to S9 in FIG. 9 are executed (S11 to S19). Specifically, every time 10 minutes have passed since the previous data upload (step S19 described later), it is determined that the predetermined timing has been reached (step S11: YES). In step S11, it may also be determined that the predetermined timing has been reached when a condition other than the elapsed time is satisfied (for example, when the user performs imaging with the camera 207 via the operation keys 125). In this case, the camera 207 of the HMD 200 is caused to photograph the current location, and the image data captured by the camera 207 is acquired from the HMD 200 (S13). The current date and time indicated by a timer (not shown) built into the HMD control terminal 100 is acquired as a time stamp (S15).
 さらに、GPS受信機288に現在位置の緯度および経度を計測させる。加速度計238にユーザ端末5の方向を計測させる。このように計測された緯度、経度および方向が、HMD200から現在位置の位置情報として取得される(S17)。ステップS13、S15、S17でそれぞれ取得された情報を含むデータファイルが、ネットワーク4を介して画像管理サーバ2へ送信される(S19)。 Further, the GPS receiver 288 is caused to measure the latitude and longitude of the current position. The accelerometer 238 is caused to measure the direction of the user terminal 5. The latitude, longitude, and direction measured in this way are acquired from the HMD 200 as position information of the current position (S17). A data file including the information acquired in steps S13, S15, and S17 is transmitted to the image management server 2 via the network 4 (S19).
 ただし、ステップS19の実行後、現在位置通知がネットワーク4を介して位置管理サーバ3へ送信される(S21)。現在位置通知は、ユーザ端末5に固有の端末IDと、ステップS17で取得された位置情報とを含む。ステップS21の実行後、または所定タイミングでない場合(S11:NO)、処理はステップS11に戻る。以上の処理により、ユーザ端末5では、ユーザ端末5の現在位置の様子を映し出した画像データが、10分間隔で画像管理サーバ2にアップロードされる。また、ユーザ端末5の現在位置が、10分間隔で位置管理サーバ3に通知される。 Note that, after the execution of step S19, a current position notification is transmitted to the position management server 3 via the network 4 (S21). The current position notification includes the terminal ID unique to the user terminal 5 and the position information acquired in step S17. After the execution of step S21, or when it is not the predetermined timing (S11: NO), the process returns to step S11. Through the above processing, the user terminal 5 uploads image data showing the state of its current location to the image management server 2 at 10-minute intervals, and notifies the position management server 3 of its current position at 10-minute intervals.
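The extra step S21 differs from the fixed point camera's upload only in the notification sent to the position management server 3. A minimal sketch of that payload (the key names are hypothetical, not taken from the embodiment):

```python
def build_position_notification(terminal_id, latitude, longitude, direction):
    """Step S21: the current position notification carries the terminal ID
    unique to the user terminal 5 plus the position acquired in step S17."""
    return {
        "terminal_id": terminal_id,   # e.g. "hmd001"
        "position": {
            "latitude": latitude,     # measured by the GPS receiver 288
            "longitude": longitude,   # measured by the GPS receiver 288
            "direction": direction,   # measured by the accelerometer 238
        },
    }
```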
 図11を参照して、画像管理サーバ2で実行されるDB更新処理について説明する。画像管理サーバ2に電源が投入されると、HDD33に記憶されているプログラムに基づいて、CPU20が本処理を繰り返し実行する。 Referring to FIG. 11, the DB update process executed by the image management server 2 will be described. When the image management server 2 is powered on, the CPU 20 repeatedly executes this process based on a program stored in the HDD 33.
 図11に示すように、画像管理サーバ2のDB更新処理では、データファイルのアップロードがあるか否かが判断される(S31)。具体的には、ユーザ端末5または定点カメラ10からデータファイルの送信がある場合、アップロードがあると判断される(S31:YES)。この場合、アップロードされたデータファイルが受信され(S33)、画像管理データベース70が更新される(S35)。ステップS35の実行後、またはアップロードがない場合(S31:NO)、処理はステップS31に戻って、データファイルのアップロードを待ち受ける。以上の処理により、画像管理サーバ2は、定点カメラ10およびユーザ端末5から各地の画像データを収集することができる。 As shown in FIG. 11, in the DB update process of the image management server 2, it is determined whether or not there is a data file upload (S31). Specifically, when a data file is transmitted from the user terminal 5 or the fixed point camera 10, it is determined that there is an upload (S31: YES). In this case, the uploaded data file is received (S33), and the image management database 70 is updated (S35). After execution of step S35 or when there is no upload (S31: NO), the process returns to step S31 and waits for the upload of the data file. Through the above processing, the image management server 2 can collect image data of various locations from the fixed point camera 10 and the user terminal 5.
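Steps S31 to S35 amount to registering each received data file as one database record. A hedged sketch, with an in-memory list standing in for the image management database 70 and hypothetical field names:

```python
image_db = []  # in-memory stand-in for the image management database 70

def on_data_file_upload(data_file, image_name):
    """Steps S33-S35: receive an uploaded data file and register one
    record associating the image with its position and time stamp."""
    image_db.append({
        "image_data": image_name,             # e.g. "camera_a.jpg"
        "position": data_file["position"],    # where the image was captured
        "timestamp": data_file["timestamp"],  # when the image was captured
    })
```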
 図4に示す例では、定点カメラ10「camera_a」からアップロードされたデータファイルに基づいて、画像データ「camera_a.jpg」を含むレコードが画像管理データベース70に登録されている。このレコードには、定点カメラ10「camera_a」の設置位置を示す位置情報と、画像データ「camera_a.jpg」の取得日時を示すタイムスタンプとが設定されている。 In the example shown in FIG. 4, a record including the image data “camera_a.jpg” is registered in the image management database 70 based on the data file uploaded from the fixed point camera 10 “camera_a”. In this record, position information indicating the installation position of the fixed point camera 10 “camera_a” and a time stamp indicating the acquisition date and time of the image data “camera_a.jpg” are set.
 また、ユーザA(図2参照)が装着しているユーザ端末5「hmd001」からアップロードされたデータファイルに基づいて、画像データ「hmd001_a.jpg」を含むレコードが画像管理データベース70に登録されている。このレコードには、画像データ「hmd001_a.jpg」について、その取得時のユーザ端末5「hmd001」の位置を示す位置情報と、その取得日時を示すタイムスタンプとが設定されている。 Further, based on the data file uploaded from the user terminal 5 "hmd001" worn by the user A (see FIG. 2), a record including the image data "hmd001_a.jpg" is registered in the image management database 70. In this record, position information indicating the position of the user terminal 5 "hmd001" when the image data "hmd001_a.jpg" was acquired, and a time stamp indicating the acquisition date and time, are set.
 図12を参照して、位置管理サーバ3で実行されるDB更新処理について説明する。位置管理サーバ3に電源が投入されると、HDDに記憶されているプログラムに基づいて、CPUが本処理を繰り返し実行する。 The DB update process executed by the position management server 3 will be described with reference to FIG. 12. When the position management server 3 is powered on, its CPU repeatedly executes this process based on a program stored in its HDD.
 図12に示すように、位置管理サーバ3のDB更新処理では、現在位置通知があるか否かが判断される(S41)。具体的には、ユーザ端末5から現在位置通知を受信した場合、現在位置通知があると判断される(S41:YES)。この場合、受信した現在位置通知に基づいて、位置管理データベース80が更新される(S43)。ステップS43の実行後、または現在位置通知がない場合(S41:NO)、処理はステップS41に戻って、現在位置通知の受信を待ち受ける。以上の処理により、位置管理サーバ3は、各地に固定されている定点カメラ10の設置位置、および各地に移動可能なユーザ端末5の現在位置を常時把握することができる。 As shown in FIG. 12, in the DB update process of the position management server 3, it is determined whether there is a current position notification (S41). Specifically, when a current position notification is received from the user terminal 5, it is determined that there is a current position notification (S41: YES). In this case, the position management database 80 is updated based on the received current position notification (S43). After the execution of step S43, or when there is no current position notification (S41: NO), the process returns to step S41 and waits for reception of a current position notification. Through the above processing, the position management server 3 can always grasp the installation positions of the fixed point cameras 10 fixed at their respective locations and the current positions of the user terminals 5, which can move from place to place.
 図5に示す例では、ユーザA(図2参照)が装着しているユーザ端末5「hmd001」から受信した現在位置通知に基づいて、端末ID「hmd001」を含むレコードが位置管理データベース80に登録されている。このレコードには、ユーザ端末5「hmd001」の現在位置を示す位置情報が設定されている。 In the example shown in FIG. 5, a record including the terminal ID "hmd001" is registered in the position management database 80 based on the current position notification received from the user terminal 5 "hmd001" worn by the user A (see FIG. 2). In this record, position information indicating the current position of the user terminal 5 "hmd001" is set.
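Because the position management database 80 keeps only the latest position per terminal, step S43 behaves as an upsert keyed by terminal ID. Sketched (an in-memory dict stands in for the database; the notification layout is the hypothetical one above):

```python
position_db = {}  # in-memory stand-in for the position management database 80

def on_position_notification(notification):
    """Step S43: overwrite this terminal's record with its newest position,
    so exactly one (latest) position is retained per terminal ID."""
    position_db[notification["terminal_id"]] = notification["position"]
```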
 図13を参照して、画像管理サーバ2で実行される画像提供処理について説明する。画像管理サーバ2に電源が投入されると、HDD33に記憶されているプログラムに基づいて、CPU20が本処理を繰り返し実行する。 Referring to FIG. 13, the image providing process executed by the image management server 2 will be described. When the image management server 2 is powered on, the CPU 20 repeatedly executes this process based on a program stored in the HDD 33.
 図13に示すように、画像管理サーバ2の画像提供処理では、地図要求を受信したか否かが判断される(S51)。地図要求は、ユーザ端末5が画像管理サーバ2に対して地図データを要求する信号であり、ユーザが指定した地図画像の表示範囲(以下、地図表示範囲という。)を含む。地図要求を受信した場合(S51:YES)、地図データ記憶エリア232から地図要求に含まれる地図表示範囲を示す地図データが取得される(S53)。 As shown in FIG. 13, in the image providing process of the image management server 2, it is determined whether a map request is received (S51). The map request is a signal for requesting map data from the image management server 2 by the user terminal 5, and includes a display range of a map image designated by the user (hereinafter referred to as a map display range). When a map request is received (S51: YES), map data indicating the map display range included in the map request is acquired from the map data storage area 232 (S53).
 ステップS53で取得された地図データの地図表示範囲に対応するタイムスタンプが、画像管理データベース70から取得される(S55)。具体的には、地図データの地図表示範囲に含まれる緯度および経度を有する全ての画像データが、画像管理データベース70から検索される。検索にヒットした画像データに対応付けられているタイムスタンプが、画像管理データベース70から取得される。ステップS55の実行後、地図描画情報がネットワーク4を介して要求元のユーザ端末5へ送信される(S57)。地図描画情報は、ユーザ端末5で地図画像を表示するのに用いられる情報であり、ステップS53で取得された地図データと、ステップS55で取得されたタイムスタンプとを含む。 The time stamp corresponding to the map display range of the map data acquired in step S53 is acquired from the image management database 70 (S55). Specifically, all image data having latitude and longitude included in the map display range of the map data is searched from the image management database 70. A time stamp associated with the image data hit in the search is acquired from the image management database 70. After execution of step S55, map drawing information is transmitted to the requesting user terminal 5 via the network 4 (S57). The map drawing information is information used to display a map image on the user terminal 5, and includes the map data acquired in step S53 and the time stamp acquired in step S55.
 例えば、地図表示範囲が緯度「+33 35 42.374」および経度「+130 24 21.035」を含む場合、この緯度および経度と合致する位置情報を含む全てのレコードからタイムスタンプが取得される(S55)。図4に示す画像管理データベース70の例では、タイムスタンプ「100707120230」が取得される。ステップS57では、地図表示範囲に対応する地図データが、地図表示範囲内の各地点における画像取得日時を示すタイムスタンプとともに、要求元のユーザ端末5へ送信される。 For example, when the map display range includes the latitude "+33 35 42.374" and the longitude "+130 24 21.035", time stamps are acquired from all records whose position information matches this latitude and longitude (S55). In the example of the image management database 70 shown in FIG. 4, the time stamp "100707120230" is acquired. In step S57, the map data corresponding to the map display range is transmitted to the requesting user terminal 5 together with the time stamps indicating the image acquisition date and time at each point within the map display range.
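Step S55 is essentially a range query over the position fields of the image management database 70. A minimal sketch, assuming the record layout used earlier and leaving the geometry of the map display range abstract as a predicate:

```python
def timestamps_in_range(records, in_display_range):
    """Step S55: collect the time stamps of all records whose latitude and
    longitude fall within the requested map display range.

    records          -- image management database records (dicts as above)
    in_display_range -- predicate taking a (latitude, longitude) pair
    """
    return [r["timestamp"] for r in records
            if in_display_range((r["position"][0], r["position"][1]))]
```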
 ステップS57の実行後、画像要求を受信したか否かが判断される(S59)。画像要求は、ユーザ端末5が画像管理サーバ2に対して画像データを要求する信号であり、ユーザが指定した地点(以下、指定地点という。)の位置情報を含む。画像要求を受信した場合(S59:YES)、指定地点の位置情報に対応付けられている画像データが、画像管理データベース70を参照して特定される(S61)。ステップS61で特定された画像データのリストである画像データ一覧が、ネットワーク4を介して要求元のユーザ端末5へ送信される(S63)。 After step S57 is executed, it is determined whether an image request has been received (S59). The image request is a signal for the user terminal 5 to request image data from the image management server 2 and includes position information of a point designated by the user (hereinafter referred to as a designated point). When the image request is received (S59: YES), the image data associated with the position information of the designated point is specified with reference to the image management database 70 (S61). A list of image data, which is a list of image data specified in step S61, is transmitted to the requesting user terminal 5 via the network 4 (S63).
 例えば、画像要求が緯度「+33 35 42.374」、経度「+130 24 21.035」、方向「北」を含む場合、この緯度、経度および方向と合致する位置情報を含む全てのレコードから画像データが特定される(S61)。図4に示す画像管理データベース70の例では、画像データ「camera_a.jpg」が特定される。ステップS63では、これらの画像データをそれぞれ縮小表示したサムネイル画像およびその取得日時を示す画像データ一覧が、要求元のユーザ端末5へ送信される。 For example, when the image request includes the latitude "+33 35 42.374", the longitude "+130 24 21.035", and the direction "north", image data is identified from all records whose position information matches this latitude, longitude, and direction (S61). In the example of the image management database 70 shown in FIG. 4, the image data "camera_a.jpg" is identified. In step S63, an image data list showing a thumbnail image (a reduced version) of each of these image data together with its acquisition date and time is transmitted to the requesting user terminal 5.
 ステップS63の実行後、画像選択指示を受信したか否かが判断される(S65)。画像選択指示は、画像データ一覧からユーザが選択した画像データの識別情報を含む。CPU20は、画像選択指示に含まれる識別情報に基づいて、ステップS61で特定された画像データのうちでユーザが選択したものを特定できる。画像選択指示を受信した場合(S65:YES)、ユーザが選択した画像データが現在の画像であるか否かが判断される(S67)。具体的には、ユーザが選択した画像データのタイムスタンプが、画像管理データベース70を参照して特定される。特定されたタイムスタンプから現在日時までの経過時間が所定値内(本実施形態では、10分未満)を示す場合、画像データが現在の画像であると判断される(S67:YES)。 After the execution of step S63, it is determined whether an image selection instruction has been received (S65). The image selection instruction includes identification information of the image data selected by the user from the image data list. Based on the identification information included in the image selection instruction, the CPU 20 can identify which of the image data identified in step S61 the user selected. When an image selection instruction is received (S65: YES), it is determined whether the image data selected by the user is a current image (S67). Specifically, the time stamp of the selected image data is identified with reference to the image management database 70. When the elapsed time from the identified time stamp to the current date and time is within a predetermined value (in this embodiment, less than 10 minutes), the image data is determined to be a current image (S67: YES).
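The "current image" test of step S67 compares the record's time stamp against the current date and time. A sketch, assuming the YYMMDDhhmmss stamp format inferred from the example values in this description (e.g. "100707120230" reading as 2010-07-07 12:02:30):

```python
from datetime import datetime, timedelta

FRESHNESS_LIMIT = timedelta(minutes=10)  # predetermined value in this embodiment

def is_current_image(timestamp, now):
    """Step S67: an image counts as current when less than 10 minutes
    have elapsed since its time stamp (assumed format: YYMMDDhhmmss)."""
    taken = datetime.strptime(timestamp, "%y%m%d%H%M%S")
    return now - taken < FRESHNESS_LIMIT

now = datetime(2010, 7, 7, 12, 10, 0)
is_current_image("100707120230", now)  # 7.5 minutes old: current -> S69 (live video)
is_current_image("100707080513", now)  # hours old: not current -> S71 (stored image)
```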
 この場合、指定地点に対して、ネットワーク4を介して画像提供指示が送信される(S69)。画像提供指示は、指定地点に存在する画像取得装置に対して、要求元のユーザ端末5に、撮影中の画像データ(本実施形態では、リアルタイムに再生可能な動画である映像データ)を提供することを指示する信号である。より具体的に、ステップS69では、CPU20は、指定地点に存在する画像取得装置を、位置管理サーバ3に問い合わせる。位置管理サーバ3のCPUは、画像管理サーバ2からの問い合わせに応じて、位置管理データベース80を参照して指定地点に対応する画像取得装置を特定して、画像管理サーバ2に通知する。CPU20は、位置管理サーバ3から通知された画像取得装置に対して、要求元のユーザ端末5に映像データをリアルタイムで送信するリクエストを送信する。 In this case, an image provision instruction is transmitted to the designated point via the network 4 (S69). The image provision instruction is a signal instructing the image acquisition device present at the designated point to provide the image data it is currently capturing (in this embodiment, video data, that is, a moving image that can be reproduced in real time) to the requesting user terminal 5. More specifically, in step S69, the CPU 20 inquires of the position management server 3 about the image acquisition device present at the designated point. In response to the inquiry from the image management server 2, the CPU of the position management server 3 refers to the position management database 80, identifies the image acquisition device corresponding to the designated point, and notifies the image management server 2. The CPU 20 then transmits, to the image acquisition device notified by the position management server 3, a request to transmit video data to the requesting user terminal 5 in real time.
 例えば、画像選択指示が示す画像データが「camera_a.jpg」である場合、タイムスタンプ「100707120230」と現在日時との時間差が10分未満である。そのため、指定地点の位置情報が緯度「+33 35 42.374」、経度「+130 24 21.035」、方向「北」である場合、この位置情報を含むレコードから画像取得装置が特定される(S67:YES、S69)。図5に示す位置管理データベース80の例では、画像取得装置として定点カメラ10「camera_a」が特定される。ステップS69では、定点カメラ10「camera_a」にリクエストを送信することで、定点カメラ10「camera_a」の映像データを要求元のユーザ端末5へリアルタイムに提供させる。 For example, when the image data indicated by the image selection instruction is "camera_a.jpg", the time difference between the time stamp "100707120230" and the current date and time is less than 10 minutes. Accordingly, when the position information of the designated point is the latitude "+33 35 42.374", the longitude "+130 24 21.035", and the direction "north", the image acquisition device is identified from the record including this position information (S67: YES, S69). In the example of the position management database 80 shown in FIG. 5, the fixed point camera 10 "camera_a" is identified as the image acquisition device. In step S69, a request is transmitted to the fixed point camera 10 "camera_a", causing it to provide its video data to the requesting user terminal 5 in real time.
 一方、画像データが現在の画像でない場合(S67:NO)、ユーザが選択した画像データが画像管理データベース70から読み出されて、要求元のユーザ端末5にネットワーク4を介して送信される(S71)。例えば、画像選択指示が示す画像データが「hmd_001.jpg」である場合、タイムスタンプ「100707080513」と現在日時との時間差が10分以上である(S67:NO)。そのため、ステップS71では、画像管理データベース70から画像データ「hmd_001.jpg」が読み出されて、要求元のユーザ端末5に送信される。 On the other hand, when the image data is not a current image (S67: NO), the image data selected by the user is read from the image management database 70 and transmitted to the requesting user terminal 5 via the network 4 (S71). For example, when the image data indicated by the image selection instruction is "hmd_001.jpg", the time difference between the time stamp "100707080513" and the current date and time is 10 minutes or more (S67: NO). Therefore, in step S71, the image data "hmd_001.jpg" is read from the image management database 70 and transmitted to the requesting user terminal 5.
 なお、ステップS69またはステップS71の実行後、処理はステップS51に戻る。同様に、地図要求を受信しない場合(S51:NO)、画像要求を受信しない場合(S59:NO)、および画像選択指示を受信しない場合も(S65:NO)、処理はステップS51に戻る。以上の処理により、画像管理サーバ2では、ユーザ端末5の要求に応じて、指定地点の地図データや画像データを提供できる。また、指定地点に画像取得装置が存在する場合には、指定地点の画像取得装置からユーザ端末5に映像データをリアルタイムに提供させることができる。 In addition, after execution of step S69 or step S71, the process returns to step S51. Similarly, if a map request is not received (S51: NO), an image request is not received (S59: NO), and an image selection instruction is not received (S65: NO), the process returns to step S51. With the above processing, the image management server 2 can provide map data and image data of a designated point in response to a request from the user terminal 5. When the image acquisition device is present at the designated point, the video data can be provided to the user terminal 5 in real time from the image acquisition device at the designated point.
 The image display process executed by the user terminal 5 will be described with reference to FIG. 14. The HDD 133 stores an application program that displays image data, map data, and the like provided from the image management server 2. The image display process is repeatedly executed by the CPU 120 once this application program is started.
 As shown in FIG. 14, in the image display process of the user terminal 5, it is first determined whether a map display operation has been performed (S101). A map display operation is an operation in which the user wearing the HMD 200 uses the operation keys 125 (see FIG. 8) to move, enlarge, or reduce the map image projected onto the user's eye (specifically, a heat map described later). When a map display operation has been performed (S101: YES), a map request is transmitted to the image management server 2 via the network 4 (S103). After step S103 is executed, it is determined whether map drawing information has been received (S105). When map drawing information has been received (S105: YES), a heat map is created (S107). A heat map is an image in which a special display indicating the temporal reliability of image data is added to map data. Temporal reliability refers to whether the elapsed time since the image data was acquired is short (that is, whether the image data is recent).
 For example, when user B wearing the user terminal 5 (see FIG. 2) sets a map display range that includes latitude “+33 35 42.374” and longitude “+130 24 21.035”, a map request indicating this map display range is transmitted (S103). As described above, map drawing information including map data and time stamps is returned from the image management server 2. The CPU 120 creates a heat map based on the map drawing information received from the image management server 2 (S107).
 As shown in FIG. 15, the map data 300 included in the map drawing information is a map image representing the map display range. In step S107, the display area of each location included in the map display range is colored according to the time stamp of that location included in the map drawing information. In the present embodiment, a heat map is created that indicates, in stepwise color coding, how recently the image data of each location within the map display range was updated. In this heat map, locations with recent time stamps are given warm colors, while locations with old time stamps are given cool colors.
 Specifically, in the heat map 310 shown in FIG. 16, when a time stamp indicates less than ten minutes before the current date and time, the display area of the corresponding location is colored “red”. When a time stamp indicates ten minutes or more but less than one hour before the current date and time, the display area of the corresponding location is colored “pink”. When a time stamp indicates one hour or more but less than one day before the current date and time, the display area of the corresponding location is colored “yellow”. When a time stamp indicates one day or more before the current date and time, the display area of the corresponding location is left “colorless”. A location for which no time stamp exists has no image data capturing that location, so its display area is colored “black”.
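The color assignment above amounts to a simple threshold function over the age of each time stamp. The sketch below is illustrative (the function name and the use of `datetime` values are assumptions); the thresholds and colors themselves are those given in the description of the heat map 310.

```python
from datetime import datetime, timedelta

def heat_color(timestamp, now):
    """Map a location's time stamp (or None) to its heat-map color."""
    if timestamp is None:
        return "black"               # no image data exists for the location
    age = now - timestamp
    if age < timedelta(minutes=10):
        return "red"                 # less than ten minutes old
    if age < timedelta(hours=1):
        return "pink"                # ten minutes to one hour old
    if age < timedelta(days=1):
        return "yellow"              # one hour to one day old
    return "colorless"               # one day or more old
```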
 By referring to the heat map 310, the user can therefore easily determine visually whether the image data of each location included in the map display range is recent. In other words, by referring to the heat map 310, the user can visually distinguish locations where the image data was acquired recently from locations where it was acquired long ago.
 The heat map created in step S107 is displayed on the HMD 200 (S109). Specifically, the heat map is projected onto the eye of the user wearing the HMD 200. After step S109 is executed, it is determined whether a designated point and direction have been input (S111). For example, when the user operates a cursor with the operation keys 125 and designates a coordinate position and a shooting direction on the heat map, it is determined that a designated point and direction have been input (S111: YES). In this case, an image request for the image data of the designated point is transmitted to the image management server 2 via the network 4 (S113). After step S113 is executed, it is determined whether an image data list has been received (S115). When an image data list has been received (S115: YES), the image data list is displayed on the HMD 200 (S117).
 For example, when user B wearing the user terminal 5 (see FIG. 2) designates latitude “+33 35 42.374”, longitude “+130 24 21.035”, and direction “north” on the heat map, an image request including this position information is transmitted (S113). As described above, an image data list including thumbnail images and acquisition dates and times of the image data of the designated point (in the example of FIG. 4, the image data “camera_a.jpg” and the like) is returned from the image management server 2. Based on the image data list received from the image management server 2, the CPU 120 projects the image data of the designated point as thumbnails onto the eye of the user wearing the HMD 200 (S117).
 After step S117 is executed, it is determined whether image data has been selected (S119). For example, when the user operates a cursor with the operation keys 125 and selects a thumbnail image from the image data list, it is determined that image data has been selected (S119: YES). In this case, an image selection instruction including the identification information of the selected image data is transmitted to the image management server 2 via the network 4 (S121). After step S121 is executed, a data display process described later is executed (S123), and the process returns to step S101.
 Likewise, when there is no map display operation (S101: NO), when map drawing information has not been received (S105: NO), when no designated point and direction have been input (S111: NO), when an image data list has not been received (S115: NO), and when no target image has been selected (S119: NO), the process returns to step S101. Through the above processing, the user terminal 5 can display, in response to a user request, a heat map indicating the temporal reliability of the image data capturing each point on the map image.
 As shown in FIG. 17, in the data display process (S123), it is determined whether image data has been received (S151). When image data has been received from the image management server 2 (S151: YES), it is determined whether the image data is older than a predetermined time (S153). Specifically, when the acquisition date and time indicated by the time stamp of the image data is one month or more before the current date and time, the image data is determined to be older than the predetermined time (S153: YES). In this case, a predetermined warning is displayed on the HMD 200 (S155). An example of the predetermined warning is “This image was acquired more than one month ago, so it may differ from the current scenery.”
 After step S155 is executed, or when the image data is not older than the predetermined time (S153: NO), the received image data is displayed on the HMD 200 (S157). When the warning has been displayed in step S155, the image data is displayed together with the warning. On the other hand, when no image data has been received (S151: NO), it is determined whether video data has been received (S159). When video data has been received from the fixed-point camera 10 (S159: YES), the received video data is streamed on the HMD 200 (S161).
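A minimal sketch of the staleness check at S153-S155 follows. The approximation of “one month” as 30 days and the function name are assumptions made for illustration; the warning text paraphrases the example given in the description.

```python
from datetime import datetime, timedelta

# Assumed threshold: the description says "one month or more"; 30 days is
# used here only as an illustrative approximation.
WARNING_THRESHOLD = timedelta(days=30)

def staleness_warning(timestamp, now):
    """Return the warning string for old image data, or None if fresh enough."""
    if now - timestamp >= WARNING_THRESHOLD:   # S153: YES -> warn (S155)
        return ("This image was acquired more than one month ago, "
                "so it may differ from the current scenery.")
    return None                                # S153: NO -> display without warning
```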
 For example, when user B selects the image data “hmd_001.jpg” from the image data list, an image selection instruction including its identification information is transmitted (S121). As described above, the image data “hmd_001.jpg” is returned from the image management server 2. The CPU 120 projects the image data “hmd_001.jpg” received from the image management server 2 onto the eye of the user wearing the HMD 200 (S157).
 On the other hand, when user B selects the image data “camera_a.jpg” from the image data list, an image selection instruction including its identification information is transmitted (S121). As described above, real-time video data is transmitted from the fixed-point camera 10 “camera_a”. The CPU 120 projects the video data received from the fixed-point camera 10 “camera_a” onto the eye of the user wearing the HMD 200 (S161).
 After step S157 or step S161 is executed, or when no video data has been received (S159: NO), the data display process (FIG. 17) ends and the process returns to the image display process (FIG. 14). Through the above processing, the user terminal 5 can acquire the image data or video data of the designated point specified on the heat map from the image management server 2 or the fixed-point camera 10 and display it.
 As described above, according to the image providing system 1 of the present embodiment, image data acquired by the fixed-point camera 10 or the user terminal 5 is stored in the image management database 70 together with the position information and time stamp at the time of acquisition. In response to a request from the user terminal 5, the image management server 2 returns the map data stored in the map data storage area 232 to the user terminal 5. At this time, the time stamps associated with the position information included in the display range of the map data are also returned to the user terminal 5.
 The user terminal 5 generates a heat map based on the map data and time stamps acquired from the image management server 2 and displays it on the HMD 200. A heat map is an image in which a special display indicating the temporal reliability of image data is added to map data. Therefore, by referring to the heat map displayed on the HMD 200, the user of the user terminal 5 can easily grasp the reliability of the image data capturing each point on the map image.
 Furthermore, the image data of the designated point selected on the heat map is displayed on the HMD 200, so the user of the user terminal 5 can browse image data on the HMD 200 after confirming its temporal reliability for each point on the heat map. In addition, when the image data of the designated point selected on the heat map is older than the predetermined time, a warning is displayed on the HMD 200. The user can thus reliably recognize that the temporal reliability of the image data of the designated point is low. Moreover, when a plurality of pieces of image data exist for the designated point, the user can freely select the image data to be displayed on the HMD 200 from the image data list.
 In the above embodiment, the fixed-point camera 10 and the user terminal 5 each correspond to the “image acquisition device” of the present invention. The image management server 2 corresponds to the “server” of the present invention. The user terminal 5 corresponds to the “image display device” of the present invention. The CPU of the fixed-point camera 10 executing step S3 and the CPU 120 executing step S13 each correspond to the “image data acquisition means” of the present invention. The CPU of the fixed-point camera 10 executing step S7 and the CPU 120 executing step S17 each correspond to the “position information acquisition means” of the present invention. The CPU of the fixed-point camera 10 executing step S5 and the CPU 120 executing step S15 each correspond to the “reliability information generation means” of the present invention. The CPU of the fixed-point camera 10 executing step S9 and the CPU 120 executing step S19 each correspond to the “information transmission means” of the present invention.
 The CPU 20 executing step S35 corresponds to the “storage control means” of the present invention. The CPU 20 executing step S57 corresponds to the “information transmission means” of the present invention. The CPU 120 executing steps S103 and S105 corresponds to the “information acquisition means” of the present invention. The CPU 120 executing step S107 corresponds to the “composite image generation means” of the present invention. The CPU 120 executing step S109 corresponds to the “composite image display means” of the present invention. The CPU 120 executing step S111 corresponds to the “position designation means” of the present invention. The CPU 120 executing steps S151 and S159 corresponds to the “target data acquisition means” of the present invention. The CPU 120 executing steps S157 and S161 corresponds to the “target data display means” of the present invention. The CPU 120 executing step S155 corresponds to the “warning display means” of the present invention. The CPU 120 executing step S117 corresponds to the “target list display means” of the present invention.
 The CPU 120 executing steps S103 and S105 also corresponds to the “map data acquisition means” and the “reliability information acquisition means” of the present invention. Steps S103 and S105 correspond to the “map data acquisition step” and the “reliability information acquisition step” of the present invention. Step S107 corresponds to the “composite image generation step” of the present invention. Step S109 corresponds to the “composite image display step” of the present invention.
 The present invention is not limited to the above embodiment, and modifications are possible without departing from the gist of the invention. In the above embodiment, a retinal scanning display was exemplified as the HMD 200, but the display method can be changed. For example, a head-mounted display of another display method, such as a liquid crystal display or an organic EL (electroluminescence) display, may be used. Further, instead of the HMD 200, a display device that the user can carry, such as a mobile phone, a notebook computer, or a PDA, may be used.
 Also, in the above embodiment, a time stamp indicating the acquisition date and time of image data was exemplified as the reliability information indicating the temporal reliability of the image data. Alternatively, for example, the update frequency of the image data may be used as the reliability information. In this case, the image management database 70 may manage the update frequency of the image data (for example, ten-minute, one-hour, or one-day intervals) instead of the time stamps.
 In that case, when the heat map is created (S107), a heat map is created that indicates, in stepwise color coding, the update frequency (that is, the number of updates per unit time) of the image data of each location included in the map display range. In this heat map, frequently updated locations are given warm colors, while infrequently updated locations are given cool colors. By referring to this heat map, the user can thus easily determine visually whether the image data of each location included in the map display range is recent. In other words, the user can visually distinguish locations where the image data is updated frequently from locations where it is updated infrequently.
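The frequency-based variation can be sketched analogously. Mapping the example intervals (ten-minute, one-hour, one-day) onto the same palette as the time-stamp heat map is an assumption made for illustration; the description only states that frequent updates get warm colors and infrequent updates get cool colors.

```python
# Hypothetical sketch of the update-frequency coloring variation. The
# boundaries correspond to the example intervals above, expressed as
# updates per day: 144/day ~ every 10 minutes, 24/day ~ hourly, 1/day ~ daily.
def frequency_color(updates_per_day):
    """Warm colors for frequently updated locations, cool/neutral otherwise."""
    if updates_per_day is None:
        return "black"           # no image data for the location
    if updates_per_day >= 144:   # roughly ten-minute intervals or shorter
        return "red"
    if updates_per_day >= 24:    # roughly hourly intervals
        return "pink"
    if updates_per_day >= 1:     # roughly daily intervals
        return "yellow"
    return "colorless"           # updated less than once a day
```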
 Further, part of the image display process of the user terminal 5 (FIG. 14) may be executed by another computer. For example, instead of transmitting the map drawing information (S57), the image management server 2 may create the heat map (S107), and the heat map created by the image management server 2 may then be transmitted to the user terminal 5. In this case, the heat map can be displayed on the HMD 200 while the heat map creation process in the user terminal 5 is omitted.
DESCRIPTION OF SYMBOLS
1 Image providing system
2 Image management server
3 Position management server
4 Network
5 User terminal
10 Fixed-point camera
70 Image management database
80 Position management database
100 HMD control terminal
120 CPU
121 ROM
122 RAM
123 HDD
200 Head-mounted display
300 Map data
310 Heat map

Claims (10)

  1.  An image providing system in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network, wherein
     the image acquisition device comprises:
     image data acquisition means for acquiring the image data from imaging means that captures the surroundings of the image acquisition device;
     position information acquisition means for acquiring, from position acquisition means that acquires a current position of the image acquisition device, position information indicating an acquisition position of the image data;
     reliability information generation means for generating reliability information indicating temporal reliability of the image data, based on a timing at which the image data acquisition means acquires the image data; and
     information transmission means for transmitting, to the server, the image data acquired by the image data acquisition means, the position information acquired by the position information acquisition means, and the reliability information generated by the reliability information generation means,
     the server comprises:
     storage control means for storing, in a first storage device, the image data, the position information, and the reliability information received from the image acquisition device in association with one another; and
     information transmission means for returning, to the image display device in response to a request from the image display device, map data stored in a second storage device and the reliability information associated with the position information included in a display range of the map data, and
     the image display device comprises:
     information acquisition means for acquiring the map data and the reliability information from the server by transmitting the request to the server;
     composite image generation means for generating, based on the map data and the reliability information acquired by the information acquisition means, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and
     composite image display means for displaying, on a first display device, the composite image generated by the composite image generation means.
  2.  An image providing system in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network, wherein
     the image acquisition device comprises:
     image data acquisition means for acquiring the image data from imaging means that captures the surroundings of the image acquisition device;
     position information acquisition means for acquiring, from position acquisition means that acquires a current position of the image acquisition device, position information indicating an acquisition position of the image data;
     reliability information generation means for generating reliability information indicating temporal reliability of the image data, based on a timing at which the image data acquisition means acquires the image data; and
     information transmission means for transmitting, to the server, the image data acquired by the image data acquisition means, the position information acquired by the position information acquisition means, and the reliability information generated by the reliability information generation means,
     the server comprises:
     storage control means for storing, in a first storage device, the image data, the position information, and the reliability information received from the image acquisition device in association with one another;
     composite image generation means for generating, in response to a request from the image display device and based on map data stored in a second storage device and the reliability information associated with the position information included in a display range of the map data, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and
     information transmission means for returning, to the image display device, the composite image generated by the composite image generation means, and
     the image display device comprises:
     information acquisition means for acquiring the composite image from the server by transmitting the request to the server; and
     composite image display means for displaying, on a first display device, the composite image acquired by the information acquisition means.
  3.  The image providing system according to claim 1 or 2, wherein
     the reliability information generation means generates, as the reliability information, a time stamp indicating a date and time at which the image data was acquired, and
     the composite image generation means generates the composite image by adding to the map data the special display that makes image data whose acquisition date and time indicated by the time stamp is recent visually distinguishable, on the map data, from image data whose acquisition date and time indicated by the time stamp is old.
  4.  The image providing system according to claim 1 or 2, wherein
     the reliability information generation means generates, as the reliability information, update frequency data indicating a frequency at which the image data is updated, and
     the composite image generation means generates the composite image by adding to the map data the special display that makes image data whose update frequency indicated by the update frequency data is high visually distinguishable, on the map data, from image data whose update frequency indicated by the update frequency data is low.
  5.  The image providing system according to claim 1 or 2, wherein the image display device comprises:
     position designation means for a user to designate a target position in the composite image displayed on the first display device;
     target data acquisition means for acquiring, from the first storage device, target data that is the image data corresponding to the target position designated by the position designation means; and
     target data display means for displaying, on a second display device, the target data acquired by the target data acquisition means.
  6.  The image providing system according to claim 5, wherein the image display device comprises warning display means for displaying a warning on the second display device when the temporal reliability indicated by the reliability information corresponding to the target data is lower than a predetermined value.
  7.  The image providing system according to claim 5, wherein
     the image display device comprises target list display means for displaying, when a plurality of pieces of the target data are acquired by the target data acquisition means, the plurality of pieces of target data on the first display device in a list or as thumbnails, and
     the target data display means displays, on the second display device, the target data selected by a user from among the plurality of pieces of target data displayed by the target list display means.
  8.  An image display device comprising:
     map data acquisition means for acquiring map data by referring to a storage device that stores the map data;
     reliability information acquisition means for acquiring, by referring to a storage device that stores image data, position information indicating the acquisition position of the image data, and reliability information indicating the temporal reliability of the image data in association with one another, the reliability information associated with the position information included in the display range of the map data;
     composite image generation means for generating, based on the map data acquired by the map data acquisition means and the reliability information acquired by the reliability information acquisition means, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and
     composite image display means for displaying the composite image generated by the composite image generation means on a display device.
  9.  An image display program for causing a computer to execute:
     a map data acquisition step of acquiring map data by referring to a storage device that stores the map data;
     a reliability information acquisition step of acquiring, by referring to a storage device that stores image data, position information indicating the acquisition position of the image data, and reliability information indicating the temporal reliability of the image data in association with one another, the reliability information associated with the position information included in the display range of the map data;
     a composite image generation step of generating, based on the map data acquired in the map data acquisition step and the reliability information acquired in the reliability information acquisition step, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and
     a composite image display step of displaying the composite image generated in the composite image generation step on a display device.
  10.  An image display method executed in an image providing system in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network, the method comprising:
     an image data acquisition step of acquiring the image data;
     a position information acquisition step of acquiring position information indicating the acquisition position of the image data;
     a reliability information generation step of generating, based on the timing at which the image data is acquired, reliability information indicating the temporal reliability of the image data;
     a composite image generation step of generating, based on map data and the reliability information associated with the position information included in the display range of the map data, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and
     a composite image display step of displaying the composite image generated in the composite image generation step on the image display device.
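The reliability information generation step of claim 10 derives temporal reliability from the acquisition timestamp: newer image data is more reliable. One possible realization is an age-based decay; the exponential form and the half-life below are hypothetical choices, not taken from the claims:

```python
# Hypothetical sketch of the reliability information generation step:
# derive a temporal reliability score in (0, 1] from the acquisition timestamp.

def reliability_from_timestamp(acquired_at, now, half_life_s=86400.0):
    """Score decays exponentially with image age; half_life_s is illustrative."""
    age = max(0.0, now - acquired_at)
    return 0.5 ** (age / half_life_s)

now = 1_000_000.0
fresh = reliability_from_timestamp(now - 3600, now)       # acquired 1 hour ago
stale = reliability_from_timestamp(now - 7 * 86400, now)  # acquired 1 week ago
print(fresh > stale)  # → True
```

The resulting score would be stored alongside the image data and its position information, then consulted when rendering the special display on the map.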
PCT/JP2011/066833 2010-07-30 2011-07-25 Image providing system, image display device, image display program, and image display method WO2012014837A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010172521A JP2012034216A (en) 2010-07-30 2010-07-30 Image providing system, image display device, and image display program
JP2010-172521 2010-07-30

Publications (1)

Publication Number Publication Date
WO2012014837A1 true WO2012014837A1 (en) 2012-02-02

Family

ID=45530045

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/066833 WO2012014837A1 (en) 2010-07-30 2011-07-25 Image providing system, image display device, image display program, and image display method

Country Status (2)

Country Link
JP (1) JP2012034216A (en)
WO (1) WO2012014837A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016033611A (en) * 2014-07-31 2016-03-10 セイコーエプソン株式会社 Information provision system, display device, and method of controlling display device
JP6424100B2 (en) * 2015-01-29 2018-11-14 株式会社ゼンリンデータコム NAVIGATION SYSTEM, NAVIGATION DEVICE, GLASS TYPE DEVICE, AND METHOD FOR LINKING THE DEVICE

Citations (5)

Publication number Priority date Publication date Assignee Title
JPH11143941A (en) * 1997-11-06 1999-05-28 Hitachi Ltd Damage display device and medium recording processing program of its display device
JP2001336945A (en) * 2000-05-26 2001-12-07 Alpine Electronics Inc Navigation device
JP2003299156A (en) * 2002-04-05 2003-10-17 Matsushita Electric Ind Co Ltd External image acquisition system and apparatus used therefor
JP2008170930A (en) * 2006-12-12 2008-07-24 Asia Air Survey Co Ltd System for displaying image data associated with map information, and program for displaying image data associated with map information
JP2009058922A (en) * 2007-09-04 2009-03-19 Sony Corp Map information display device, map information display method, and program

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP4750014B2 (en) * 2006-12-28 2011-08-17 シャープ株式会社 Information display device, information providing server, information display system, information display device control method, information providing server control method, control program, and recording medium
JP5097507B2 (en) * 2007-11-05 2012-12-12 オリンパスイメージング株式会社 Information processing apparatus and control program for information processing apparatus

Also Published As

Publication number Publication date
JP2012034216A (en) 2012-02-16

Similar Documents

Publication Publication Date Title
RU2670784C9 (en) Orientation and visualization of virtual object
US10554829B2 (en) Information processing device, photographing device, image sharing system, and method of information processing
JP5423716B2 (en) Head mounted display
US10924691B2 (en) Control device of movable type imaging device and control method of movable type imaging device
WO2020261927A1 (en) Presentation system, presentation device, and presentation method
JP5532026B2 (en) Display device, display method, and program
EP3766255B1 (en) Method for obtaining information about a luminaire
US9699366B2 (en) Image providing apparatus, image display device, imaging system, image display system, and image providing method in which composite image data is generated using auxiliary image data generated by at least one auxiliary imaging unit
CN109495686A (en) Image pickup method and equipment
CN104487982B (en) The optical detection of wearable object based on user&#39;s wearing provides service
US8903957B2 (en) Communication system, information terminal, communication method and recording medium
JP7079125B2 (en) Aerial photography management system and program
WO2012014837A1 (en) Image providing system, image display device, image display program, and image display method
JP2013168854A (en) Imaging device, server device, and management system
JP5007631B2 (en) Electronic camera
JP6677684B2 (en) Video distribution system
JP2013021473A (en) Information processing device, information acquisition method, and computer program
CN114584702A (en) Method and system for shooting visible light and thermal imaging overlay
CN108012141A (en) The control method of display device, display system and display device
JP7365783B2 (en) Field information management system, control method for the field information management system, and control program for the field information management system
JP6450890B2 (en) Image providing system, image providing method, and program
CN111162840B (en) Method and system for setting virtual objects around optical communication device
US20180067623A1 (en) Headset device and visual feedback method and apparatus thereof
US20240134195A1 (en) Electronic device and method for obtaining media corresponding to location by controlling camera based on location
US20240133996A1 (en) Identification of wearable device locations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11812427

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11812427

Country of ref document: EP

Kind code of ref document: A1