WO2012014837A1 - Image providing system, image display device, image display program, and image display method - Google Patents

Image providing system, image display device, image display program, and image display method

Info

Publication number
WO2012014837A1
WO2012014837A1 (PCT/JP2011/066833; JP2011066833W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
data
image data
acquisition
reliability
Prior art date
Application number
PCT/JP2011/066833
Other languages
English (en)
Japanese (ja)
Inventor
知裕 佐藤
Original Assignee
ブラザー工業株式会社 (Brother Industries, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ブラザー工業株式会社 (Brother Industries, Ltd.)
Publication of WO2012014837A1

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N21/41 Structure of client; Structure of client peripherals
                • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
                  • H04N21/4223 Cameras
              • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
                • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
                  • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
                    • H04N21/4316 Generation of visual interfaces for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
              • H04N21/47 End-user applications
                • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
                  • H04N21/47202 End-user interface for requesting content on demand, e.g. video on demand
            • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
              • H04N21/65 Transmission of management data between client and server
                • H04N21/658 Transmission by the client directed to the server
                  • H04N21/6581 Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application
            • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
              • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
                • H04N21/84 Generation or processing of descriptive data, e.g. content descriptors

Definitions

  • the present disclosure relates to an image providing system that provides images acquired in various places, an image display device that displays images acquired in various places, an image display program, and an image display method.
  • Conventionally, a system that can obtain images from various locations is known (see, for example, Patent Document 1).
  • In this known system, when a location is specified and a current video acquisition instruction is input, the image display device acquires imaging data of the current video from an external video acquisition device at the specified location. When a past video is requested, imaging data of the past video is acquired from an imaging data storage unit or from the center station. The image display device can thus display the current or past video of the designated place.
  • When the current or past video of the specified location is displayed, the newer the video, the higher the reliability of the information, and the older the video, the lower the reliability. That is, the acquisition date and time of video data greatly affects how well the video matches the current cityscape, landscape, state, and so on of the designated location (hereinafter, this identity is referred to as video reliability).
  • The present disclosure has been made to solve the above problem, and an object thereof is to provide an image providing system, an image display device, and an image display program that allow a user to easily grasp the reliability of the image data obtained by photographing each point on a map image.
  • In one aspect, an image providing system is provided in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network.
  • The image acquisition device includes: image data acquisition means for acquiring the image data from imaging means that images the periphery of the image acquisition device; position information acquisition means for acquiring, from position acquisition means that acquires the current position of the image acquisition device, position information indicating the acquisition position of the image data; reliability information generation means for generating, based on the timing at which the image data acquisition means acquired the image data, reliability information indicating the temporal reliability of the image data; and information transmission means for transmitting the image data, the position information, and the reliability information to the server. The server stores the image data, the position information, and the reliability information received from the image acquisition device in association with each other in a first storage device, and, in response to a request from the image display device, returns the map data stored in a second storage device together with the reliability information associated with the position information included in the display range of the map data. The image display device includes: information acquisition means for acquiring the map data and the reliability information from the server by transmitting the request to the server; composite image generation means for generating, based on the acquired map data and reliability information, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and composite image display means for displaying the composite image generated by the composite image generation means on a first display device.
  • Here, "imaging the periphery of the image acquisition device" is not limited to the immediate vicinity of the device in terms of distance. Imaging performed in a predetermined direction from the image acquisition device (where the captured image may include objects at any distance, up to infinity) also falls within the scope of "imaging the periphery of the image acquisition device".
  • the image data acquired by the image acquisition device is stored in the first storage device together with the position information and the reliability information.
  • the map data stored in the second storage device is returned to the image display device in response to a request from the image display device.
  • the reliability information associated with the position information included in the display range of the map data is also returned to the image display device.
  • a composite image is generated based on the map data and reliability information acquired from the server and displayed on the first display device.
  • the composite image is an image in which a special display indicating temporal reliability of the image data is added to the map data. Therefore, the user of the image display device can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the composite image displayed on the first display device.
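As a concrete illustration of this client-side composition, the sketch below colors each photographed point on the map by the age of its time stamp. The color thresholds, function names, and data layout are illustrative assumptions, not taken from the embodiment (whose figures suggest the result is rendered as a heat map):

```python
from datetime import datetime, timezone

def age_color(acquired_at, now):
    """Illustrative 'special display': map the age of image data to a
    marker color; newer data (higher temporal reliability) -> red."""
    age_days = (now - acquired_at).total_seconds() / 86400
    if age_days < 1:
        return "red"        # acquired within the last day
    if age_days < 30:
        return "orange"
    if age_days < 365:
        return "yellow"
    return "blue"           # a year or older: lowest reliability

def compose(points, now):
    """Attach a color to each (lat, lon, timestamp) record returned by
    the server for the display range of the map data."""
    return [(lat, lon, age_color(ts, now)) for lat, lon, ts in points]

now = datetime(2011, 7, 23, tzinfo=timezone.utc)
points = [
    (35.1815, 136.9066, datetime(2011, 7, 22, 18, tzinfo=timezone.utc)),
    (35.1700, 136.8800, datetime(2009, 1, 1, tzinfo=timezone.utc)),
]
overlay = compose(points, now)
```

The overlay list would then be drawn on top of the map data by whatever rendering layer the display device provides.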
  • In another aspect, an image providing system is provided in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network.
  • The image acquisition device includes: image data acquisition means for acquiring the image data from imaging means that images the periphery of the image acquisition device; position information acquisition means for acquiring, from position acquisition means that acquires the current position of the image acquisition device, position information indicating the acquisition position of the image data; reliability information generation means for generating, based on the timing at which the image data acquisition means acquired the image data, reliability information indicating the temporal reliability of the image data; and information transmission means for transmitting the image data, the position information, and the reliability information to the server. The server stores the image data, the position information, and the reliability information received from the image acquisition device in association with each other in a first storage device. The server further includes composite image generation means for generating, in response to a request from the image display device, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data stored in a second storage device, using the reliability information associated with the position information included in the display range of the map data, and information transmission means for returning the composite image generated by the composite image generation means to the image display device. The image display device includes information acquisition means for acquiring the composite image from the server by transmitting the request to the server, and composite image display means for displaying the acquired composite image on a first display device.
  • the image data acquired by the image acquisition device is stored in the first storage device together with the position information and the reliability information.
  • The server generates a composite image from the map data stored in the second storage device and the reliability information associated with the position information included in the display range of the map data.
  • the composite image is an image in which a special display indicating temporal reliability of the image data is added to the map data.
  • the composite image generated by the server is returned to the image display device and displayed on the first display device. Therefore, the user of the image display device can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the composite image displayed on the first display device.
  • The reliability information generation means may generate, as the reliability information, a time stamp indicating the date and time at which the image data was acquired. The composite image generation means may then generate the composite image by adding to the map data a special display that makes image data with a newer acquisition date and time visually distinguishable, on the map data, from image data with an older acquisition date and time.
  • In this case, by referring to the composite image, the user of the image display device can visually distinguish points where the acquisition date and time of the image data is recent from points where it is old.
  • The reliability information generation means may generate, as the reliability information, update frequency data indicating the frequency at which the image data is updated. The composite image generation means may then generate the composite image by adding to the map data a special display that makes image data with a high update frequency visually distinguishable, on the map data, from image data with a low update frequency.
  • In this case, by referring to the composite image, the user of the image display device can visually distinguish points where the image data is updated frequently from points where it is updated rarely.
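A minimal sketch of such an update-frequency display in Python; the marker styles and the one-update-per-day threshold are assumptions for illustration only:

```python
def frequency_marker(updates_per_day, threshold=1.0):
    """Illustrative special display for update frequency: points updated
    at least `threshold` times per day get a filled marker, rarely
    updated points a hollow one. The threshold is an assumption."""
    return "filled" if updates_per_day >= threshold else "hollow"

# A fixed point camera uploading every five minutes vs. a point that a
# user terminal photographed only once in the past month:
markers = [frequency_marker(288.0), frequency_marker(1 / 30)]
```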
  • The image display device may include: position specifying means for the user to specify a target position; target data acquisition means for acquiring, from the first storage device, target data that is the image data corresponding to the target position specified by the position specifying means; and target data display means for displaying the target data acquired by the target data acquisition means on a second display device.
  • the user of the image display device can view the image data on the second display device after confirming the temporal reliability of the image data at each point on the composite image.
  • the image display device may include warning display means for displaying a warning on the second display device when the temporal reliability indicated by the reliability information corresponding to the target data is lower than a predetermined value. In this case, the user of the image display device can surely grasp that the temporal reliability of the image data at the designated point is low.
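The warning behavior can be sketched as follows, approximating "temporal reliability below a predetermined value" by the age of the time stamp; the 30-day limit and the message text are assumptions, not taken from the embodiment:

```python
from datetime import datetime, timezone

def check_reliability(acquired_at, now, max_age_days=30):
    """Return a warning string when the temporal reliability of the
    target data is lower than a predetermined value (here: the time
    stamp is older than max_age_days), otherwise None."""
    age_days = (now - acquired_at).total_seconds() / 86400
    if age_days > max_age_days:
        return "Warning: this image may not reflect the current state of the location."
    return None

now = datetime(2011, 7, 23, tzinfo=timezone.utc)
warning = check_reliability(datetime(2010, 1, 1, tzinfo=timezone.utc), now)
```

The display device would show the returned string on the second display device alongside the target data.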
  • The image display device may include target list display means for displaying, when a plurality of pieces of target data are acquired by the target data acquisition means, the plurality of pieces of target data in list or thumbnail form on the first display device. The target data display means may then display, on the second display device, the piece of target data that the user selects from among those displayed by the target list display means. In this case, when a plurality of pieces of image data exist for the designated point, the user of the image display device can freely select which image data to display on the second display device.
  • An image display device according to another aspect includes: map data acquisition means for acquiring map data by referring to a storage device that stores the map data; reliability information acquisition means for acquiring, by referring to the storage device, which stores image data, position information indicating the acquisition position of the image data, and reliability information indicating the temporal reliability of the image data in association with each other, the reliability information associated with the position information included in the display range of the map data; composite image generation means for generating, based on the map data acquired by the map data acquisition means and the reliability information acquired by the reliability information acquisition means, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and composite image display means for displaying the composite image generated by the composite image generation means on a display device.
  • the map data and the reliability information of the image data associated with the position information included in the display range of the map data are acquired from the storage device.
  • a composite image is generated based on the acquired map data and reliability information and displayed on the display device.
  • the composite image is an image in which a special display indicating temporal reliability of the image data is added to the map data. Therefore, the user can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the composite image displayed on the display device.
  • An image display program according to another aspect causes a computer to execute: a map data acquisition step of acquiring map data by referring to a storage device that stores the map data; a reliability information acquisition step of acquiring, by referring to the storage device, which stores image data, position information indicating the acquisition position of the image data, and reliability information indicating the temporal reliability of the image data in association with each other, the reliability information associated with the position information included in the display range of the map data; a composite image generation step of generating, based on the map data acquired in the map data acquisition step and the reliability information acquired in the reliability information acquisition step, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data; and a composite image display step of displaying the generated composite image on a display device.
  • the map data and the reliability information of the image data associated with the position information included in the display range of the map data are acquired from the storage device.
  • a composite image is generated based on the acquired map data and reliability information and displayed on the display device.
  • the composite image is an image in which a special display indicating temporal reliability of the image data is added to the map data. Therefore, the user can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the composite image displayed on the display device.
  • An image display method according to another aspect is executed in an image providing system in which an image acquisition device that acquires image data, a server that manages the image data, and an image display device that displays the image data are connected via a network. The method includes: an image data acquisition step of acquiring the image data; a position information acquisition step of acquiring position information indicating the acquisition position of the image data; a reliability information generation step of generating, based on the timing at which the image data was acquired, reliability information indicating the temporal reliability of the image data; and a composite image generation step of generating, based on map data and the reliability information associated with the position information included in the display range of the map data, a composite image in which a special display indicating the temporal reliability of the image data is added to the map data.
  • FIG. 1 is an overall configuration diagram of the image providing system 1.
  • A block diagram showing the electrical configuration of the image management server 2.
  • A diagram showing the data configuration of the image management database 70.
  • A diagram showing the data configuration of the position management database 80.
  • A perspective view of the user terminal 5 as seen obliquely from above.
  • A block diagram showing the electrical configuration of the HMD 200.
  • A block diagram showing the electrical configuration of the HMD control terminal 100.
  • A flowchart showing the upload process of the fixed point camera 10; a flowchart showing the upload process of the user terminal 5.
  • A flowchart showing the DB update process of the image management server 2; a flowchart showing the DB update process of the position management server 3.
  • A flowchart showing the image providing process of the image management server 2.
  • A flowchart showing the image display process of the user terminal 5; a diagram showing a specific example of the map data; a diagram showing a specific example of the heat map; a flowchart showing the data display process.
  • The image providing system 1 is a system that provides a user, via a network, with image data taken at a point designated on a map image (hereinafter referred to as a designated point).
  • In the image providing system 1, an image management server 2, a position management server 3, a plurality of user terminals 5, and a plurality of fixed point cameras 10 are connected via a network 4.
  • the image management server 2 is a computer that manages image data taken in various places, and is connected to the network 4 by wire.
  • the location management server 3 is a computer that manages the current location of the user terminal 5, and is connected to the network 4 by wire.
  • the network 4 is the Internet that can be connected via a public network.
  • the user terminal 5 is a small and lightweight terminal device carried by a user who uses the image providing system 1, and is connected to the network 4 wirelessly.
  • the user terminal 5 acquires the image data of the point designated by the user on the map image from the image management server 2 or the fixed point camera 10 and displays it.
  • The user terminal 5 captures images at its current position at predetermined time intervals and transmits the captured image data to the image management server 2. The image data captured by the user terminal 5 therefore shows the cityscape, landscape, and the like in the vicinity of the terminal's current position.
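One iteration of this upload behavior can be sketched as follows; `camera`, `gps`, and `send` are hypothetical stand-ins for the terminal's hardware and network interfaces, and the JSON record layout is an assumption rather than the embodiment's wire format:

```python
import json
from datetime import datetime, timezone

def capture_and_upload(camera, gps, send, now=None):
    """One cycle of the user terminal's periodic upload: capture an
    image of the surroundings, read the current position and camera
    direction, time-stamp the data as reliability information, and
    transmit it to the image management server."""
    image = camera()
    lat, lon, direction = gps()
    record = {
        "image": image,
        "position": {"lat": lat, "lon": lon, "direction": direction},
        "timestamp": (now or datetime.now(timezone.utc)).isoformat(),
    }
    send(json.dumps(record))
    return record

sent = []
record = capture_and_upload(
    camera=lambda: "jpeg-bytes",
    gps=lambda: (35.1815, 136.9066, 90.0),
    send=sent.append,
    now=datetime(2011, 7, 23, tzinfo=timezone.utc),
)
```

In the real terminal this function would run on a timer at the predetermined interval.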
  • the fixed point camera 10 is a camera installed in various places, and is connected to the network 4 by wire.
  • the fixed point camera 10 transmits the captured image data to the image management server 2 and the user terminal 5.
  • The fixed point cameras 10 of the present embodiment are installed at main intersections on street corners and constantly shoot in the four directions (east, west, south, and north) of the intersection. The image data photographed by a fixed point camera 10 therefore shows the road traffic conditions, the flow of pedestrians, and the like near the intersection where that camera is installed.
  • the image management server 2 is a general-purpose server, and includes a CPU 20, a ROM 21, a RAM 22, an HDD 23, a communication device 24, and an I / O interface 29.
  • the ROM 21, RAM 22, and I / O interface 29 are each connected to the CPU 20.
  • the HDD 23 and the communication device 24 are connected to the I / O interface 29.
  • the communication device 24 is a controller that controls data communication via the network 4.
  • the HDD 23 is a large-capacity hard disk drive, and is provided with a program storage area 231, a map data storage area 232, an image data storage area 233, and the like.
  • The program storage area 231 stores various programs for operating the image management server 2, including programs for executing the DB update process (see FIG. 11) and the image providing process (see FIG. 13) described later.
  • the map data storage area 232 stores map data indicating a map image for the user to specify a point.
  • a plurality of map data are prepared in advance so that at least the entire map of Japan and various places can be displayed at a plurality of scales.
  • The CPU 20 can identify the position information (latitude and longitude) of the actual point corresponding to any display position on a map image displayed based on the map data.
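Assuming, purely for illustration, an equirectangular map tile whose corner coordinates are known (the embodiment does not specify the map projection), the conversion from a display position to the latitude and longitude of the actual point could look like this:

```python
def display_pos_to_latlon(x, y, width, height, north, south, west, east):
    """Convert a display position (pixel x, y; origin at the top-left
    of a width x height map image) into the latitude/longitude of the
    corresponding actual point, by linear interpolation between the
    tile's corner coordinates."""
    lon = west + (east - west) * (x / width)
    lat = north - (north - south) * (y / height)
    return lat, lon

# The center pixel of a 640x480 tile maps to the tile's center point:
lat, lon = display_pos_to_latlon(320, 240, 640, 480,
                                 north=36.0, south=35.0,
                                 west=136.0, east=137.0)
```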
  • The image data storage area 233 stores image data obtained by photographing various locations.
  • the image data storage area 233 is provided with an image management database 70 for managing image data.
  • The image management database 70 stores, for each piece of image data, a record containing the image data, position information indicating its acquisition position (that is, the shooting point), and a time stamp indicating its acquisition date and time.
  • the position information indicates the latitude and longitude of the shooting point and the camera direction at the time of shooting.
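The record layout described above might be sketched as a relational table like the following; the table and column names, and the use of SQLite, are assumptions for illustration, not taken from the patent:

```python
import sqlite3

# Minimal sketch of the image management database 70: one record per
# piece of image data, holding a reference to the data, its acquisition
# position (latitude, longitude, and camera direction at the time of
# shooting) and the time stamp used as reliability information.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE image_management (
        image_id    INTEGER PRIMARY KEY,
        image_path  TEXT NOT NULL,   -- reference to the stored image data
        latitude    REAL NOT NULL,
        longitude   REAL NOT NULL,
        direction   REAL NOT NULL,   -- camera direction at time of shooting
        acquired_at TEXT NOT NULL    -- ISO-8601 time stamp
    )
""")
conn.execute(
    "INSERT INTO image_management"
    " (image_path, latitude, longitude, direction, acquired_at)"
    " VALUES (?, ?, ?, ?, ?)",
    ("images/0001.jpg", 35.1815, 136.9066, 90.0, "2011-07-23T10:00:00Z"),
)
row = conn.execute(
    "SELECT latitude, longitude, acquired_at FROM image_management"
).fetchone()
```

The image providing process would then select records whose latitude/longitude fall inside the display range of the requested map data.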
  • the position management server 3 is a general-purpose server, and includes a CPU, a ROM, a RAM, an HDD, a communication device, and an I / O interface (not shown), like the image management server 2 (FIG. 3).
  • the HDD of the location management server 3 is provided with a current location storage area (not shown) in addition to the program storage area. In this program storage area, various programs for operating the location management server 3 are stored.
  • The current position storage area stores position information indicating the current position of each user terminal 5 (hereinafter referred to as current position data). A position management database 80 for managing the current position data is provided in this area. As shown in FIG. 5, the position management database 80 holds, for each user terminal 5, a record containing the terminal ID unique to that terminal and its current position. The current position is the latest position information (latitude, longitude, and direction) acquired from the user terminal 5.
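A minimal sketch of this one-latest-record-per-terminal layout; the names are assumptions, and `INSERT OR REPLACE` merely stands in for however the DB update process of the position management server 3 actually overwrites the record:

```python
import sqlite3

# Sketch of the position management database 80: exactly one record per
# user terminal, overwritten with the latest position information.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE position_management (
        terminal_id TEXT PRIMARY KEY,
        latitude    REAL NOT NULL,
        longitude   REAL NOT NULL,
        direction   REAL NOT NULL
    )
""")

def update_position(terminal_id, lat, lon, direction):
    """Keep only the latest position fix for each terminal: a new row
    with the same terminal_id replaces the previous one."""
    conn.execute(
        "INSERT OR REPLACE INTO position_management VALUES (?, ?, ?, ?)",
        (terminal_id, lat, lon, direction),
    )

update_position("HMD-001", 35.0, 135.0, 0.0)
update_position("HMD-001", 35.1, 135.1, 90.0)  # newer fix replaces the old one
rows = conn.execute("SELECT * FROM position_management").fetchall()
```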
  • the fixed-point camera 10 includes a camera unit that captures an installation point, and a controller that controls the camera unit and the like.
  • the controller includes a CPU, a ROM, a RAM, and the like, and a control program is stored in the ROM. This control program causes the CPU to execute an image upload process (see FIG. 9) described later.
  • the ROM stores a terminal ID unique to the fixed point camera 10, position information of the fixed point camera 10, and the like.
  • the user terminal 5 of the present embodiment includes a head mounted display (hereinafter referred to as HMD) 200 and an HMD control terminal 100.
  • the HMD 200 is a glasses-type display device that a user can wear on the head and view an image.
  • the HMD control terminal 100 is a small and lightweight electronic device connected to the HMD 200 via a cable 190, and controls display of the HMD 200 and the like.
  • The HMD 200 of this embodiment is a so-called retinal scanning display. That is, the HMD 200 scans laser light (hereinafter referred to as video light A) modulated in accordance with a signal of the image data to be visually recognized by the user (hereinafter referred to as an image signal), and emits it onto the retina of at least one of the user's eyes. The HMD 200 can thus project an image directly onto the user's retina and allow the user to visually recognize the image.
  • the HMD 200 includes at least an emission device 210, a half mirror 250, and a head mounting portion 220.
  • the head mounting unit 220 supports the emission device 210 and fixes the HMD 200 to the user's head.
  • the half mirror 250 is disposed at the light emission port of the emission device 210.
  • the emission device 210 emits video light A corresponding to the image signal to the half mirror 250.
  • the half mirror 250 is in a fixed position with respect to the emission device 210.
  • the half mirror 250 reflects the image light A emitted from the emission device 210 toward the eyes of the user.
  • the half mirror 250 is formed by evaporating a metal thin film on a transparent resin plate so as to have a predetermined reflectance (for example, 50%). Therefore, the half mirror 250 transmits part of the external light B from the outside and guides it to the user's eyes.
  • The half mirror 250 thus causes both the video light A emitted from the emission device 210 and the external light B from the outside to enter the user's eyes. As a result, the user can simultaneously view the actual field of view and the image based on the video light A.
  • the electrical configuration of the HMD 200 will be described with reference to FIG.
  • the HMD 200 includes a display unit 240, a device connection interface 243, a flash memory 249, a control unit 246, a camera 207, and a power supply unit 247.
  • the control unit 246 includes at least a CPU 261, a ROM 262, and a RAM 248, and controls the entire HMD 200.
  • the control unit 246 executes various processes described below when the CPU 261 reads out various programs stored in the ROM 262.
  • the display unit 240 allows the user to visually recognize the image.
  • the display unit 240 includes an image signal processing unit 270, a laser group 272, and a laser driver group 271.
  • the image signal processing unit 270 receives an image signal from the control unit 246.
  • the image signal processing unit 270 converts the received image signal into signals necessary for direct projection onto the user's retina.
  • the laser group 272 includes a blue output laser (B laser) 721, a green output laser (G laser) 722, and a red output laser (R laser) 723.
  • the laser group 272 outputs blue, green, and red laser beams.
  • the laser driver group 271 performs control for outputting laser light from the laser group 272.
  • the image signal processing unit 270 is electrically connected to the laser driver group 271.
  • the B laser 721, the G laser 722, and the R laser 723 are collectively referred to as lasers.
  • the display unit 240 includes a vertical scanning mirror 812, a vertical scanning control circuit 811, a horizontal scanning mirror 792, and a horizontal scanning control circuit 791.
  • the vertical scanning mirror 812 performs scanning by reflecting the laser beam output from the laser in the vertical direction.
  • the vertical scanning control circuit 811 performs drive control of the vertical scanning mirror 812.
  • the horizontal scanning mirror 792 performs scanning by reflecting the laser beam output from the laser in the horizontal direction.
  • the horizontal scanning control circuit 791 performs drive control of the horizontal scanning mirror 792.
  • the image signal processing unit 270 is electrically connected to the vertical scanning control circuit 811 and the horizontal scanning control circuit 791.
  • the image signal processing unit 270 is electrically connected to the control unit 246 via a bus. Therefore, the image signal processing unit 270 can cause each laser to output laser light with a color and timing according to the image signal received from the control unit 246. Further, the image signal processing unit 270 can reflect the laser light in a direction corresponding to the image signal received from the control unit 246. As a result, the HMD 200 can scan a light beam corresponding to the image signal in a two-dimensional direction, and guide the scanned light to the user's eyes to form a display image on the retina.
  • the camera control unit 299 includes a camera 207 and a camera control circuit 208.
  • the camera 207 captures the same direction as the user's line of sight (see FIG. 6).
  • a camera control circuit 208 controls the camera 207.
  • the camera control circuit 208 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can cause the camera 207 to perform shooting or acquire image data shot by the camera 207.
  • the GPS control unit 289 includes a GPS receiver 288 and a GPS control circuit 287.
  • the GPS receiver 288 acquires the latitude and longitude indicating the current position of the HMD 200.
  • the GPS control circuit 287 controls the GPS receiver 288.
  • the GPS control circuit 287 is electrically connected to the control unit 246 through a bus. Therefore, the control unit 246 can acquire the latitude and longitude of the current position from the GPS receiver 288.
  • the accelerometer control unit 239 includes an accelerometer 238 and an accelerometer control circuit 237.
  • the accelerometer 238 acquires the direction in which the user wearing the HMD 200 is traveling (that is, the shooting direction of the camera 207).
  • the accelerometer control circuit 237 controls the accelerometer 238.
  • the accelerometer control circuit 237 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can acquire the direction of the HMD 200 (that is, the shooting direction of the camera 207) from the accelerometer 238.
  • the device connection interface 243 is a controller that controls data communication via the cable 190.
  • the device connection interface 243 is electrically connected to the control unit 246 via a bus. Therefore, the control unit 246 can receive image data from the HMD control terminal 100 and allow the user to visually recognize the image data. In addition, the control unit 246 can transmit image data captured by the camera 207 and position information at the time of capturing to the HMD control terminal 100.
  • the power supply unit 247 includes a battery 259 and a charge control circuit 260.
  • the battery 259 is a power source that drives the HMD 200.
  • the charge control circuit 260 supplies the power of the battery 259 to the HMD 200.
  • the flash memory 249, the video RAM 244, and the font ROM 245 are electrically connected to the control unit 246 via a bus.
  • the control unit 246 can appropriately refer to information stored in the flash memory 249, the video RAM 244, and the font ROM 245.
  • the HMD control terminal 100 is an external device that supplies image data to the HMD 200.
  • the HMD control terminal 100 includes a CPU 120, ROM 121, RAM 122, HDD 123, communication device 124, operation key 125, device connection interface 126, and I / O interface 129.
  • the ROM 121, RAM 122, and I / O interface 129 are each connected to the CPU 120.
  • the HDD 123, the communication device 124, the operation keys 125, and the device connection interface 126 are connected to the I / O interface 129.
  • the communication device 124 is a controller that controls data communication via the network 4.
  • the device connection interface 126 is a controller that controls data communication via the cable 190.
  • the HDD 123 is a large-capacity hard disk drive, and stores various programs for operating the HMD control terminal 100 and image data to be displayed on the HMD 200.
  • the HDD 123 also stores programs for executing an upload process (see FIG. 10) and an image display process (see FIG. 14) described later.
  • Processing executed in the image providing system 1 will be described with reference to FIGS. More specifically, the processes executed by the image management server 2, the position management server 3, the user terminal 5, and the fixed point camera 10 will be described.
  • the upload process executed by the fixed point camera 10 will be described.
  • when the fixed point camera 10 is powered on, its CPU repeatedly executes the upload process (FIG. 9) based on the control program stored in the ROM of the fixed point camera 10.
  • when it is a predetermined timing (S1: YES), image data indicating the video currently being shot is acquired from the camera unit that is always shooting the installation point (S3).
  • the current date and time indicated by a timer (not shown) built in the fixed point camera 10 is acquired as a time stamp (S5).
  • the positional information (that is, the latitude, longitude, and direction of the fixed point camera 10) stored in the ROM of the fixed point camera 10 is acquired (S7).
  • next, data upload to the image management server 2 is executed (S9). Specifically, a data file including the image data acquired in step S3, the time stamp acquired in step S5, and the position information acquired in step S7 is sent to the image management server 2 via the network 4. After the execution of step S9, or when it is not a predetermined timing (S1: NO), the process returns to step S1. With the above processing, the fixed point camera 10 uploads image data showing the current state of the installation point to the image management server 2 at 10 minute intervals.
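The upload cycle above (S1 to S9) can be sketched as follows. The dict-based "data file" and its field names are illustrative assumptions; the text specifies only that the file bundles the image data, the time stamp, and the position information.

```python
# Sketch of the fixed point camera's upload cycle (S1 to S9).
# The payload layout is an assumption, not the patent's wire format.

UPLOAD_INTERVAL_SEC = 10 * 60  # data is uploaded at 10 minute intervals


def is_upload_timing(last_upload_sec, now_sec, interval=UPLOAD_INTERVAL_SEC):
    """S1: the predetermined timing is reached every 10 minutes."""
    return (now_sec - last_upload_sec) >= interval


def build_upload_payload(image_data, timestamp, latitude, longitude, direction):
    """S3 to S7: bundle the image data, time stamp, and position
    information into one data file sent to the image management server 2 (S9)."""
    return {
        "image": image_data,        # S3: video currently being shot
        "timestamp": timestamp,     # S5: acquisition date and time
        "position": {               # S7: fixed position stored in ROM
            "latitude": latitude,
            "longitude": longitude,
            "direction": direction,
        },
    }
```

The same payload shape is reused below for the user terminal, whose only difference at this stage is that the position is measured rather than read from ROM.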
  • the upload process executed on the user terminal 5 will be described.
  • when the user terminal 5 (that is, the HMD control terminal 100 and the HMD 200) is powered on, the CPU 120 repeatedly executes this process based on a program stored in the HDD 123.
  • in the upload process of the user terminal 5, processes similar to steps S1 to S9 in FIG. 9 are executed (S11 to S19). Specifically, every time 10 minutes have passed since the previous data upload (step S19 described later), it is determined that the predetermined timing is reached (S11: YES). In step S11, it may also be determined that the predetermined timing is reached when a condition other than the elapsed time is satisfied (for example, when the user performs imaging with the camera 207 via the operation keys 125). In this case, the camera 207 of the HMD 200 photographs the current position, and the image data captured by the camera 207 is acquired from the HMD 200 (S13). The current date and time indicated by a timer (not shown) built in the HMD control terminal 100 is acquired as a time stamp (S15).
  • the GPS receiver 288 is caused to measure the latitude and longitude of the current position.
  • the accelerometer 238 is caused to measure the direction of the user terminal 5.
  • the latitude, longitude, and direction measured in this way are acquired from the HMD 200 as position information of the current position (S17).
  • a data file including the information acquired in steps S13, S15, and S17 is transmitted to the image management server 2 via the network 4 (S19).
  • a current position notification is transmitted to the position management server 3 via the network 4 (S21).
  • the current position notification includes the terminal ID unique to the user terminal 5 and the position information acquired in step S17.
  • the process returns to step S11.
  • the user terminal 5 uploads the image data showing the current position of the user terminal 5 to the image management server 2 at intervals of 10 minutes. Further, the current position of the user terminal 5 is notified to the position management server 3 at intervals of 10 minutes.
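The current position notification of S21 can be sketched as below. The dict layout is an assumption; the text states only that the notification carries the terminal ID unique to the user terminal 5 and the position information acquired in step S17.

```python
# Sketch of the current position notification sent to the position
# management server 3 (S21). Field names are illustrative.

def build_position_notification(terminal_id, latitude, longitude, direction):
    return {
        "terminal_id": terminal_id,  # unique to the user terminal 5
        "position": {
            "latitude": latitude,    # measured by the GPS receiver 288
            "longitude": longitude,
            "direction": direction,  # measured by the accelerometer 238
        },
    }
```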
  • the DB update process executed by the image management server 2 will be described.
  • when the image management server 2 is powered on, the CPU 20 repeatedly executes this process based on a program stored in the HDD 33.
  • through the DB update process (S31 to S35), the image management server 2 can collect image data of various locations from the fixed point camera 10 and the user terminal 5.
  • a record including the image data “camera_a.jpg” is registered in the image management database 70 based on the data file uploaded from the fixed point camera 10 “camera_a”.
  • in this record, position information indicating the installation position of the fixed point camera 10 “camera_a” and a time stamp indicating the acquisition date and time of the image data “camera_a.jpg” are set.
  • a record including the image data “hmd001_a.jpg” is registered in the image management database 70 based on the data file uploaded from the user terminal 5 “hmd001” worn by the user A (see FIG. 2).
  • in this record, position information indicating the position of the user terminal 5 “hmd001” at the time of acquisition of the image data “hmd001_a.jpg” and a time stamp indicating the acquisition date and time are set.
  • the DB update process executed by the location management server 3 will be described.
  • when the position management server 3 is powered on, its CPU repeatedly executes this process based on a program stored in the HDD.
  • the position management server 3 determines whether or not there is a current position notification (S41). Specifically, when a current position notification is received from the user terminal 5, it is determined that there is a current position notification (S41: YES). In this case, the location management database 80 is updated based on the received current position notification (S43). After execution of step S43, or when there is no current position notification (S41: NO), the process returns to step S41 and waits for reception of a current position notification. With the above processing, the position management server 3 can always grasp the installation position of the fixed point camera 10 fixed at each place and the current position of the user terminal 5, which can move from place to place.
  • for example, a record including the terminal ID “hmd001” is registered in the location management database 80 based on the current position notification received from the user terminal 5 “hmd001” worn by the user A (see FIG. 2).
  • position information indicating the current position of the user terminal 5 “hmd001” is set.
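The two databases can be sketched in memory as below. The record fields follow the description above (image data, position information, time stamp; terminal ID, position information), but the concrete storage layout is an assumption.

```python
# In-memory sketch of the two server-side databases (S31 to S43).

image_management_db = []      # image management database 70
location_management_db = {}   # location management database 80


def register_image(image_name, position, timestamp):
    """S35: store one uploaded data file as a record in database 70."""
    image_management_db.append(
        {"image": image_name, "position": position, "timestamp": timestamp}
    )


def update_location(terminal_id, position):
    """S43: overwrite the terminal's current position in database 80,
    so the server always knows where each movable user terminal 5 is."""
    location_management_db[terminal_id] = position
```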
  • the image providing process executed by the image management server 2 will be described.
  • the CPU 20 When the image management server 2 is powered on, the CPU 20 repeatedly executes this process based on a program stored in the HDD 33.
  • in the image providing process, it is first determined whether a map request has been received (S51). The map request is a signal by which the user terminal 5 requests map data from the image management server 2, and includes the display range of the map image designated by the user (hereinafter referred to as the map display range).
  • when a map request is received (S51: YES), map data indicating the map display range included in the map request is acquired from the map data storage area 232 (S53).
  • the time stamp corresponding to the map display range of the map data acquired in step S53 is acquired from the image management database 70 (S55). Specifically, all image data having latitude and longitude included in the map display range of the map data is searched from the image management database 70. A time stamp associated with the image data hit in the search is acquired from the image management database 70.
  • map drawing information is transmitted to the requesting user terminal 5 via the network 4 (S57). The map drawing information is information used to display a map image on the user terminal 5, and includes the map data acquired in step S53 and the time stamp acquired in step S55.
  • for example, time stamps are acquired from all records including position information matching the latitude and longitude (S55).
  • the time stamp “100707120230” is acquired.
  • the map data corresponding to the map display range is transmitted to the requesting user terminal 5 together with the time stamp indicating the image acquisition date and time at each point in the map display range.
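The search in S55 (all records whose latitude and longitude fall inside the map display range) can be sketched against the record layout assumed above; the rectangular range parameters are illustrative.

```python
# Sketch of S55: collect the time stamps of all image data whose
# position falls inside the map display range. The range is modelled
# as a latitude/longitude rectangle, which is an assumption.

def timestamps_in_range(db, lat_min, lat_max, lon_min, lon_max):
    return [
        rec["timestamp"]
        for rec in db
        if lat_min <= rec["position"]["latitude"] <= lat_max
        and lon_min <= rec["position"]["longitude"] <= lon_max
    ]
```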
  • after execution of step S57, it is determined whether an image request has been received (S59).
  • the image request is a signal for the user terminal 5 to request image data from the image management server 2 and includes position information of a point designated by the user (hereinafter referred to as a designated point).
  • the image data associated with the position information of the designated point is specified with reference to the image management database 70 (S61).
  • a list of image data which is a list of image data specified in step S61, is transmitted to the requesting user terminal 5 via the network 4 (S63).
  • specifically, image data is identified from all records including position information that matches the latitude, longitude, and direction of the designated point (S61).
  • the image data “camera_a.jpg” is specified.
  • in step S63, thumbnail images obtained by reducing each of these image data and an image data list indicating their acquisition dates and times are transmitted to the requesting user terminal 5.
  • after execution of step S63, it is determined whether an image selection instruction has been received (S65).
  • the image selection instruction includes identification information of image data selected by the user from the image data list. Based on the identification information included in the image selection instruction, the CPU 20 can identify the image data selected by the user from among the image data identified in step S61.
  • when an image selection instruction is received (S65: YES), the time stamp of the image data selected by the user is specified with reference to the image management database 70. If the elapsed time from the identified time stamp to the current date and time is within a predetermined value (in this embodiment, less than 10 minutes), it is determined that the image data is a current image (S67: YES).
  • an image provision instruction is transmitted to the designated point via the network 4 (S69).
  • the image providing instruction is a signal instructing the image acquisition device existing at the designated point to provide the requesting user terminal 5 with the image data it is currently capturing (in this embodiment, video data, that is, a moving image that can be played back in real time).
  • the CPU 20 inquires of the position management server 3 about the image acquisition device existing at the designated point. In response to an inquiry from the image management server 2, the CPU of the position management server 3 refers to the position management database 80 to identify the image acquisition device corresponding to the designated point and notifies the image management server 2 of the image acquisition device.
  • the CPU 20 transmits a request for transmitting video data in real time to the requesting user terminal 5 to the image acquisition apparatus notified from the position management server 3.
  • the image acquisition device is identified from the record including this position information (S67: YES, S69).
  • the fixed point camera 10 “camera_a” is specified as the image acquisition device.
  • a request is transmitted to the fixed point camera 10 “camera_a”, thereby providing the requesting user terminal 5 with the video data of the fixed point camera 10 “camera_a” in real time.
  • on the other hand, when it is determined that the image data is not a current image (S67: NO), the image data selected by the user is read from the image management database 70 and transmitted to the requesting user terminal 5 via the network 4 (S71).
  • for example, when the image data indicated by the image selection instruction is “hmd_001.jpg”, the time difference between the time stamp “100707080513” and the current date and time is 10 minutes or more (S67: NO). Therefore, in step S71, the image data “hmd_001.jpg” is read from the image management database 70 and transmitted to the requesting user terminal 5.
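The freshness check of S67 can be sketched as below. The time stamp "100707120230" appears to use a yyMMddHHmmss layout (2010-07-07 12:02:30); that layout is an assumption inferred from the examples, not stated explicitly.

```python
from datetime import datetime, timedelta

# Sketch of S67: an image counts as "current" when less than 10 minutes
# have elapsed since its time stamp. The yyMMddHHmmss layout is assumed.


def parse_timestamp(stamp):
    """Parse a stamp such as "100707120230" (assumed 2010-07-07 12:02:30)."""
    return datetime.strptime(stamp, "%y%m%d%H%M%S")


def is_current_image(stamp, now, limit=timedelta(minutes=10)):
    return (now - parse_timestamp(stamp)) < limit
```

With the example values above, "100707120230" at 12:05 on the same day would be current (S67: YES), while "100707080513" would not (S67: NO).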
  • after execution of step S69 or step S71, the process returns to step S51.
  • when a map request is not received (S51: NO), an image request is not received (S59: NO), or an image selection instruction is not received (S65: NO), the process returns to step S51.
  • the image management server 2 can provide map data and image data of a designated point in response to a request from the user terminal 5.
  • the video data can be provided to the user terminal 5 in real time from the image acquisition device at the designated point.
  • the HDD 123 stores an application program that displays the image data, map data, and the like provided from the image management server 2.
  • the image display process is repeatedly executed by the CPU 120 when the application program is activated.
  • first, it is determined whether there is a map display operation (S101).
  • the map display operation is an operation in which the user wearing the HMD 200 uses the operation keys 125 (see FIG. 8) to move, enlarge, or reduce the map image (more precisely, the heat map described later) projected onto the eyes.
  • when there is a map display operation (S101: YES), a map request is transmitted to the image management server 2 via the network 4 (S103).
  • when map drawing information is received (S105: YES), a heat map is created (S107).
  • the heat map is an image in which a special display indicating the temporal reliability of the image data is added to the map data. Temporal reliability here means whether the elapsed time since the acquisition of the image data is small (that is, whether the image data is new).
  • a map request including this map display range is transmitted (S103).
  • the map drawing information including the map data and the time stamp is returned from the image management server 2.
  • the CPU 120 creates a heat map based on the map drawing information received from the image management server 2 (S107).
  • the map data 300 included in the map drawing information is a map image showing a map display range.
  • the display area for each place included in the map display range is color-coded according to the time stamp of each point included in the map drawing information.
  • a heat map is created that indicates the newness of the update date and time of image data at each point included in the map display range in a color-coded manner. In this heat map, a point where the time stamp is new is colored in a warm color, while a point where the time stamp is old is colored in a cold color.
  • the display area of the point corresponding to the time stamp is colored in “red”.
  • the display area of the point corresponding to the time stamp is colored “pink”.
  • the display area of the point corresponding to the time stamp is colored “yellow”.
  • the display area of the point corresponding to the time stamp is colored “colorless”. Since a point where no time stamp exists has no image data obtained by photographing it, the display area of such a point is colored “black”.
  • the user can easily visually determine whether or not the image data of each place included in the map display range is new with reference to the heat map 310. In other words, the user can visually discriminate between a point where the image data acquisition date is new and a point where the image data acquisition date is old with reference to the heat map 310.
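The color assignment of S107 can be sketched as below. The text names the colors (red, pink, yellow, colorless, black) and states that newer stamps get warmer colors, but the concrete age thresholds were elided; the thresholds used here are assumptions for illustration only.

```python
from datetime import datetime, timedelta

# Sketch of the heat map coloring (S107). Only the color names and the
# warm-to-cold ordering come from the text; the thresholds are assumed.


def heat_color(timestamp, now):
    if timestamp is None:
        return "black"            # no image data exists for the point
    age = now - timestamp
    if age < timedelta(minutes=10):
        return "red"
    if age < timedelta(hours=1):
        return "pink"
    if age < timedelta(days=1):
        return "yellow"
    return "colorless"
```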
  • the heat map created in step S107 is displayed on the HMD 200 (S109). Specifically, the heat map is displayed on the eyes of the user wearing the HMD 200.
  • after execution of step S109, it is determined whether or not there is an input of a designated point/direction (S111). For example, when the user performs a cursor operation with the operation keys 125 and designates a coordinate position and a shooting direction on the heat map, it is determined that there is an input of the designated point/direction (S111: YES).
  • an image request for requesting image data at the designated point is transmitted to the image management server 2 via the network 4 (S113).
  • after execution of step S113, it is determined whether an image data list has been received (S115). When the image data list is received (S115: YES), the image data list is displayed on the HMD 200 (S117).
  • An image request including the position information is transmitted (S113).
  • the image management server 2 returns a list of image data including the thumbnail image of the image data of the designated point (image data “camera_a.jpg” in the example of FIG. 4) and the acquisition date and time.
  • the CPU 120 displays the image data of the designated point in the form of thumbnails on the eyes of the user wearing the HMD 200 (S117).
  • after execution of step S117, it is determined whether image data has been selected (S119). For example, when the user performs a cursor operation with the operation keys 125 and selects a thumbnail image from the image data list, it is determined that image data has been selected (S119: YES). In this case, an image selection instruction including identification information of the selected image data is transmitted to the image management server 2 via the network 4 (S121). After execution of step S121, a data display process described later is executed (S123), and the process returns to step S101.
  • when there is no map display operation (S101: NO), when map drawing information is not received (S105: NO), when there is no input of a designated point/direction (S111: NO), when an image data list is not received (S115: NO), or when no image data is selected (S119: NO), the process returns to step S101.
  • the user terminal 5 can display a heat map indicating the temporal reliability of image data obtained by photographing each point on the map image in response to a user request.
  • as shown in FIG. 17, in the data display process (S123), it is first determined whether image data has been received (S151).
  • when image data is received from the image management server 2 (S151: YES), it is determined whether the image data is older than a predetermined time (S153).
  • when the image data is older than the predetermined time (S153: YES), a predetermined warning is displayed on the HMD 200 (S155). An example of the predetermined warning is “This image was acquired more than a month ago and may be different from the current landscape”.
  • after execution of step S155, or when the image data is not older than the predetermined time (S153: NO), the received image data is displayed on the HMD 200 (S157). If a warning was displayed in step S155, the image data is displayed together with the warning. On the other hand, if no image data has been received (S151: NO), it is determined whether video data has been received (S159). When video data is received (S159: YES), the received video data is streamed and played back by the HMD 200 (S161).
  • an image selection instruction including the identification information is transmitted (S121).
  • the image data “camera_a.jpg” is returned from the image management server 2.
  • the CPU 120 displays the image data “camera_a.jpg” received from the image management server 2 on the eyes of the user wearing the HMD 200 (S157).
  • an image selection instruction including the identification information is transmitted (S121).
  • in this case, real-time video data is transmitted from the user terminal 5 “hmd_001”, which is the image acquisition device present at the designated point.
  • the CPU 120 displays the video data received from the user terminal 5 “hmd_001” on the eyes of the user wearing the HMD 200 (S161).
  • after execution of step S157 or step S161, or when no video data is received (S159: NO), the data display process ends, and the process returns to the image display process (FIG. 14).
  • the user terminal 5 can acquire and display image data or video data of a designated point designated from the heat map from the image management server 2 or the fixed point camera 10.
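The branching of the data display process (S151 to S161) can be sketched as a small dispatcher. The action names and the 30-day threshold (matching the "more than a month ago" warning) are illustrative assumptions.

```python
from datetime import datetime, timedelta

# Sketch of the data display branch (S151 to S161): returns the list of
# UI actions the terminal would take. Action names are illustrative.

OLD_LIMIT = timedelta(days=30)  # assumed for "more than a month ago"


def display_actions(image_timestamp=None, now=None, video=False):
    actions = []
    if video:                                    # S159/S161: stream video
        actions.append("stream_video")
    elif image_timestamp is not None:
        if now - image_timestamp > OLD_LIMIT:    # S153/S155: warn if old
            actions.append("show_warning")
        actions.append("show_image")             # S157: display the image
    return actions
```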
  • the image data acquired by the fixed point camera 10 or the user terminal 5 is stored in the image management database 70 together with the position information and time stamp at the time of acquisition.
  • the map data stored in the map data storage area 232 is returned to the user terminal 5 in response to a request from the user terminal 5.
  • the time stamp associated with the position information included in the display range of the map data is also returned to the user terminal 5.
  • a heat map is generated based on the map data and time stamp acquired from the image management server 2, and displayed on the HMD 200.
  • a heat map is an image in which a special display indicating temporal reliability of image data is added to the map data. Therefore, the user of the user terminal 5 can easily grasp the reliability of the image data obtained by photographing each point on the map image by referring to the heat map displayed on the HMD 200.
  • the user of the user terminal 5 displays the image data of the designated point selected on the heat map on the HMD 200. Therefore, the user can browse the image data with the HMD 200 after confirming the temporal reliability of the image data at each point on the heat map. Further, when the image data of the designated point selected on the heat map is older than a predetermined time, a warning is displayed on the HMD 200. Therefore, the user can surely grasp that the temporal reliability of the image data at the designated point is low. In addition, when there are a plurality of image data at a designated point, the user can freely select image data to be displayed on the HMD 200 from the image data list.
  • the fixed point camera 10 and the user terminal 5 correspond to the “image acquisition device” of the present invention.
  • the image management server 2 corresponds to the “server” of the present invention.
  • the user terminal 5 corresponds to the “image display device” of the present invention.
  • the CPU of the fixed point camera 10 that executes step S3 and the CPU 120 that executes step S13 correspond to the “image data acquisition unit” of the present invention.
  • the CPU of the fixed point camera 10 that executes step S7 and the CPU 120 that executes step S17 correspond to the “positional information acquisition unit” of the present invention.
  • the CPU of the fixed point camera 10 that executes step S5 and the CPU 120 that executes step S15 correspond to the “reliability information generation unit” of the present invention.
  • the CPU of the fixed-point camera 10 that executes step S9 and the CPU 120 that executes step S19 correspond to the “information transmitting unit” of the present invention.
  • the CPU 20 executing step S35 corresponds to the “storage control means” of the present invention.
  • the CPU 20 that executes step S57 corresponds to the “information transmitting unit” of the present invention.
  • the CPU 120 executing steps S103 and S105 corresponds to the “information acquisition unit” of the present invention.
  • the CPU 120 that executes step S107 corresponds to the “composite image generation unit” of the present invention.
  • the CPU 120 executing step S109 corresponds to the “composite image display unit” of the present invention.
  • the CPU 120 that executes step S111 corresponds to the “position specifying means” of the present invention.
  • the CPU 120 that executes steps S151 and S159 corresponds to the “target data acquisition unit” of the present invention.
  • the CPU 120 that executes steps S157 and S161 corresponds to the “target data display unit” of the present invention.
  • the CPU 120 executing step S155 corresponds to the “warning display means” of the present invention.
  • the CPU 120 executing step S117 corresponds to the “target list display unit” of the present invention.
  • the CPU 120 that executes steps S103 and S105 corresponds to the “map data acquisition means” and “reliability information acquisition means” of the present invention.
  • Steps S103 and S105 correspond to the “map data acquisition step” and “reliability information acquisition step” of the present invention.
  • Step S107 corresponds to the “composite image generation step” of the present invention.
  • Step S109 corresponds to the “composite image display step” of the present invention.
  • the retinal scanning display is exemplified as the HMD 200, but the display method can be changed.
  • it may be a head mounted display of another display method such as a liquid crystal display or an organic EL (ElectroLuminescence) display.
  • a display device that can be carried by the user, such as a mobile phone, a notebook computer, or a PDA, may be used.
  • the time stamp indicating the acquisition date and time of the image data is exemplified as the reliability information indicating the temporal reliability of the image data.
  • the update frequency of the image data may be used as the reliability information.
  • the update frequency of the image data (for example, 10 minute interval, 1 hour interval, 1 day interval, etc.) may be managed instead of the time stamp.
  • in this case, a heat map is created in which the update frequency of the image data at each point included in the map display range (that is, the number of updates per unit time) is color-coded and shown stepwise. In this heat map, warm colors are used for points with a high update frequency, while cool colors are used for points with a low update frequency.
  • the user can easily visually determine whether or not the image data of each place included in the map display range is new with reference to the heat map. In other words, the user can visually discriminate between a point where the image data update frequency is high and a point where the image data update frequency is low with reference to the heat map.
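This update-frequency variant can be sketched by replacing the age-based coloring with frequency buckets. The text gives only example intervals (10 minutes, 1 hour, 1 day) and the warm-to-cold ordering; the bucket boundaries below are assumptions derived from those intervals.

```python
# Sketch of the update-frequency heat map variant. Buckets are assumed
# from the example intervals: 10 minute, hourly, and daily updates.

def frequency_color(updates_per_day):
    if updates_per_day >= 144:   # roughly 10 minute intervals
        return "red"
    if updates_per_day >= 24:    # roughly hourly updates
        return "pink"
    if updates_per_day >= 1:     # roughly daily updates
        return "yellow"
    return "colorless"
```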
  • a part of the image display process (FIG. 14) of the user terminal 5 may be executed by another computer.
  • for example, instead of transmitting the map drawing information (S57), the image management server 2 may execute the heat map creation (S107) itself. Then, the heat map created by the image management server 2 may be transmitted to the user terminal 5. In this case, the heat map can be displayed on the HMD 200 while the heat map creation process in the user terminal 5 is omitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Processing Or Creating Images (AREA)
  • Instructional Devices (AREA)
  • Navigation (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Disclosed is an image providing system in which an image acquisition device for acquiring image data, a server for managing the image data, and an image display device for displaying the image data are connected by a network. Based on the timing at which the image data is acquired, reliability information indicating the temporal reliability of the image data is generated. The image data, position information, and reliability information are stored on the server. Based on map data and the reliability information, the image display device displays a composite image in which a special image indicating the temporal reliability of the image data is added to the map data.
PCT/JP2011/066833 2010-07-30 2011-07-25 Système de fourniture d'images, dispositif d'affichage d'images, programme d'affichage d'images et procédé d'affichage d'images WO2012014837A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010172521A JP2012034216A (ja) 2010-07-30 2010-07-30 画像提供システム、画像表示装置、および画像表示プログラム
JP2010-172521 2010-07-30

Publications (1)

Publication Number Publication Date
WO2012014837A1 true WO2012014837A1 (fr) 2012-02-02

Family

ID=45530045

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/066833 WO2012014837A1 (fr) 2010-07-30 2011-07-25 Système de fourniture d'images, dispositif d'affichage d'images, programme d'affichage d'images et procédé d'affichage d'images

Country Status (2)

Country Link
JP (1) JP2012034216A (fr)
WO (1) WO2012014837A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016033611A (ja) * 2014-07-31 2016-03-10 セイコーエプソン株式会社 情報提供システム、表示装置、および、表示装置の制御方法
JP6424100B2 (ja) * 2015-01-29 2018-11-14 株式会社ゼンリンデータコム ナビゲーションシステム、ナビゲーション装置、グラス型デバイス及び装置間連携方法

Citations (5)

Publication number Priority date Publication date Assignee Title
JPH11143941A (ja) * 1997-11-06 1999-05-28 Hitachi Ltd 被害表示装置及びその処理プログラムを記録した媒体
JP2001336945A (ja) * 2000-05-26 2001-12-07 Alpine Electronics Inc ナビゲーション装置
JP2003299156A (ja) * 2002-04-05 2003-10-17 Matsushita Electric Ind Co Ltd 外部映像入手システムおよびそれに用いられる装置
JP2008170930A (ja) * 2006-12-12 2008-07-24 Asia Air Survey Co Ltd 地図情報関連付き画像データ表示システムおよび地図情報関連付き画像データ表示のプログラム
JP2009058922A (ja) * 2007-09-04 2009-03-19 Sony Corp 地図情報表示装置、地図情報表示方法、及びプログラム

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP4750014B2 (ja) * 2006-12-28 2011-08-17 シャープ株式会社 情報表示装置、情報提供サーバ、情報表示システム、情報表示装置の制御方法、情報提供サーバの制御方法、制御プログラム、および、記録媒体
JP5097507B2 (ja) * 2007-11-05 2012-12-12 オリンパスイメージング株式会社 情報処理装置、情報処理装置の制御プログラム

Also Published As

Publication number Publication date
JP2012034216A (ja) 2012-02-16

Similar Documents

Publication Publication Date Title
RU2670784C9 (ru) Ориентация и визуализация виртуального объекта
US10554829B2 (en) Information processing device, photographing device, image sharing system, and method of information processing
JP5423716B2 (ja) ヘッドマウントディスプレイ
US10924691B2 (en) Control device of movable type imaging device and control method of movable type imaging device
WO2020261927A1 (fr) Système de présentation, dispositif de présentation et procédé de présentation
JP5532026B2 (ja) 表示装置、表示方法及びプログラム
EP3766255B1 (fr) Procédé d'obtention d'informations concernant un luminaire
US9699366B2 (en) Image providing apparatus, image display device, imaging system, image display system, and image providing method in which composite image data is generated using auxiliary image data generated by at least one auxiliary imaging unit
CN104487982B (zh) 基于用户穿戴的可穿戴物的光学检测提供服务
US8903957B2 (en) Communication system, information terminal, communication method and recording medium
JP7079125B2 (ja) 空中写真撮影管理システム及びプログラム
WO2012014837A1 (fr) Système de fourniture d'images, dispositif d'affichage d'images, programme d'affichage d'images et procédé d'affichage d'images
JP2013168854A (ja) 撮影装置、サーバ装置及び管理システム
JP5007631B2 (ja) 電子カメラ
JP6677684B2 (ja) 映像配信システム
JP2013021473A (ja) 情報処理装置、情報取得方法およびコンピュータプログラム
CN108012141A (zh) 显示装置、显示系统和显示装置的控制方法
CN114584702A (zh) 一种拍摄可见光和热成像重叠图的方法及系统
JP7365783B2 (ja) 圃場情報管理システム、圃場情報管理システムの制御方法及び圃場情報管理システムの制御プログラム
JP6450890B2 (ja) 画像提供システム、画像提供方法、およびプログラム
CN111162840B (zh) 用于设置光通信装置周围的虚拟对象的方法和系统
JP2012191261A (ja) 情報提供装置、システム、方法及びプログラム
US20180067623A1 (en) Headset device and visual feedback method and apparatus thereof
US20240134195A1 (en) Electronic device and method for obtaining media corresponding to location by controlling camera based on location
US20240133996A1 (en) Identification of wearable device locations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11812427

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11812427

Country of ref document: EP

Kind code of ref document: A1