WO2008068990A1 - Image display system, display device, and display method - Google Patents
Image display system, display device, and display method
- Publication number
- WO2008068990A1 (PCT/JP2007/071491)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- imaging
- image data
- server device
- Prior art date
Classifications
- H04N23/00 — Cameras or camera modules comprising electronic image sensors; control thereof
- H04N23/50 — Constructional details of cameras or camera modules
- H04N23/63 — Control of cameras or camera modules by using electronic viewfinders
- H04N23/661 — Transmitting camera control signals through networks, e.g. control via the Internet
- H04N5/64 — Constructional details of television receivers, e.g. cabinets or dust covers
- H04N5/76 — Television signal recording
- H04N5/765 — Interface circuits between an apparatus for recording and another apparatus
- H04N5/782 — Television signal recording using magnetic recording on tape
- H04N9/8205 — Recording involving the multiplexing of an additional signal and the colour video signal
- G09G5/36 — Control arrangements for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G02B27/01 — Head-up displays
- G02B27/017 — Head-up displays, head mounted
- G02B2027/0178 — Head mounted displays of the eyeglass type
- G06F16/29 — Geographical information databases
- G06F16/48 — Retrieval of multimedia data characterised by using metadata
- G06F16/487 — Retrieval using metadata with geographical or spatial information, e.g. location
- G06F16/58 — Retrieval of still image data characterised by using metadata
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013 — Eye tracking input arrangements
Definitions
- Image display system, display device, and display method
- the present invention relates to an image display system, a display device, and a display method, and in particular to a technique for displaying, based on a position designation made on a map image on the display device side, an image captured by an external imaging device.
- Japanese Patent Application Laid-Open No. 2005-341604 describes an example of a data communication system.
- Japanese Patent Laid-Open No. 2005-337863 discloses a technique for updating a three-dimensional map of a car navigation system using an image captured by a camera.
- No. 79 discloses an attempt to extend and enjoy broadcast/recorded programs on the WWW (World Wide Web).
- an object of the present invention is to allow a user to see an image actually captured at a certain position by designating a certain position on a map image.
- the image display system of the present invention includes a display device, an image pickup device that is mounted on a moving body and picks up an image, and a server device.
- the display device and the image pickup device can each communicate with the server device.
- the image pickup apparatus includes image pickup means for picking up an image, position detection means for detecting position information, and control means for executing a transmission control process of transmitting, to the server device, image data obtained by the image pickup means together with additional data including at least the position information detected by the position detection means at the time of capturing the image data.
- the server device includes storage means for storing the image data and the additional data transmitted from the imaging device, and control means for executing a search/transmission control process of searching the storage means for image data based on the position designation information transmitted from the display device and transmitting the retrieved image data to the display device.
- the display device includes: display means for displaying an image; and control means for performing map display processing for displaying a map image on the display means, position designation processing for setting position designation information based on an input to the map image, image request transmission processing for requesting image data by transmitting the position designation information to the server device, and display processing for receiving the image data transmitted from the server device in response to the image request transmission processing and causing the display means to perform a display operation based on the received image data.
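- As a rough illustration (not part of the patent text), the display-side processing chain described above, map display, position designation, image request transmission, and display, can be sketched as follows. All names (`Server`, `DisplayDevice`, the sample coordinates and file name) are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical in-memory stand-in for the server device (70):
# it stores (latitude, longitude, image) records.
@dataclass
class Server:
    store: list = field(default_factory=list)

    def search(self, lat, lon):
        # Return the images whose stored imaging position matches
        # the position designation information.
        return [img for (la, lo, img) in self.store if (la, lo) == (lat, lon)]

# Hypothetical display device (1 or 40): performs position designation,
# image request transmission, and display processing.
class DisplayDevice:
    def __init__(self, server):
        self.server = server

    def on_map_click(self, lat, lon):
        # Position designation processing: the clicked map point
        # becomes the position designation information.
        return self.request_images(lat, lon)

    def request_images(self, lat, lon):
        # Image request transmission processing: send the position
        # designation information to the server and display what it returns.
        images = self.server.search(lat, lon)
        for img in images:
            self.display(img)
        return images

    def display(self, img):
        print("displaying", img)

server = Server(store=[(35.68, 139.76, "tokyo_station.jpg")])
device = DisplayDevice(server)
device.on_map_click(35.68, 139.76)  # prints: displaying tokyo_station.jpg
```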
- the moving body to which the imaging device is attached is any of a person, a living organism other than a person, a ground mobile device, a marine mobile device, an underwater mobile device, an aeronautical mobile device, or a space mobile device.
- the imaging apparatus further includes date and time detection means for detecting the current date and time, and the control means of the imaging apparatus includes, in the additional data, the date and time detected by the date and time detection means at the time of capturing the image data.
- the control means of the display device performs date and time designation processing for setting date and time designation information for designating a date and time, and in the image request transmission processing transmits the date and time designation information together with the position designation information.
- the control means of the server device, in the search/transmission control process, searches the storage means for image data based on the position designation information and the date/time designation information transmitted from the display device.
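- The position-plus-date/time search just described can be sketched as follows; the record layout and field names are hypothetical, and exact position matching is used only to keep the example short.

```python
from datetime import datetime, timedelta

# Hypothetical point-image-database records: each entry holds the
# image data plus its additional data (imaging position and date/time).
RECORDS = [
    {"pos": (35.68, 139.76), "dt": datetime(2007, 11, 5, 9, 0), "image": "morning.jpg"},
    {"pos": (35.68, 139.76), "dt": datetime(2007, 11, 5, 21, 0), "image": "night.jpg"},
    {"pos": (34.70, 135.50), "dt": datetime(2007, 11, 5, 9, 0), "image": "osaka.jpg"},
]

def search(records, pos, dt=None, window=timedelta(hours=1)):
    """Search stored image data by position designation information,
    optionally narrowed by date/time designation information."""
    hits = [r for r in records if r["pos"] == pos]
    if dt is not None:
        # Keep only images captured within the tolerance window
        # around the designated date/time.
        hits = [r for r in hits if abs(r["dt"] - dt) <= window]
    return [r["image"] for r in hits]

search(RECORDS, (35.68, 139.76))                                # ["morning.jpg", "night.jpg"]
search(RECORDS, (35.68, 139.76), datetime(2007, 11, 5, 9, 30))  # ["morning.jpg"]
```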
- the display device of the present invention includes: display means for displaying an image; communication means for performing data communication with an external server device; and control means for performing map display processing for displaying a map image on the display means, position designation processing for setting position designation information based on an input to the map image, image request transmission processing for requesting image data by transmitting the position designation information to the server device by the communication means, and display processing for receiving, by the communication means, the image data transmitted from the server device in response to the image request transmission processing and causing the display means to perform a display operation based on the received image data.
- the control means uses, as the position designation information, the latitude/longitude of the point designated by input on the map image in the position designation processing.
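- Since a designated map point will rarely coincide exactly with a stored imaging position, a practical search would treat latitude/longitude values as matching within some radius. The following is a minimal sketch using the haversine great-circle distance; the radius value is an arbitrary assumption, not taken from the patent.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(h))  # mean Earth radius ~6371 km

def near(designated, stored, radius_km=0.5):
    """Treat a stored imaging position as matching the designated
    latitude/longitude if it lies within radius_km of it."""
    return haversine_km(designated, stored) <= radius_km

near((35.6812, 139.7671), (35.6810, 139.7670))  # True: a few tens of metres apart
near((35.6812, 139.7671), (34.7025, 135.4959))  # False: Tokyo vs. Osaka
```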
- the control means further performs date and time designation processing for setting date and time designation information for designating a date and time, and in the image request transmission processing transmits the date and time designation information together with the position designation information to the server device.
- the display means is arranged to display an image by being disposed in front of the user's eyes.
- the display device further includes imaging means for capturing an image and position detection means for detecting position information.
- in this case, the control means can also execute a transmission control process of transmitting, from the communication means to the server device, image data obtained by imaging of the imaging means together with additional data including at least the position information detected by the position detection means at the time of capturing the image data.
- the display method of the present invention includes: a map display processing step of displaying a map image; a position designation processing step of setting position designation information based on an input to the map image; an image request transmission processing step of transmitting the position designation information to an external server device to request image data; and a display step of receiving the image data transmitted from the server device in response to the image request transmission processing and performing display based on the received image data.
- in the present invention, by designating a position on the map image, the user of the display device can see an image actually captured by an imaging device at the designated position.
- as the imaging device, an imaging device worn by another person, an imaging device attached to a car, a train, or the like, and an imaging device attached to an animal, a bird, or the like are conceivable.
- image data (moving images or still images) captured by these imaging devices is transmitted to the server device together with additional data including the position information of the imaging point, and stored in the server device. Therefore, in the server device, image data captured by various imaging devices at various locations is accumulated together with the position information.
- the server device can search for image data based on the position designation information by designating the position on the display device side and transmitting the position designation information to the server device.
- the server device searches for image data corresponding to the position designation information and transmits it to the display device.
- the display device displays the received image data. Then, the user of the display device can see a scene captured at the designated point as the display image.
- the map image referred to in the present invention is not limited to an image of a "map" representing roads, topography, buildings, and natural objects on the ground; it is assumed to include images of various charts by which a specific place can be designated, such as ocean charts, underwater topographic maps, aviation charts, and space charts.
- the user of the display device can designate a position on the map image and see an image actually captured by an imaging device at the designated position.
- the present invention can thus provide a system and apparatus that respond to various needs of the user, such as wanting to see and enjoy the scenery of a place that can be designated on the map, or wanting to know the current situation of such a place.
- being able to designate a position on a map image and view a captured image of that position means that the situation at the designated position, for example the terrain, the surrounding buildings and natural scenery, the road width and number of lanes, and the conditions at a given time of day, can be known, so the invention can also provide expanded information as a so-called navigation system.
- FIG. 1 is an explanatory diagram of an external appearance example of an imaging display device according to an embodiment of the present invention.
- FIG. 2 is an explanatory diagram of an appearance example of the imaging device and the display device according to the embodiment.
- FIG. 3 is an explanatory diagram of a system configuration example according to the embodiment.
- FIG. 4 is a block diagram of the imaging display device of the embodiment.
- FIG. 5 is a block diagram of the imaging apparatus according to the embodiment.
- FIG. 6 is a block diagram of a display device according to an embodiment.
- FIG. 7 is a block diagram of an imaging device and a display device according to the embodiment.
- FIG. 8 is a block diagram of the server device according to the embodiment.
- FIG. 9 is an explanatory diagram of an example of system operation according to the embodiment.
- FIG. 10 is an explanatory diagram of a point image database according to the embodiment.
- FIG. 11 is a flowchart of an example of system operation according to the embodiment.
- FIG. 12 is an explanatory diagram of an image at the time of designation using a map image in the embodiment.
- FIG. 13 is an explanatory diagram of a captured image of a designated point displayed in the embodiment.
- the imaging display device 1 or the display device 40 corresponds to a display device as claimed in the present invention, and the display method of the present invention is executed as the processing of the imaging display device 1 or the display device 40.
- the imaging display device 1 or the imaging device 30 corresponds to the imaging device referred to in the claims of the present invention. Therefore, the imaging display device 1 of the embodiment can function as both the display device and the imaging device in the claims of the present invention.
- Imaging display device, imaging device, display device, and server device
- FIG. 1 shows an example of the appearance of the imaging display device 1 of the embodiment.
- the imaging display device 1 can be worn by a user as a glasses-type display camera.
- the imaging display device 1 has a mounting unit with a frame structure that wraps halfway around the head from both temporal regions to the back of the head, and is worn by the user by being hung on both auricles as shown in the figure.
- in the worn state, a pair of display panel portions 2a and 2b for the left eye and the right eye are arranged immediately in front of the user's eyes, that is, at the positions where the lenses of ordinary glasses would be located.
- a liquid crystal panel is used for the display panel portions 2a and 2b.
- the imaging lens 3a is arranged facing forward so that imaging is performed with the user's field of view as the subject direction.
- the light emitting unit 4a that performs illumination in the imaging direction of the imaging lens 3a is provided by, for example, an LED (Light Emitting Diode).
- a pair of earphone speakers 5a that can be inserted into the right ear hole and the left ear hole of the user when worn are provided.
- Microphones 6a and 6b for collecting external sound are arranged on the right side of the display unit 2 for the right eye and on the left side of the display unit 2 for the left eye.
- FIG. 1 is an example, and various structures for the user to wear the imaging display device 1 can be considered.
- in general, it suffices that the display panel portions 2a and 2b are provided in the vicinity of the front of the user's eyes and that the imaging direction of the imaging lens 3a is the user's visual field direction, that is, the front of the user.
- although a pair of display panel portions 2a and 2b are provided corresponding to both eyes, a configuration in which one display unit is provided corresponding to one eye may also be used.
- the imaging direction of the imaging lens 3a may be a direction other than the viewing direction, such as the rear or side of the user.
- the earphone speaker 5a may not be a left and right stereo speaker, but only one earphone speaker 5a may be provided to be attached only to one ear.
- the microphone may be one of the microphones 6a and 6b. Furthermore, a configuration in which the microphone and the earphone speaker are not provided as the imaging display device 1 is also conceivable.
- in the above, an example in which the imaging display device 1 has a spectacle-type or head-mounted type mounting unit is shown.
- a headphone type, a neckband type, an ear hook type, or any other mounting unit may be worn by the user. Further, for example, it may be configured to be attached to the user by attaching to a normal eyeglass, visor, headphones, or the like with an attachment such as a clip. Moreover, it does not necessarily have to be worn on the user's head.
- the imaging display device 1 of FIG. 1 is an example in which the components for imaging and the display panel portions 2a and 2b that can monitor an image are formed in a body and worn by the user.
- the equipment worn by the user may also be configured as separate devices, an imaging device 30 and a display device 40 as shown in FIG. 2.
- the imaging device 30 in FIG. 2A is mounted on the user's temporal region by a predetermined mounting frame. In the mounted state, the imaging lens 3a and the light emitting unit 4a are arranged facing forward so that the user's visual field direction is the subject direction. A microphone 6a that collects external sound is also provided.
- the imaging device 30 is a device that does not have a display function and that captures the scene in the user's field of view while worn by the user. Note that, as with the imaging display device 1, various shapes, mounting structures, and components are conceivable for the imaging device 30.
- the display device 40 of FIG. 2 (b) is an example of a wristwatch-type display device, on which a display panel portion 2a that the user can view is formed, worn on the user's wrist with a wristband.
- although FIG. 2 (b) shows a wristwatch-type display device 40, various shapes and mounting structures are conceivable for the display device 40 worn or carried by the user.
- the display device 40 may be a portable device that can be carried by a user.
- a display device 40 (a device in which the imaging function is eliminated from the imaging display device 1 in FIG. 1) that can be worn by the user in the shape of glasses as shown in FIG. 1 is also conceivable.
- as the display device 40 that can be carried by the user, devices having a display function such as a mobile phone, a portable game machine, or a PDA (Personal Digital Assistant) can be used, as well as a dedicated device for monitoring display.
- in addition to devices worn or carried by the user, stationary display devices, computer devices, television receivers, in-vehicle display monitor devices, and the like can also be employed as the display device 40 of this example.
- the imaging device 30 and the display device 40 in FIGS. 2A and 2B may each be used individually, but a configuration in which the user wears both the imaging device 30 and the display device 40 and uses the two devices together as an imaging display device is also assumed. For example, the imaging device 30 and the display device 40 may perform data communication so that the display device 40 performs monitoring display of images captured by the imaging device 30 and display of images transmitted from an external device.
- the imaging display device 1 and the display device 40 are used by a user (person).
- the imaging device 30 is used mounted on various moving bodies, including but not limited to humans.
- the imaging device 30 as shown in FIG. 2 (a) is an example of a device that is worn by a person and images the person's field of view, but diverse imaging devices 30 mounted on other moving bodies are also conceivable.
- a moving body other than a human is a living organism other than a human being, a ground mobile device, a marine mobile device, an underwater mobile device, an aeronautical mobile device, a space mobile device, or the like.
- Non-human organisms include birds, mammals, reptiles, amphibians, fish, insects, and other types of organisms.
- examples of ground mobile devices include motor vehicles such as passenger cars, trucks, buses, taxis, and motorcycles, and human-powered vehicles such as bicycles, rickshaws, and play vehicles.
- Railway vehicles such as trains and steam locomotives are also assumed.
- vehicles for amusement parks, commercial vehicles for factories and other facilities are also assumed.
- as underwater mobile devices, submersible equipment such as submarines, submersible robots, and aqualung-type diving gear is envisioned.
- the imaging device 30 may have a shape or a mounting structure corresponding to each moving body to be mounted.
- the user of the imaging display device 1 or the display device 40 can designate a position in a displayed map image and thereby see an image captured at the corresponding position by another imaging display device 1 or an imaging device 30. That is, while viewing the map, the user can see on his or her own imaging display device 1 or display device 40 an image of the actual scene at a location shown on the map.
- FIG. 3 shows an example of the system configuration for this.
- as described above, the map image is not limited to a map of the ground, but includes ocean charts, underwater maps, aviation charts, and the like.
- FIG. 3 shows a system configuration in which the imaging display device 1, the display device 40, the imaging device 30, and the server device 70 communicate via the network 60.
- the network 60 may be a wide area network such as the Internet, but of course a LAN (Local Area Network) or other narrow area network is also assumed.
- the users A, B, and C are wearing the imaging display device 1 as shown in FIG. 1, for example.
- the user D wears the display device 40 as shown in FIG. 2B
- the user E wears the image pickup device 30 as shown in FIG.
- the imaging devices 30 mounted on the moving bodies G and H are assumed to be imaging devices mounted, in a shape suited to each, on moving bodies other than persons, such as automobiles, railway vehicles, or aircraft.
- the imaging device 30 worn by the user E and the imaging devices 30 mounted on the moving bodies G and H upload captured image data, together with additional data including at least the position information of the imaging location, to the server device 70 through communication via the network 60, constantly, periodically, or at arbitrary timing.
- similarly, the imaging display devices 1 worn by the users A, B, and C can upload captured image data, together with additional data including at least the position information of the imaging location, to the server device 70 by communication via the network 60, constantly, periodically, or at arbitrary timing.
- The server device 70 registers and stores the image data uploaded from the imaging display device 1 or the imaging device 30 in a point image database described later.
- The imaging display devices 1 worn by the users A, B, and C and the display devices 40 used by the users D and F access the server device 70 via the network 60 in response to the user designating a position on the map image, transmit the position designation information, and make an image request.
- the server device 70 searches the point image database based on the position designation information, and extracts image data captured at the position designated by the position designation information. Then, the image data is transmitted to the imaging display device 1 or the display device 40.
- the imaging display device 1 or the display device 40 receives the image data transmitted from the server device 70 and displays it.
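- The exchange described above can be sketched in a few lines. The following is a minimal illustration with an in-memory stand-in for the point image database; all names (`POINT_IMAGE_DB`, `handle_image_request`) and field layouts are assumptions for illustration, not the patent's actual implementation.

```python
# Point image database stand-in: image data plus additional data
# (position of the imaging location, date/time). Values are examples.
POINT_IMAGE_DB = [
    {"image": b"VD1", "lat": 35.6586, "lon": 139.7454, "date": "2007-11-05"},
    {"image": b"VD2", "lat": 34.6937, "lon": 135.5023, "date": "2007-11-06"},
]

def handle_image_request(lat, lon, tolerance=0.01):
    """Server side: search the database for an image captured near the
    designated latitude/longitude and return its image data."""
    for entry in POINT_IMAGE_DB:
        if abs(entry["lat"] - lat) <= tolerance and abs(entry["lon"] - lon) <= tolerance:
            return entry["image"]
    return None  # no image registered for that position

# Device A side: designate a position on the map and request the image.
received = handle_image_request(35.6586, 139.7454)
```

The same position designation information (latitude/longitude) flows from the device A to the server, which answers with the matching stored image data.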
- FIG. 4 shows a configuration example of the imaging display device 1.
- The system controller 10 includes, for example, a microcomputer including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a non-volatile resident memory unit, and an interface unit, and serves as a control unit that controls the entire imaging display device 1.
- the system controller 10 performs various arithmetic processes and exchanges control signals with each unit via the bus 13 based on a program stored in an internal ROM or the like, and causes each unit to execute a required operation.
- an imaging unit 3 is provided as a configuration for imaging a scene in the user's viewing direction.
- the imaging unit 3 includes an imaging optical system, an imaging element unit, and an imaging signal processing unit.
- The imaging optical system in the imaging unit 3 includes a lens system having the imaging lens 3a shown in FIG. 1, an aperture, a zoom lens, a focus lens, and the like, and a drive system for causing the lens system to perform a focus operation and a zoom operation.
- the image sensor section in the image capturing section 3 is provided with a solid-state image sensor array that detects imaging light obtained by the imaging optical system and generates an imaging signal by performing photoelectric conversion.
- the solid-state imaging device array is, for example, a CCD (Charge Coupled Device) sensor array or a CMOS (Complementary Metal Oxide Semiconductor) sensor array.
- The imaging signal processing unit in the imaging unit 3 includes a sample hold/AGC (Automatic Gain Control) circuit that performs gain adjustment and waveform shaping on the signal obtained by the solid-state imaging device, and a video A/D converter, and obtains captured image data as digital data. It also performs white balance processing, luminance processing, color signal processing, blur correction processing, and the like on the captured image data.
- Imaging is performed by the imaging unit 3 having these imaging optical system, imaging element unit, and imaging signal processing unit, and image data obtained by imaging is obtained.
- The system controller 10 controls on/off of the imaging operation in the imaging unit 3, drive of the zoom lens and focus lens of the imaging optical system, sensitivity and frame rate of the imaging element unit, and the parameters and executed processes of the imaging signal processing unit.
- the captured image data obtained by the imaging operation of the imaging unit 3 can be supplied to the display unit 2, the storage unit 25, and the communication unit 26 via the image processing unit 15.
- the image processing unit 15 performs processing for converting captured image data into a predetermined image data format and required signal processing for display on the display unit 2 under the control of the system controller 10.
- Signal processing for display on the display unit 2 includes, for example, brightness level adjustment, color correction, contrast adjustment, sharpness (contour emphasis) adjustment, screen division processing, character image composition processing, generation of an enlarged or reduced image, Image effect processing such as mosaic image / brightness inverted image / soft focus / partial highlighting in image / variation of color atmosphere of entire image.
- The image processing unit 15 also performs image data transfer processing among the imaging unit 3, the display unit 2, the storage unit 25, and the communication unit 26. That is, it performs processing for supplying captured image data from the imaging unit 3 to the display unit 2, the storage unit 25, and the communication unit 26, processing for supplying image data reproduced by the storage unit 25 to the display unit 2, and processing for supplying image data received by the communication unit 26 to the display unit 2.
- the display unit 2 is provided as a configuration for displaying the image to the user on the imaging display device 1.
- the display unit 2 is provided with display panel units 2a and 2b made of the above-described liquid crystal panel and the like, and a display driving unit for driving the display panel units 2a and 2b.
- The display driving unit is configured by a pixel driving circuit for displaying the image signal supplied from the image processing unit 15 on the display panel units 2a and 2b, which are liquid crystal displays, for example. That is, display is executed by applying a drive signal based on the video signal to each pixel arranged in a matrix in the display panel units 2a and 2b at predetermined horizontal/vertical drive timings. By this processing, display as an imaging monitor of the imaging unit 3, display of a reproduced image reproduced in the storage unit 25, and display of a received image received by the communication unit 26 are performed on the display panel units 2a and 2b.
- the display driving unit can also control the transmittance of each pixel of the display panel units 2a and 2b to be in the through state (transparent or translucent state).
- the system controller 10 performs on / off (through) control of the display operation in the display unit 2, instruction of processing parameters for image data to be displayed, screen area setting control, character generation instruction, and the like.
- the imaging display device 1 is provided with an audio input unit 6, an audio processing unit 16, and an audio output unit 5.
- The audio input unit 6 includes the microphones 6a and 6b shown in FIG. 1, a microphone amplifier unit for amplifying the audio signals obtained by the microphones 6a and 6b, and an A/D converter, and outputs audio data.
- the voice data obtained by the voice input unit 6 is supplied to the voice processing unit 16.
- The audio processing unit 16 controls the transfer of audio data according to the control of the system controller 10. That is, it supplies the audio data obtained by the audio input unit 6 to the audio output unit 5, the storage unit 25, or the communication unit 26.
- audio data reproduced by the storage unit 25 and audio data received by the communication unit 26 are supplied to the audio output unit 5.
- the sound processing unit 16 performs processing such as volume adjustment, sound quality adjustment, and sound effect based on the control of the system controller 10.
- the audio output unit 5 includes a pair of earphone speakers 5a shown in FIG.
- the audio data supplied from the audio processing unit is converted into an analog audio signal by the D / A converter, amplified by the amplifier circuit, and output as audio from the earphone speaker 5a.
- the user can listen to the external sound, or the sound based on the sound data reproduced by the storage unit 25 or the sound data received by the communication unit 26.
- the audio output unit 5 may be configured using a so-called bone conduction speaker.
- the storage unit 25 is a part that records and reproduces image data (and audio data) on a predetermined recording medium.
- As the predetermined recording medium, the storage unit 25 is realized, for example, as an HDD (Hard Disk Drive).
- Of course, various other recording media are conceivable, such as a solid-state memory like a flash memory, a memory card incorporating a solid-state memory, an optical disk, a magneto-optical disk, and a hologram memory; the storage unit 25 may simply be configured so that it can perform recording and reproduction according to the recording medium employed.
- The storage unit 25 records image data (and audio data) obtained by imaging and image data (and audio data) received by the communication unit 26 on the recording medium based on the control of the system controller 10. That is, image data supplied via the image processing unit 15, audio data supplied via the audio processing unit 16, and image data or audio data received by the communication unit 26 are encoded for recording and recorded on the recording medium.
- the storage unit 25 can reproduce the recorded image data and audio data according to the control of the system controller 10.
- the reproduced image data is supplied to the display unit 2 through the image processing unit 15, and the audio data is supplied to the audio output unit 5 through the audio processing unit 16.
- the reproduced image data / audio data can also be supplied to the communication unit 26 as data to be transmitted to an external device.
- the communication unit 26 transmits / receives data to / from an external device, particularly the server device 70, via the network 60 in FIG.
- the communication unit 26 may be configured to perform network communication via short-range wireless communication with a network access point using a method such as wireless LAN or Bluetooth.
- the captured image data obtained by the imaging unit 3 is supplied to the communication unit 26 via the image processing unit 15.
- the voice data obtained by the voice input unit 6 is supplied via the voice processing unit 16.
- the communication unit 26 can perform a communication encoding process on the image data and audio data, perform modulation for wireless transmission, and transmit the result to an external device. That is, it is possible to transmit the image data / sound data currently captured and collected by the imaging display device 1 to an external device (server device 70).
- the image data and audio data reproduced by the storage unit 25 can be encoded for communication, modulated for wireless transmission, and transmitted to an external device.
- When image data/audio data currently being captured and collected by the imaging display device 1 is transmitted to, for example, the server device 70, the system controller 10 generates additional data, which is encoded and transmitted by the communication unit 26 together with the image data/audio data.
- the additional data includes image data management information, current position information detected by the position detector 12 (to be described later), and current date information counted by the date / time counter 28.
- the image data / sound data that has been picked up and collected is temporarily stored in the storage unit 25, read out from the storage unit 25 at a later time, and transmitted from the communication unit 26 to the server device 70.
- When storing the image data/audio data in the storage unit 25, the system controller 10 generates additional data including the above contents and records it in the storage unit 25 together with the image data/audio data.
- When the recorded image data/audio data is read from the storage unit 25 and transmitted to the server device 70, the additional data recorded with it is also transmitted.
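- The additional data described above can be sketched as a small builder function. This is an illustrative assumption only: the field names and structure below are not taken from the patent, which specifies only that management information, position information, and date/time information accompany the image data.

```python
from datetime import datetime

def make_additional_data(position, management_info, captured_at=None):
    """Build additional data to accompany image/audio data: image
    management information, the position from the position detection
    unit 12, and the date/time from the date/time counting unit 28.
    Field names are assumptions for illustration."""
    captured_at = captured_at or datetime.now()
    return {
        "management": management_info,   # e.g. image type, data size
        "position": position,            # (latitude, longitude) of the imaging location
        "datetime": captured_at.strftime("%Y-%m-%d %H:%M:%S"),
    }

# Example: additional data for a clip captured at a known place and time.
extra = make_additional_data((35.6586, 139.7454), {"type": "video"},
                             datetime(2007, 11, 5, 12, 0, 0))
```

The same structure serves both the immediate-transmission case and the record-then-transmit case, since the additional data is generated once at imaging time.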
- the communication unit 26 receives image data / audio data transmitted from the external device (server device 70), demodulates it, and supplies it to the image processing unit 15 and the audio processing unit 16. In this case, the display unit 2 and the audio output unit 5 output the received image and audio.
- the image data / audio data received by the communication unit 26 can be supplied to the storage unit 25 and recorded on the recording medium.
- the imaging display device 1 is provided with an illumination unit 4 and an illumination control unit 14.
- The lighting unit 4 includes the light emitting unit 4a shown in FIGS. 1 and 2 and a light emitting circuit that causes the light emitting unit 4a (for example, an LED) to emit light.
- the illumination control unit 14 causes the illumination unit 4 to perform a light emission operation based on an instruction from the system controller 10.
- Since the light emitting unit 4a in the illumination unit 4 is attached as shown in FIG. 1 or 2, the illumination unit 4 performs an illumination operation in the subject direction of the imaging lens 3a.
- an operation input unit 11 is provided for user operation.
- The operation input unit 11 may have operation elements such as keys and dials so as to detect a user's operation as a key operation or the like, or it may have a configuration that detects a conscious behavior of the user.
- For example, an acceleration sensor, an angular velocity sensor, a vibration sensor, a pressure sensor, and the like may be provided.
- For example, when the user taps the imaging display device 1 from the side, the acceleration sensor or vibration sensor detects this, and the system controller 10 can recognize it as a user operation when the lateral acceleration exceeds a certain value. The acceleration sensor or angular velocity sensor can also distinguish whether the user struck the side part (the part corresponding to the temple of a pair of eyeglasses) from the right side or from the left side, and each can be determined as a predetermined operation. Other user motions can likewise be detected by the acceleration sensor or the angular velocity sensor and recognized by the system controller 10 as user operations.
- Biometric information includes pulse rate, heart rate, electrocardiogram information, myoelectricity, respiratory information (for example, respiratory rate, depth, and ventilation volume), perspiration, GSR (galvanic skin response), blood pressure, blood oxygen saturation, skin surface temperature, electroencephalogram (for example, α wave, β wave, θ wave, and δ wave information), blood flow change, eye condition, and the like.
- the system controller 10 may recognize the detection information of the biosensor as a user operation input.
- A user's conscious behavior may also be detected as eye movement (a change of gaze direction or blinking); for example, if it is detected that the user has blinked three times in succession, this may be determined to be a specific operation input. In response to such detection, the system controller 10 may, for example, turn the power on or off.
- The operation input unit 11 supplies the information obtained from the operation elements, the acceleration sensor, the angular velocity sensor, the vibration sensor, the pressure sensor, the biosensor, and the like to the system controller 10, and the system controller 10 detects the user's operation from this information.
- the imaging display device 1 includes a position detection unit 12.
- the position detector 12 is, for example, a GPS receiver.
- the GPS receiver receives radio waves from a GPS (Global Positioning System) satellite, and outputs latitude / longitude information as the current position to the system controller 10.
- When image data/audio data and additional data are transmitted to the server device 70, the position information at the time of imaging detected by the position detection unit 12 is included in the additional data.
- The position detection unit 12 may use a position information service provided via WiFi (Wireless Fidelity) or by a mobile phone company, either instead of or in combination with GPS.
- the detection position can be corrected by detecting the walking speed (the vehicle speed or the like in the case of a device mounted on an automobile or the like).
- the date and time counting unit 28 counts the current date and time (year / month / day / hour / minute / second).
- the system controller 10 can recognize the current date and time from the count value in the date and time counting unit 28.
- When image data/audio data and additional data are transmitted to the server device 70, the date/time information at the time of imaging counted by the date/time counting unit 28 is included in the additional data.
- the imaging display device 1 can display a map image on the display unit 2.
- a map database 29 is provided for displaying the map image.
- The map database 29 may be stored, for example, in a partial area of the storage unit 25 when the storage unit 25 is an HDD or the like.
- the map database 29 includes map image generation information corresponding to position information, additional information such as names of points, search information, and the like as information for displaying a map in the same manner as a so-called navigation system. It is a database.
- the system controller 10 can execute map search and map display processing using the map database 29.
- The imaging device 30 in FIG. 5 has a configuration in which the display unit 2 and the audio output unit 5, which form the image and audio output system of the imaging display device 1 in FIG. 4, and the map database 29, which holds information for map display, are eliminated.
- While worn by the user as shown in FIG. 2 or mounted on one of the various moving bodies described above, the imaging device 30 captures images with the imaging unit 3 and can transmit the captured image data from the communication unit 26 to an external device or record it in the storage unit 25.
- the system controller 10 controls an imaging operation, a communication operation, a recording operation, and the like.
- A configuration example of the display device 40 is as shown in FIG. 6. Here again, the same parts as those in FIG. 4 are given the same reference numerals.
- The display device 40 in FIG. 6 has a configuration in which the imaging unit 3 and the audio input unit 6, which provide the imaging and audio input functions of the imaging display device 1 in FIG. 4, are eliminated. The illumination unit 4 and the illumination control unit 14, which assist imaging, are not provided either.
- Since the display device 40 is not a device that transmits image data/audio data to the server device 70, it need not include the position detection unit 12 or the date/time counting unit 28, which generate the position information and date/time information included in the additional data at the time of transmission. The system controller 10 likewise need not have the function of executing transmission control processing of image data/audio data to the server device 70.
- The display device 40 is a device worn by the user in the form shown in FIG. 2B, carried by the user, or owned by the user as a device installed in a home, a car, or the like, and its communication unit 26 receives image data/audio data transmitted from an external device.
- the received image data / audio data is output from the display unit 2 and the audio output unit 5 or recorded in the storage unit 25.
- the system controller 10 controls communication operation, display operation, audio output operation, recording operation, and the like.
- the communication unit 26 may be configured to perform network communication by wired connection.
- a configuration example as shown in FIG. 7 is considered as the imaging device 30 and the display device 40.
- the imaging device 30 is configured to have substantially the same configuration as the imaging display device 1 of FIG.
- the display unit 2 is not provided in the imaging device 30, and a transmission unit 27 is provided instead.
- the transmission unit 27 performs encoding processing for transmission to the display device 40 on the image data supplied from the image processing unit 15 for display monitoring.
- the encoded image data is transmitted to the display device 40.
- the display device 40 is provided with a receiving unit 41, a display control unit 42, and a display unit 2.
- The receiving unit 41 performs data communication with the transmission unit 27 of the imaging device 30, receives the image data transmitted from the imaging device 30, and decodes it.
- the image data decoded by the receiving unit 41 is supplied to the display control unit 42.
- the display control unit 42 performs processing such as signal processing for display, screen division, and character synthesis on the image data to generate a display image signal.
- The display control unit 42 supplies the display image signal to the display unit 2, which has the display panel unit 2a such as a liquid crystal display, and display is performed by applying drive signals based on that signal.
- FIG. 8 shows a configuration example of the server device 70.
- the server device 70 includes a server control unit 72, a network storage unit 71, and a communication unit 73.
- the network storage unit 71 is realized by, for example, an HDD or the like, and stores a point image database to be described later.
- the point image database is a database in which image data / audio data and additional data transmitted from the imaging device 30 or the imaging display device 1 via the network 60 are accumulated.
- the communication unit 73 performs data communication via the network 60 with the communication unit 26 of the imaging display device 1, the imaging device 30, and the display device 40.
- The server control unit 72 performs the operation control necessary for the server device 70. Specifically, it controls the communication operations with the imaging display device 1, the imaging device 30, and the display device 40, the storage of image data/audio data in the network storage unit 71, search processing, and the like.
- the device A is the imaging display device 1 or the display device 40 in FIG.
- the device B is the imaging display device 1 or the imaging device 30 in FIG.
- The device A is a device used by a certain user to receive, from the server device 70, and display an image captured by another moving body; it corresponds to the "display device" referred to in the claims of the present invention.
- The device B is a device that uploads images to the server device 70, and is mounted on or worn by a person, a living thing, a vehicle, or another moving body as described above; it corresponds to the "imaging device" referred to in the claims.
- FIG. 9 shows the imaging display device 1 or the display device 40 as the device A, the server device 70, and the imaging display device 1 or the imaging device 30 as the device B.
- the imaging device 30 and the imaging display device 1 as the device B perform processing for transmitting the captured image data (and audio data) to the server device 70.
- For example, imaging may be performed constantly and the captured image data (and audio data) transmitted to the server device 70 at all times. Alternatively, imaging and transmission may be performed only at limited times, for example periodically, or, when the device B is a device worn by a user, in response to a user operation.
- Alternatively, the server device 70 may designate position information and transmit imaging request data, so that when the system controller 10 of the device B determines that the current position is the requested position, imaging and transmission are performed automatically.
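- The position check that device B would make for such a request can be sketched as follows. This is a minimal illustration under stated assumptions: the function name and the degree-based threshold are not from the patent, which only says the device determines that the current position is the requested position.

```python
def at_requested_position(current, requested, threshold=0.001):
    """Sketch of device B's check: the requested position counts as
    reached when both latitude and longitude are within a small
    threshold (the threshold value is illustrative)."""
    return (abs(current[0] - requested[0]) <= threshold and
            abs(current[1] - requested[1]) <= threshold)

# While moving, device B compares its detected position with the request.
trigger = at_requested_position((35.6590, 139.7450), (35.6586, 139.7454))
```

When the check succeeds, device B would perform imaging and transmit the image data with its additional data to the server device 70.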
- When the imaging device 30 or the imaging display device 1 as the device B transmits image data/audio data to the server device 70, it also transmits the additional data.
- the additional data includes image management information regarding image data / audio data to be transmitted, position information of an imaging location, and date / time information at the time of imaging.
- the server device 70 stores image data (and audio data) and additional data transmitted from each of the imaging display device 1 and the imaging device 30 in the point image database in the network storage unit 71.
- That is, the server control unit 72 performs processing to receive the image data (and audio data) and the additional data from the device B via the communication unit 73 and to register the received data in the point image database in the network storage unit 71.
- FIG. 10 schematically shows the recorded contents of the point image database.
- The image data transmitted from each imaging display device 1 or imaging device 30 as the device B is registered as entries #1, #2, ... in predetermined units.
- The predetermined unit may be a unit whose upper limit is determined according to the image data size, playback time, or the like; alternatively, when image data is transmitted as continuous video from the device B, a new unit may be started at each point where the position information of the imaging position changes.
- the unit for dividing image data as one entry depends on the image transmission mode from device B, the transmission duration, the convenience of management information in the point image database, the convenience of the image provision service of server device 70, etc. Can be determined appropriately.
- Image data (including audio data) VD1, VD2, VD3, ... is recorded as each entry, and in association with each of these image data, position information P1, P2, P3, ..., date/time information Date1, Date2, Date3, ..., and image management information C1, C2, C3, ... are recorded.
- the position information, date and time information, and image management information are added as additional data together with the image data.
- The position information P1 is position information of the place where the image data VD1 was captured, for example latitude and longitude information.
- The date/time information Date1 is the year/month/day/hour/minute/second at which the image data VD1 was captured.
- The image management information C1 is management information of the image data VD1, and includes, for example, the image type (moving image/still image distinction, etc.), data size, compression method, identification information of the device B, and imaging position related information such as imaging point name information.
- The imaging position related information may be obtained by the system controller 10 of the device B, which searches the map database 29 in the imaging display device 1 or the imaging device 30 based on the current position information at the time of imaging and adds the result to the transmitted data; alternatively, the server device 70 may be provided with a map database and acquire the information by searching it based on the position information P1.
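- The entry structure of Fig. 10 can be modeled as a small record type. The field names below are illustrative assumptions corresponding to the image data VDn, position information Pn, date/time information Daten, and image management information Cn described above; the sample values are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class PointImageEntry:
    """One entry of the point image database sketched in Fig. 10."""
    image_data: bytes                  # VDn: image (and audio) data
    position: tuple                    # Pn: latitude/longitude of the imaging location
    date_time: str                     # Daten: year/month/day/hour/minute/second
    management: dict = field(default_factory=dict)  # Cn: type, size, device ID, point name

# A sample entry (all values hypothetical).
entry1 = PointImageEntry(
    image_data=b"VD1",
    position=(35.6586, 139.7454),
    date_time="2007-11-05 12:00:00",
    management={"type": "video", "device": "B-001", "point": "Tokyo Tower"},
)
```

Keeping the position as a separate indexed field is what makes the later search by position designation information straightforward.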
- As the imaging display device 1 or the imaging device 30 as the device B performs the operation of transmitting image data and additional data in this way, image data of each point is accumulated in the point image database as shown in Fig. 10.
- The server device 70 can thereby provide an image providing service to the user of the device A. That is, in response to an image request from the device A, it can reproduce image data stored in the point image database and transmit the reproduced image data to the device A.
- the imaging display device 1 or the display device 40 as the device A communicates with the server device 70 to acquire and display image data from the server device 70.
- this is an operation example when the user of the device A enjoys the image providing service by the server device 70.
- The processing of the device A shown in FIG. 11 can be regarded as the control processing of the system controller 10 of the imaging display device 1 or the display device 40, and the processing of the server device 70 as the control processing of the server control unit 72.
- First, in step F100, a position designation process using a map image is performed.
- the system controller 10 of the device A displays a map using data in the map database 29.
- A certain area on the map is searched for or scrolled to according to the user's operation, which allows the user to view the map image of a specific district or area on the display unit 2.
- The system controller 10 changes the displayed district and scale in response to the user specifying a place name or scale or performing a scroll operation, so that, for example, a map image of a certain area is displayed as shown in FIG. 12(a).
- The system controller 10 displays a pointer PT on such a map image, for example as shown in FIG. 12(b), and the pointer PT can be moved on the map image by user operation. That is, by moving the pointer PT to an arbitrary position on the map image and performing a predetermined operation, the user can designate a specific position on the map image.
- Of course, instead of using the pointer PT, a touch panel operation function may be added to the display unit 2 so that the position touched by the user's finger on the map image becomes the designated position.
- The system controller 10 generates position designation information in response to the user performing an operation of designating a certain point on the map image; that is, position designation information including the latitude and longitude values of the point the user designated on the map is generated.
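- Turning a designated point on the displayed map into latitude/longitude values can be sketched as below. This is an assumption-laden illustration: it uses a simple linear mapping over the visible map area, whereas a real map display would apply its actual projection; the function name and parameter layout are invented.

```python
def to_position_designation(px, py, bounds, size):
    """Convert a designated point (pixels) on the displayed map into
    position designation information (latitude, longitude).
    `bounds` is (lat_top, lon_left, lat_bottom, lon_right) of the
    visible area; `size` is the map image size (width, height)."""
    lat_top, lon_left, lat_bottom, lon_right = bounds
    width, height = size
    lon = lon_left + (lon_right - lon_left) * px / width
    lat = lat_top + (lat_bottom - lat_top) * py / height
    return (lat, lon)

# Pointer PT at the centre of a map pane covering a one-degree square.
pos = to_position_designation(50, 50, (36.0, 139.0, 35.0, 140.0), (100, 100))
```

The resulting latitude/longitude pair is what is sent to the server device 70 as the position designation information.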
- the server device 70 is accessed in step F101. That is, a communication connection state is established with the server device 70. At this time, the image request information and the position designation information are transmitted to the server device 70.
- the server control unit 72 of the server device 70 establishes a communication connection with the device A and accepts an image request and position designation information in step F300.
- In step F301, a reproduced image is specified. That is, the point image database in the network storage unit 71 is searched based on the received position designation information, and an entry whose position information matches the position designation information is extracted.
- an entry having position information within a predetermined range that can be regarded as a neighborhood from the position of latitude / longitude specified by the position specification information may be extracted.
- the image data of the extracted entry is specified as a reproduction image.
- For example, the image data of the entry with the latest date/time may be selected as the playback image by referring to the date/time information, or all of the extracted entries may be determined as playback images to be played back in order.
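- The search of step F301 combined with the latest-entry selection can be sketched as follows; the entry field names and the degree-based radius are illustrative assumptions, not the patent's implementation.

```python
def specify_playback_entry(entries, lat, lon, radius=0.01):
    """Extract entries whose imaging position lies within a
    predetermined range of the designated latitude/longitude, then
    choose the one with the latest date/time as the playback image."""
    nearby = [e for e in entries
              if abs(e["lat"] - lat) <= radius and abs(e["lon"] - lon) <= radius]
    if not nearby:
        return None  # no image data exists for that position
    # ISO-style date strings compare chronologically.
    return max(nearby, key=lambda e: e["date"])

entries = [
    {"image": b"VD1", "lat": 35.6586, "lon": 139.7454, "date": "2007-11-05"},
    {"image": b"VD2", "lat": 35.6590, "lon": 139.7450, "date": "2007-11-06"},
]
chosen = specify_playback_entry(entries, 35.6586, 139.7454)
```

The `None` return corresponds to the case described next, where no entry can be extracted and the device A must be notified that the image cannot be provided.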
- In some cases, not even one entry can be extracted; that is, no image data has yet been uploaded from any device B for the position (or a nearby position) indicated by the position designation information.
- the server control unit 72 notifies the device A that the corresponding image data does not exist and the image cannot be provided. In response to this, the device A side notifies the user that the image cannot be provided and ends the process.
- After searching the point image database and specifying the entry (image data) to be played back, the server control unit 72 causes the network storage unit 71 to play back the image data specified as the playback image in step F302.
- the reproduced image data and audio data are transmitted from the communication unit 73 to the device A.
- In step F102, the image data transmitted from the server device 70 is received and displayed. That is, the system controller 10 of the device A supplies the image data (and audio data) received and demodulated by the communication unit 26 to the image processing unit 15 and the audio processing unit 16, and executes display output on the display unit 2 and audio output from the audio output unit 5.
- the system controller 10 of the device A monitors whether or not a communication release request has been transmitted from the server device in step F103.
- In step F104, it is determined whether or not the image display is to be ended. For example, when the user performs a display end operation using the operation input unit 11, it is determined that the display is to be ended. Alternatively, reception and display may be terminated automatically after a predetermined time.
- The display output (and audio output) of the data received in step F102 is continued until reception of a communication cancellation request is confirmed in step F103, or until the end of display is determined in step F104.
- On the server device 70 side, whether or not the reproduction of the image data of the entry being reproduced has been completed is monitored in step F303, and whether or not there is a communication cancellation request from the device A is monitored in step F304. Until either condition is detected, image reproduction and transmission of the image data (and audio data) are continued.
- Accordingly, during this time, the user of the device A can view an actual scene image captured in the past by the device B at the position designated on the map image.
- When the system controller 10 of the device A determines in step F104 that the display is to be ended due to a user operation or another end condition, the process proceeds to step F105, where the communication unit 26 is caused to transmit a communication cancellation request to the server device 70, and the process then proceeds to step F106.
- Upon receiving the communication cancellation request, the server device 70 proceeds from step F304 to F306.
- When the server control unit 72 determines in step F303 that the reproduction and transmission of the entry to be reproduced have been completed, it proceeds from step F303 to F305 and causes the communication unit 73 to transmit a communication cancellation request to the device A.
- Upon confirming reception of the communication cancellation request in step F103, the device A side proceeds from step F103 to F106.
- the system controller 10 of the device A performs communication connection cancellation processing with the server device 70 in step F106. Further, the server control unit 72 of the server device 70 terminates the communication with the device A and the server processing in step F306. This completes the system operation.
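The exchange in steps F102 through F106 (display received data until either the server's cancellation request arrives or the user ends the display, then release the connection) can be sketched as a simple receive loop. The message framing and callback names below are illustrative assumptions, not the actual protocol of the embodiment:

```python
def device_a_receive_loop(recv, user_wants_end, display, send_cancel):
    """Sketch of steps F102-F106 on the device A side.

    recv() returns image-data messages, or the string "CANCEL" when the
    server device 70 has finished reproducing the entry (steps F303/F305).
    Returns the number of messages displayed.
    """
    displayed = 0
    while True:
        msg = recv()                 # F102: receive data from server device 70
        if msg == "CANCEL":          # F103: server requested cancellation
            break
        display(msg)                 # F102: display output (and audio output)
        displayed += 1
        if user_wants_end():         # F104: user display-end operation
            send_cancel()            # F105: transmit cancellation request
            break
    # F106: communication connection release processing would follow here
    return displayed
```

Either side can end the session: the server by sending "CANCEL" when playback finishes, or the client by invoking `send_cancel` after a user operation.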
- Through the above processing, the user of the device A can arbitrarily designate a certain position on the map image and view, on the device A that the user wears or carries, a scene captured by a device B at that position.
- An example of an image that can be viewed by the user of the device A is shown in FIG. For example, when the user designates a certain point, an image captured in the past by the device B at that point can be viewed, as shown in (a) and (b) of FIG.
- For example, an image captured at the designated point by the imaging device 30 attached to a car, or by the imaging device 30 or the imaging display device 1 worn by a person, that is, an image captured by some device B, can be viewed.
- the user of device A can view an image actually captured at a point by designating a point on the map.
- In the above example, the user of the device A specifies only a position on the map image; however, in step F100 of FIG. 11, the user may also be allowed to specify a date and time.
- That is, in step F100, the system controller 10 generates the position designation information in response to the user's position designation operation while the map image is displayed as described above.
- In addition, input specifying a date and time, in a menu format or a numeric input format, is also accepted.
- For example, a specific date, a specific time, or the like can be specified, or a date range (for example, from one day to another) or a time range (for example, from one time to another) can be specified.
- Further, a search condition based on the input date and time may be selected. Search conditions include "include specified date", "within specified date range", "before specified date", and "after specified date". The specified date and time, or the specified date and time together with the search condition, is set as the date and time designation information.
- In this case, in step F301, the server control unit 72 searches the point image database using both the position designation information and the date and time designation information.
- That is, entries whose position information matches (or is close to) the position designation information are extracted from the point image database, and the extracted entries are further narrowed down to those whose date and time information satisfies the condition of the date and time designation information.
- As a result, the user of the device A can select and view, for example, a scene of a certain point at a certain time, a recent scene of a certain point, a scene of a certain point before a certain time, a daytime scene of a certain point, a night scene of a certain point, and so on.
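The second narrowing stage of step F301 might be sketched as follows; representing timestamps as epoch seconds and mapping the four search conditions listed above to string keys are assumptions made for illustration:

```python
DAY = 86400  # seconds in one day

def matches_datetime(ts, condition, start, end=None):
    """Apply one of the search conditions to an entry's date and time."""
    if condition == "include specified date":
        return start <= ts < start + DAY
    if condition == "within specified date range":
        return start <= ts <= end
    if condition == "before specified date":
        return ts < start
    if condition == "after specified date":
        return ts >= start
    raise ValueError("unknown search condition: " + condition)

def narrow_entries(position_matches, condition, start, end=None):
    """Narrow position-matched entries, given as (timestamp, image_id)
    pairs, down to those satisfying the date and time designation info."""
    return [e for e in position_matches
            if matches_datetime(e[0], condition, start, end)]
```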
- As described above, the user of the imaging display device 1 or the display device 40 corresponding to the device A can view an image actually captured by an imaging device 30 at a position designated on the map image. This makes it possible to view and enjoy the spectacle of a place that can be specified on the map, and to know the situation of such a place, thereby meeting a wide variety of user needs.
- Further, the fact that a position can be designated on a map image and a captured image of that position can be viewed means that the circumstances of the designated position, such as the terrain, the surrounding buildings and natural scenery, and the width of the road, can be conveyed; it is thus also possible to provide expanded information by applying this to a so-called navigation system.
- Since the imaging display device 1 or the imaging device 30 serving as the device B is attached to a moving body, including a person, there is the advantage that no fixed facility such as a so-called fixed-point camera needs to be provided.
- Moreover, since the imaging display device 1 or the imaging device 30 mounted on a moving body captures images while moving among various points, actual captured images of various points are naturally collected, and there is also the advantage that the point image database registered in this way can easily be enriched.
- Various configuration examples and processing examples can be considered for the imaging display device 1 or the display device 40 as the device A corresponding to the display device of the present invention, for the imaging display device 1 or the imaging device 30 as the device B corresponding to the imaging device recited in the claims of the present invention, and for the server device 70.
- Image capturing in the device B is not limited to normal image capturing followed by uploading to the server device 70; various imaging operations may be performed, or the server device 70 may request various imaging operations. Examples include: telephoto imaging; wide-angle imaging; imaging while zooming in or out between telephoto and wide-angle; image enlargement imaging; image reduction imaging; variable-frame-rate imaging (high-frame-rate or low-frame-rate imaging); high-luminance imaging; low-luminance imaging; variable-contrast imaging; variable-sharpness imaging; imaging with increased sensitivity; imaging with increased infrared imaging sensitivity; imaging with increased ultraviolet imaging sensitivity; imaging with a specific wavelength band cut; imaging with added image effects such as mosaic processing of the captured image data, luminance inversion processing, soft-focus processing, partial enhancement within the image, or variation of the color atmosphere of the entire image; and still-image capturing.
- Similarly, when requesting an image from the server device 70, a special reproduction image may be requested in addition to normal reproduction of an image registered in the point image database.
- For example, for image data stored as a moving image, it is possible to request that only one frame be transmitted as a still image, or to request image data for slow playback, high-speed playback, or the like.
- Since the image data is captured by a device B attached to a moving body, the image data registered in the point image database is often captured while moving at a certain speed. Therefore, when the server device 70 reproduces such image data at a variable speed and transmits it to the device A, the user of the device A can view, in a pseudo manner, an image of the scene as it would appear when moving at high speed, or when moving slowly.
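Such variable-speed reproduction could be approximated on the server side by index resampling of the stored frames; the following is only a sketch (actual playback would operate on timed video streams, not Python lists):

```python
def resample_frames(frames, speed):
    """Pseudo variable-speed playback: speed > 1 drops frames (high-speed
    viewing), speed < 1 duplicates frames (slow viewing)."""
    if speed <= 0:
        raise ValueError("speed must be positive")
    if not frames:
        return []
    n = max(1, round(len(frames) / speed))
    # Map each output index back onto the source frame list.
    return [frames[min(int(i * speed), len(frames) - 1)] for i in range(n)]
```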
- Further, on the device A side, the image data transmitted from the server device 70 may be recorded in the storage unit 25 together with the position information. That is, the system controller 10 causes the storage unit 25 to record the received image data in correspondence with the position information (for example, the position designation information set in step F100 of FIG. 11). In this way, the image data stored in the storage unit 25 can be replayed later.
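Recording received image data keyed by the position designation information could look like the minimal sketch below; the in-memory dictionary merely stands in for whatever recording medium the storage unit 25 actually uses:

```python
class PositionImageStore:
    """Sketch of storing received image data together with position
    information so that it can be replayed later without issuing a new
    request to the server device 70."""

    def __init__(self):
        self._by_position = {}

    def record(self, position, image_data):
        # position: e.g. a (latitude, longitude) tuple from step F100
        self._by_position.setdefault(position, []).append(image_data)

    def replay(self, position):
        # Return everything previously recorded for this position.
        return list(self._by_position.get(position, []))
```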
- the moving direction from a certain point may be designated.
- In that case, in the device B, the position detection unit 12 also detects the moving direction, which is included in the additional data transmitted to the server device 70. The user of the device A can then request the server device 70 to select image data captured while moving in the desired direction.
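Selecting image data by moving direction amounts to a bearing comparison; the sketch below assumes headings stored in degrees and an illustrative 45° tolerance, neither of which is specified by the description:

```python
def bearing_diff(a, b):
    """Smallest absolute difference between two compass bearings (degrees)."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def filter_by_direction(entries, desired_deg, tolerance_deg=45.0):
    """Keep entries whose recorded moving direction (entry['heading'])
    lies within the tolerance of the direction requested by device A."""
    return [e for e in entries
            if bearing_diff(e["heading"], desired_deg) <= tolerance_deg]
```

The modular difference handles the wrap-around at north, so a heading of 350° still matches a requested direction of 0°.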
- In the above example, a position is designated using a map image of the ground; however, a position may instead be designated using an ocean chart, an underwater topographic map, an aeronautical chart, a space chart, or the like. For example, if image data captured by a ship, a submarine, an aircraft, a satellite, or the like is registered in the point image database together with its position information, images can be retrieved based on designation on these charts and provided to the device A. Of course, as position information for the sea, the sky, and space, not only latitude and longitude but also altitude and depth information may be added.
- For example, by designating a position on a space chart, the user can view an image captured in the past by an artificial satellite at or near that point.
- In the above example, the device A is equipped with the map database 29 and displays the map on its own side; alternatively, the server device 70 may include the map database and generate a map image based on it, which is then transmitted to and displayed on the device A side.
- In that case, the user of the device A inputs a specific place name, address, or the like, which is transmitted to the server device 70 as position designation information; the server device 70 then generates map image data for display from the place name or the like and transmits it to the device A for display.
- This allows the operation of the present invention to be realized even in an imaging display device 1 or a display device 40 that is not equipped with the map database 29.
- In the above example, image data and audio data are uploaded from the device B to the server device 70 and provided from the server device 70 to the device A. An example in which only image data is handled is also conceivable. Alternatively, the present invention may be applied as a system that handles only audio data.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Library & Information Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Remote Sensing (AREA)
- Computer Hardware Design (AREA)
- Controls And Circuits For Display Device (AREA)
- Studio Devices (AREA)
- Television Signal Processing For Recording (AREA)
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020097011531A KR101329419B1 (ko) | 2006-12-07 | 2007-11-05 | 화상 표시 시스템, 표시 장치, 표시 방법 |
EP07831223.8A EP2090989B1 (en) | 2006-12-07 | 2007-11-05 | Image display system, display device and display method |
CA2670744A CA2670744C (en) | 2006-12-07 | 2007-11-05 | Image display system, display apparatus, and display method |
BRPI0719940A BRPI0719940B1 (pt) | 2006-12-07 | 2007-11-05 | sistema de exibição de imagem, aparelho de exibição, e, método de exibição |
US12/312,615 US8473544B2 (en) | 2006-12-07 | 2007-11-05 | Image display system, display apparatus, and display method |
US13/903,302 US8949324B2 (en) | 2006-12-07 | 2013-05-28 | Image display system, display apparatus, and display method |
US14/511,059 US20150022427A1 (en) | 2006-12-07 | 2014-10-09 | Image display system, display apparatus, and display method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006-330832 | 2006-12-07 | ||
JP2006330832A JP5176311B2 (ja) | 2006-12-07 | 2006-12-07 | 画像表示システム、表示装置、表示方法 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/312,615 A-371-Of-International US8473544B2 (en) | 2006-12-07 | 2007-11-05 | Image display system, display apparatus, and display method |
US13/903,302 Continuation US8949324B2 (en) | 2006-12-07 | 2013-05-28 | Image display system, display apparatus, and display method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008068990A1 true WO2008068990A1 (ja) | 2008-06-12 |
Family
ID=39491896
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/071491 WO2008068990A1 (ja) | 2006-12-07 | 2007-11-05 | 画像表示システム、表示装置、表示方法 |
Country Status (10)
Country | Link |
---|---|
US (3) | US8473544B2 (ja) |
EP (1) | EP2090989B1 (ja) |
JP (1) | JP5176311B2 (ja) |
KR (1) | KR101329419B1 (ja) |
CN (1) | CN101578602A (ja) |
BR (1) | BRPI0719940B1 (ja) |
CA (1) | CA2670744C (ja) |
MY (1) | MY151947A (ja) |
TW (1) | TW200837713A (ja) |
WO (1) | WO2008068990A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2136556A1 (en) * | 2008-06-17 | 2009-12-23 | Samsung Electronics Co., Ltd. | Method and apparatus for recording and reproducing motion picture with location information |
Families Citing this family (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5017989B2 (ja) | 2006-09-27 | 2012-09-05 | ソニー株式会社 | 撮像装置、撮像方法 |
JP2008096868A (ja) | 2006-10-16 | 2008-04-24 | Sony Corp | 撮像表示装置、撮像表示方法 |
JP5023663B2 (ja) * | 2006-11-07 | 2012-09-12 | ソニー株式会社 | 撮像装置、撮像方法 |
JP5092357B2 (ja) * | 2006-11-07 | 2012-12-05 | ソニー株式会社 | 撮像表示装置、撮像表示方法 |
JP4367663B2 (ja) * | 2007-04-10 | 2009-11-18 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム |
JP4506795B2 (ja) * | 2007-08-06 | 2010-07-21 | ソニー株式会社 | 生体運動情報表示処理装置、生体運動情報処理システム |
JP5369974B2 (ja) * | 2009-08-04 | 2013-12-18 | 株式会社リコー | 撮像装置及び画像共有システム |
TWI423112B (zh) * | 2009-12-09 | 2014-01-11 | Ind Tech Res Inst | 可攜式虛擬輸入操作裝置與其操作方法 |
US9152650B1 (en) | 2010-03-31 | 2015-10-06 | Emc Corporation | Mobile device data recovery |
US8694597B1 (en) * | 2010-03-31 | 2014-04-08 | Emc Corporation | Mobile device group-based data sharing |
US9514089B1 (en) | 2010-03-31 | 2016-12-06 | EMC IP Holding Company LLC | Mobile device network data synchronization |
US8786698B2 (en) | 2010-09-23 | 2014-07-22 | Sony Computer Entertainment Inc. | Blow tracking user interface system and method |
US8638364B2 (en) | 2010-09-23 | 2014-01-28 | Sony Computer Entertainment Inc. | User interface system and method using thermal imaging |
JP5677002B2 (ja) * | 2010-09-28 | 2015-02-25 | キヤノン株式会社 | 映像制御装置、及び映像制御方法 |
CN102033918B (zh) * | 2010-12-07 | 2012-11-21 | 武汉若鱼网络科技有限公司 | 基于时间位置信息搜索目标用户的方法及系统 |
JP2012142825A (ja) * | 2011-01-05 | 2012-07-26 | Sony Corp | 情報処理装置、情報表示方法及びコンピュータプログラム |
US8621000B2 (en) * | 2011-03-23 | 2013-12-31 | Verizon Patent And Licensing Inc. | Region of interest streaming |
US9217867B2 (en) | 2011-03-24 | 2015-12-22 | Seiko Epson Corporation | Head-mounted display device and control method for the head-mounted display device |
JP2012203128A (ja) * | 2011-03-24 | 2012-10-22 | Seiko Epson Corp | 頭部装着型表示装置および頭部装着型表示装置の制御方法 |
US8818706B1 (en) | 2011-05-17 | 2014-08-26 | Google Inc. | Indoor localization and mapping |
US8164599B1 (en) | 2011-06-01 | 2012-04-24 | Google Inc. | Systems and methods for collecting and providing map images |
JP5860634B2 (ja) * | 2011-08-23 | 2016-02-16 | 任天堂株式会社 | 情報処理システム、情報処理方法、サーバプログラム、サーバ装置、および、サーバシステム |
US8941560B2 (en) | 2011-09-21 | 2015-01-27 | Google Inc. | Wearable computer with superimposed controls and instructions for external device |
TWI482108B (zh) * | 2011-12-29 | 2015-04-21 | Univ Nat Taiwan | To bring virtual social networks into real-life social systems and methods |
KR20130080186A (ko) * | 2012-01-04 | 2013-07-12 | 삼성전자주식회사 | 위치 정보 태깅을 위한 방법 및 장치 |
WO2013118535A1 (ja) * | 2012-02-06 | 2013-08-15 | ソニー株式会社 | 撮像制御装置、画像処理装置、撮像制御方法及び画像処理方法 |
JP5891843B2 (ja) * | 2012-02-24 | 2016-03-23 | ソニー株式会社 | クライアント端末、サーバ、およびプログラム |
US9170113B2 (en) | 2012-02-24 | 2015-10-27 | Google Inc. | System and method for mapping an indoor environment |
JP2013183433A (ja) * | 2012-03-05 | 2013-09-12 | Sony Corp | クライアント端末、サーバ、およびプログラム |
US9756138B2 (en) * | 2013-04-08 | 2017-09-05 | Here Global B.V. | Desktop application synchronization to process data captured on a mobile device |
FR3006455A1 (fr) * | 2013-05-31 | 2014-12-05 | Optinvent | Systeme de vision oculaire avec guide optique positionnable dans deux positions |
JP5836319B2 (ja) * | 2013-05-31 | 2015-12-24 | 京セラドキュメントソリューションズ株式会社 | 画像形成システム |
JP6299234B2 (ja) * | 2014-01-23 | 2018-03-28 | 富士通株式会社 | 表示制御方法、情報処理装置、および表示制御プログラム |
KR101580617B1 (ko) | 2014-06-05 | 2015-12-28 | 오병오 | 빛의 조사구간 선택이 용이한 실외용 엘이디등기구 |
DE102014213058A1 (de) * | 2014-07-04 | 2016-01-07 | Siemens Aktiengesellschaft | Verfahren zur Ausgabe von Fahrzeuginformation |
JP6070652B2 (ja) * | 2014-07-28 | 2017-02-01 | ヤマハ株式会社 | リファレンス表示装置およびプログラム |
CN104767816A (zh) * | 2015-04-15 | 2015-07-08 | 百度在线网络技术(北京)有限公司 | 拍摄信息采集方法、装置和终端 |
JP2015212828A (ja) * | 2015-06-17 | 2015-11-26 | セイコーエプソン株式会社 | 頭部装着型表示装置および頭部装着型表示装置の制御方法 |
WO2017073775A1 (ja) * | 2015-10-30 | 2017-05-04 | 誠 大島 | 自動撮像装置、および装着型撮像装置 |
US10733456B2 (en) | 2015-12-21 | 2020-08-04 | A9.Com, Inc. | Sharing video footage from audio/video recording and communication devices |
US10447963B2 (en) * | 2015-12-21 | 2019-10-15 | Amazon Technologies, Inc. | Sharing video footage from audio/video recording and communication devices |
CN108924451A (zh) * | 2017-05-09 | 2018-11-30 | 高德信息技术有限公司 | 一种图像获取方法及装置 |
CN110851643A (zh) * | 2019-10-28 | 2020-02-28 | 维沃移动通信有限公司 | 一种信息显示方法及电子设备 |
US20210398322A1 (en) * | 2020-06-17 | 2021-12-23 | Palantir Technologies Inc. | Approaches for compressing and distributing image data |
WO2023084810A1 (ja) * | 2021-11-10 | 2023-05-19 | 日本電信電話株式会社 | 情報処理装置、情報処理方法、及び、情報処理プログラム |
WO2024085353A1 (ko) * | 2022-10-20 | 2024-04-25 | 삼성전자주식회사 | 위치에 기반하여 카메라를 제어하여 위치에 대응하는 미디어를 획득하기 위한 전자 장치 및 그 방법 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003288357A (ja) * | 2002-03-28 | 2003-10-10 | Toshiba Corp | 情報共有システムと情報共有方法 |
JP2004537193A (ja) | 2001-05-10 | 2004-12-09 | インターナショナル・ビジネス・マシーンズ・コーポレーション | 記録されたラジオ番組またはテレビ番組をワールド・ワイド・ウェブ上の情報で拡張するシステムおよび方法 |
JP2004538679A (ja) | 2001-05-10 | 2004-12-24 | インターナショナル・ビジネス・マシーンズ・コーポレーション | 放送されたまたは記録されたラジオ番組またはテレビ番組をワールド・ワイド・ウェブ上の情報で拡張するシステムおよび方法 |
JP2004538681A (ja) | 2001-05-10 | 2004-12-24 | インターナショナル・ビジネス・マシーンズ・コーポレーション | 放送された番組をワールド・ワイド・ウェブ上の情報で拡張するシステムおよび方法 |
JP2005341604A (ja) | 1994-06-20 | 2005-12-08 | Toshiba Corp | 無線通信方法、無線通信基地局および無線端末 |
JP2005337863A (ja) | 2004-05-26 | 2005-12-08 | Fuji Photo Film Co Ltd | 地図更新システム |
Family Cites Families (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6853293B2 (en) * | 1993-05-28 | 2005-02-08 | Symbol Technologies, Inc. | Wearable communication system |
US7084903B2 (en) * | 1998-05-18 | 2006-08-01 | International Business Machines Corporation | Image capturing system and method for automatically watermarking recorded parameters for providing digital image verification |
US7107539B2 (en) | 1998-12-18 | 2006-09-12 | Tangis Corporation | Thematic response to a computer user's context, such as by a wearable personal computer |
US6791580B1 (en) * | 1998-12-18 | 2004-09-14 | Tangis Corporation | Supplying notifications related to supply and consumption of user context data |
JP2001034615A (ja) * | 1999-07-16 | 2001-02-09 | Nippon Telegr & Teleph Corp <Ntt> | 作業支援システム及び作業支援方法 |
JP2001050759A (ja) * | 1999-08-06 | 2001-02-23 | Nissan Motor Co Ltd | ナビゲーション装置 |
JP2001290820A (ja) * | 2000-01-31 | 2001-10-19 | Mitsubishi Electric Corp | 映像収集装置、映像検索装置および映像収集検索システム |
US20020010734A1 (en) * | 2000-02-03 | 2002-01-24 | Ebersole John Franklin | Internetworked augmented reality system and method |
GB0028804D0 (en) * | 2000-11-25 | 2001-01-10 | Hewlett Packard Co | Voice communication concerning a local entity |
US20030014212A1 (en) * | 2001-07-12 | 2003-01-16 | Ralston Stuart E. | Augmented vision system using wireless communications |
US6717737B1 (en) * | 2001-12-21 | 2004-04-06 | Kyle Haglund | Mobile imaging system |
EP1324274A3 (en) | 2001-12-28 | 2005-11-02 | Matsushita Electric Industrial Co., Ltd. | Vehicle information recording system |
JP2003244691A (ja) | 2002-02-20 | 2003-08-29 | Matsushita Electric Ind Co Ltd | 記憶支援システム |
JP3789842B2 (ja) | 2002-03-28 | 2006-06-28 | エヌ・ティ・ティ・コムウェア株式会社 | コンテンツのセキュリティ機能付き管理方法およびセキュリティ機能付きコンテンツ管理装置 |
JP4181372B2 (ja) * | 2002-09-27 | 2008-11-12 | 富士フイルム株式会社 | 表示装置、画像情報管理端末、画像情報管理システム、および画像表示方法 |
JP4508596B2 (ja) * | 2002-11-06 | 2010-07-21 | キヤノン株式会社 | 通信装置、画像記憶装置およびそれらの制御方法 |
TWI235588B (en) * | 2003-01-30 | 2005-07-01 | Casio Computer Co Ltd | Wrist-worn communications apparatus |
US20040174443A1 (en) * | 2003-03-07 | 2004-09-09 | Simske Steven J. | System and method for storing of records in a database |
WO2004107012A1 (ja) * | 2003-05-30 | 2004-12-09 | Vixen Co., Ltd. | 天体の自動導入装置 |
JP4192731B2 (ja) * | 2003-09-09 | 2008-12-10 | ソニー株式会社 | 案内情報提供装置及びプログラム |
US20050060299A1 (en) * | 2003-09-17 | 2005-03-17 | George Filley | Location-referenced photograph repository |
JP3944160B2 (ja) | 2003-12-25 | 2007-07-11 | キヤノン株式会社 | 撮像装置、情報処理装置、それらの制御方法及びプログラム |
JP2005252732A (ja) * | 2004-03-04 | 2005-09-15 | Olympus Corp | 撮像装置 |
US20060187317A1 (en) * | 2005-02-24 | 2006-08-24 | Memory Matrix, Inc. | Systems and methods for processing images with positional data |
US20060197781A1 (en) * | 2005-03-03 | 2006-09-07 | Arutunian Ethan B | System and method utilizing enhanced imagery and associated overlays |
US7848765B2 (en) * | 2005-05-27 | 2010-12-07 | Where, Inc. | Location-based services |
KR20080030086A (ko) * | 2005-07-07 | 2008-04-03 | 퀄컴 인코포레이티드 | 위치 정보 제공 방법 및 장치 |
US20070032945A1 (en) * | 2005-08-02 | 2007-02-08 | Jason Kaufman | System and method for providing location related information to a network user |
US7920854B2 (en) * | 2005-09-19 | 2011-04-05 | Silverbrook Research Pty Ltd | Printing a map using a mobile device |
US20070083329A1 (en) * | 2005-10-07 | 2007-04-12 | Wansoo Im | Location-based interactive web-based multi-user community site |
JP4371127B2 (ja) | 2006-07-14 | 2009-11-25 | ソニー株式会社 | 再生装置、再生方法、プログラム |
US20080082264A1 (en) * | 2006-09-11 | 2008-04-03 | Broadcom Corporation, A California Corporation | GPS route creation, photograph association, and data collection |
JP5092357B2 (ja) | 2006-11-07 | 2012-12-05 | ソニー株式会社 | 撮像表示装置、撮像表示方法 |
JP5023663B2 (ja) | 2006-11-07 | 2012-09-12 | ソニー株式会社 | 撮像装置、撮像方法 |
US9665597B2 (en) * | 2006-12-05 | 2017-05-30 | Qualcomm Incorporated | Method and system for processing images using time and location filters |
JP4961984B2 (ja) | 2006-12-07 | 2012-06-27 | ソニー株式会社 | 画像表示システム、表示装置、表示方法 |
JP4367663B2 (ja) | 2007-04-10 | 2009-11-18 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム |
-
2006
- 2006-12-07 JP JP2006330832A patent/JP5176311B2/ja not_active Expired - Fee Related
-
2007
- 2007-11-05 WO PCT/JP2007/071491 patent/WO2008068990A1/ja active Application Filing
- 2007-11-05 US US12/312,615 patent/US8473544B2/en not_active Expired - Fee Related
- 2007-11-05 EP EP07831223.8A patent/EP2090989B1/en not_active Not-in-force
- 2007-11-05 TW TW096141749A patent/TW200837713A/zh not_active IP Right Cessation
- 2007-11-05 BR BRPI0719940A patent/BRPI0719940B1/pt not_active IP Right Cessation
- 2007-11-05 CA CA2670744A patent/CA2670744C/en not_active Expired - Fee Related
- 2007-11-05 KR KR1020097011531A patent/KR101329419B1/ko active IP Right Grant
- 2007-11-05 CN CNA2007800451131A patent/CN101578602A/zh active Pending
- 2007-11-05 MY MYPI20091860 patent/MY151947A/en unknown
-
2013
- 2013-05-28 US US13/903,302 patent/US8949324B2/en not_active Expired - Fee Related
-
2014
- 2014-10-09 US US14/511,059 patent/US20150022427A1/en not_active Abandoned
Non-Patent Citations (3)
Title |
---|
ITO D.: "Chizu Joho to Kakucho Genjitsukan o Mochiita Ichi Izon Joho no Authoring", IEICE TECHNICAL REPORT, vol. 105, no. 566, 19 January 2006 (2006-01-19), pages 1 - 6, XP003023034 * |
KAMISAKA D.: "NAMBA Explorere: Gazo Kyoyu Kano na Real Time Shigaichi Joho Kyoyu System", INFORMATION PROCESSING SOCIETY OF JAPAN KENKYU HOKOKU, vol. 2004, no. 9, 30 January 2004 (2004-01-30), pages 49 - 54, XP003023033 * |
See also references of EP2090989A4 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2136556A1 (en) * | 2008-06-17 | 2009-12-23 | Samsung Electronics Co., Ltd. | Method and apparatus for recording and reproducing motion picture with location information |
CN101610376A (zh) * | 2008-06-17 | 2009-12-23 | 三星电子株式会社 | 用于记录和再现具有位置信息的运动图像的方法和设备 |
US8463101B2 (en) | 2008-06-17 | 2013-06-11 | Samsung Electronics Co., Ltd. | Method and apparatus for recording and reproducing motion picture with location information |
CN101610376B (zh) * | 2008-06-17 | 2016-08-10 | 三星电子株式会社 | 用于记录和再现具有位置信息的运动图像的方法和设备 |
Also Published As
Publication number | Publication date |
---|---|
JP5176311B2 (ja) | 2013-04-03 |
CA2670744C (en) | 2017-10-31 |
US8949324B2 (en) | 2015-02-03 |
TW200837713A (en) | 2008-09-16 |
BRPI0719940B1 (pt) | 2018-11-27 |
JP2008147864A (ja) | 2008-06-26 |
US20150022427A1 (en) | 2015-01-22 |
BRPI0719940A2 (pt) | 2014-03-11 |
US20130335352A1 (en) | 2013-12-19 |
KR20090086092A (ko) | 2009-08-10 |
US20100063997A1 (en) | 2010-03-11 |
EP2090989B1 (en) | 2016-01-27 |
EP2090989A1 (en) | 2009-08-19 |
EP2090989A4 (en) | 2011-11-02 |
TWI379284B (ja) | 2012-12-11 |
CA2670744A1 (en) | 2008-06-12 |
KR101329419B1 (ko) | 2013-11-14 |
MY151947A (en) | 2014-07-31 |
US8473544B2 (en) | 2013-06-25 |
CN101578602A (zh) | 2009-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5176311B2 (ja) | 画像表示システム、表示装置、表示方法 | |
JP4961984B2 (ja) | 画像表示システム、表示装置、表示方法 | |
US10187612B2 (en) | Display apparatus for displaying image data received from an image pickup apparatus attached to a moving body specified by specification information | |
JP2008227813A (ja) | 画像処理装置、画像処理方法、画像処理システム | |
JP5630513B2 (ja) | 表示装置、表示方法、画像表示システム | |
JP2013110764A (ja) | 撮像表示装置、撮像表示方法 | |
JP2013034237A (ja) | 画像処理装置、画像処理方法、画像処理システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200780045113.1 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07831223 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2907/DELNP/2009 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12312615 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2670744 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020097011531 Country of ref document: KR Ref document number: 2007831223 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: PI0719940 Country of ref document: BR Kind code of ref document: A2 Effective date: 20090605 |