WO2004111973A1 - Image server, image collection device, and image display terminal - Google Patents
- Publication number: WO2004111973A1 (application PCT/JP2004/008112)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- information
- photographing
- photographed
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
Definitions
- Image server, image collection device, and image display terminal
- the present invention relates to an image collection device that captures an image around a vehicle, an image server that distributes the captured image, and an image display terminal that displays an image distributed by the image server.
- there is a known service in which fixed-point cameras are installed on a road in order to convey traffic congestion conditions to people at remote locations, and images of the road captured by the cameras are distributed to those who request them.
- one such service, called MONET (registered trademark), distributes road conditions to mobile phone users as images.
- an object of the present invention is to perform efficient image collection or image distribution that reduces a sense of discomfort given to a user who views a captured image.
- an image server includes a storage medium for storing a photographed image of the periphery of a vehicle, taken by a camera mounted on the vehicle, together with the photographing information of that image, and distribution means for distributing the stored photographed image to an image display terminal. The photographing information includes information on the point where the image was photographed. For each section corresponding to the area to which the photographing point belongs, only the photographed images selected based on the photographing information and predetermined selection conditions are stored in the storage medium.
- because a photographed image is selected and stored according to a predetermined condition for each section, the stored photographed images are homogeneous across sections. This enables efficient image collection that reduces the sense of discomfort given to a user who views the captured images.
- the shooting information of a shot image may be created by the vehicle in which the image was shot, or may be created by another device.
- the presence or absence of rain can be inferred from the state of the wipers inside the vehicle, but more detailed weather information can be provided by obtaining it automatically from a separate weather server.
- the photographing information may include information on the traveling direction of the vehicle at the time of photographing, and the distribution unit may store in the storage medium, for each combination of the section to which the photographing location of the input photographed image belongs and the traveling direction, the photographed images selected based on the photographing information and predetermined selection conditions.
- the image server may also be configured with a storage medium for storing a photographed image of the periphery of the vehicle, taken by a camera mounted on the vehicle, together with its photographing information, and distribution means for distributing the photographed image to the image display terminal, where the photographing information includes information on the photographing point; the distribution means reads the photographed images stored in the storage medium and, for each section corresponding to the area to which the photographing point belongs, delivers to the image display terminal only the images selected based on the photographing information and predetermined selection conditions.
- because a photographed image is selected and distributed under predetermined conditions for each section, the distributed photographed images are homogeneous across sections. This enables efficient image distribution that reduces the sense of discomfort given to a user who views the captured images.
- the photographing information may include information on the traveling direction of the vehicle at the time of photographing, and the distribution unit may distribute to the image display terminal, for each combination of section and traveling direction, the photographed images selected from the storage medium based on the photographing information and predetermined selection conditions.
- the shooting information may include the shooting time of the image, and the selection condition may preferentially select the photographed image with the newest shooting time within the section.
- the photographing information may include the traveling speed of the vehicle at the time of photographing, and the selection condition may preferentially select the photographed image with the slowest traveling speed within the section.
- the shooting information may include the weather at the shooting location at the time of shooting, and the selection condition may preferentially select a photographed image taken where the weather was good within the section.
- the photographing information may include the inclination of the vehicle at the time of photographing, and the selection condition may preferentially select the photographed image with the smallest inclination angle within the section.
- the shooting information may include whether or not the vehicle was within an intersection at the time of shooting, and the selection condition may preferentially select photographed images taken while the vehicle was within an intersection.
- alternatively, the distribution unit can determine from the map information and the photographing information whether the vehicle was within an intersection at the time of shooting; photographed images taken while the vehicle was within an intersection can then be preferentially selected as the selection condition.
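As a minimal sketch of the per-section selection described above, the following Python keeps one representative image per geographic section. The dictionary keys (`time`, `speed`, `tilt`, `in_intersection`) and the priority order are illustrative assumptions, since the text lists the individual selection conditions without fixing how they combine.

```python
def select_representatives(images, section_of):
    """Keep one representative photographed image per section.

    `images` is a list of dicts with hypothetical keys 'position',
    'time', 'speed', 'tilt', and 'in_intersection'; `section_of`
    maps a position to the key of its geographic section.
    """
    best = {}
    for img in images:
        key = section_of(img["position"])
        if key not in best or _prefer(img, best[key]):
            best[key] = img
    return best

def _prefer(a, b):
    """Illustrative priority order: inside an intersection first, then
    the newest shooting time, then the slowest travel speed, then the
    smallest tilt angle (a lower-sorting tuple is preferred)."""
    def rank(img):
        return (not img["in_intersection"], -img["time"],
                img["speed"], abs(img["tilt"]))
    return rank(a) < rank(b)
```

With a grid-based `section_of` (for example, truncating coordinates to whole degrees), every stored section ends up with exactly one image chosen by the same rule, which is what makes the collection homogeneous across sections.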
- when the input captured image is an omnidirectional image, or an image obtained by expanding an omnidirectional image, the distribution unit may slide the display layout of the expanded image so that the displayed image faces a predetermined direction.
- the in-vehicle image collection device includes photographing means for photographing an image around the vehicle, current position specifying means for specifying the current position of the vehicle, inclination specifying means for specifying the amount of inclination of the vehicle with respect to the horizontal plane, and means for transmitting to the image server described above the captured image together with the current position and the inclination of the vehicle at the time the image was captured.
- the in-vehicle image collection device can transmit information on the inclination of the vehicle to the image server, so that the image server can select a captured image based on the inclination. Therefore, the in-vehicle image collection device can perform efficient image collection that reduces a sense of discomfort given to a user who views a captured image.
- the in-vehicle image collection device may instead include a photographing unit that photographs an image around the vehicle, a current position specifying unit that specifies the current position of the vehicle, and an intersection determination unit that determines whether the vehicle is within an intersection, and may transmit to the image server the photographed image together with the current position of the vehicle and whether the vehicle was within an intersection at the time of photographing.
- because the in-vehicle image collection device can transmit to the image server information about whether the vehicle was within an intersection, the image server can select captured images based on that information. The device thus enables efficient image collection that reduces the sense of discomfort given to a user who views the captured images.
- the in-vehicle image collection device may include a photographing unit that photographs an image around the vehicle, a current position specifying unit that specifies the current position of the vehicle, a traveling speed specifying unit that specifies the traveling speed of the vehicle, and a storage medium for storing the photographed image together with the current position of the vehicle at the time of photographing, providing this information to the image server. The device may be configured to determine the repetition interval of image capture based on the traveling speed.
- the in-vehicle image collection device can perform an efficient image collection that reduces a sense of discomfort given to a user who views a captured image.
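A minimal sketch of determining the capture repetition interval from the traveling speed: keeping a roughly constant distance between shots means waiting longer at low speed, which avoids capturing near-duplicate images of the same point. The target spacing and clamping bounds below are illustrative assumptions.

```python
def capture_interval_s(speed_kmh, target_spacing_m=50.0,
                       min_interval_s=1.0, max_interval_s=30.0):
    """Return the time to wait between shots so that images are taken
    roughly every `target_spacing_m` metres of travel; slower travel
    yields a longer wait, clamped to [min_interval_s, max_interval_s]."""
    speed_ms = speed_kmh / 3.6
    if speed_ms <= 0:
        return max_interval_s  # vehicle stopped: wait the maximum
    return min(max_interval_s, max(min_interval_s, target_spacing_m / speed_ms))
```

For example, at 36 km/h (10 m/s) the sketch yields a 5 second interval, while a stopped vehicle waits the full maximum, consistent with not photographing the same point repeatedly while stationary.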
- the in-vehicle image collection device may include photographing means for photographing an image around the vehicle, current position specifying means for specifying the current position of the vehicle, and means for transmitting the photographed image and its photographing information to the image server. In this case, the distribution means preferably corrects the information of the shooting location; the correction can be performed using map matching. With this correction, the image server can perform image collection and distribution that reduce the discomfort given to a user who views the photographed images.
- the image display terminal includes a receiving unit that receives a photographed image and its photographing information distributed from an image server, which distributes photographed images of the periphery of a vehicle taken by a camera mounted on the vehicle together with their photographing information, and a display control unit that displays the received captured image on the display means.
- the captured information includes information on a capturing location where the captured image is captured.
- a photographic image taken in a certain search area is searched, and a photographic image to be displayed next on the display means is selected from the searched photographic images based on a predetermined selection rule.
- the selection rule may preferentially select the photographed image with the newest shooting time within the search area.
- the selection rule may be a rule that preferentially selects a captured image having the slowest traveling speed of the vehicle at the time of capturing in the search area.
- the photographing information may include the shooting time, and the selection rule may preferentially select the photographed image whose shooting time is closest to that of the image currently displayed on the display means.
- if the search finds no captured image in the search area, the display control means may display on the display means a message indicating that the user cannot proceed.
- the display control means may search, for each of a plurality of directions, for captured images taken within an area whose direction and distance from the point where the currently displayed image was captured fall within predetermined ranges; if the search finds a captured image in such a search area, a display indicating that the user can proceed in that direction may be shown on the display means together with the captured image.
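The direction-area search described above might be sketched as follows, assuming positions are (latitude, longitude) pairs; the distance threshold, sector width, and the four candidate directions are illustrative assumptions.

```python
import math

def reachable_directions(current_pos, current_heading, images,
                         max_dist_m=100.0, sector_deg=45.0,
                         directions=(0.0, 90.0, 180.0, 270.0)):
    """For each candidate direction (relative to the current forward
    direction), report whether some photographed image lies within
    `max_dist_m` and within +/- sector_deg/2 of that direction."""
    result = {}
    for d in directions:
        bearing = (current_heading + d) % 360.0
        result[d] = any(
            _dist_m(current_pos, img["position"]) <= max_dist_m
            and _angle_diff(bearing, _bearing_deg(current_pos, img["position"])) <= sector_deg / 2
            for img in images)
    return result

def _dist_m(a, b):
    # crude equirectangular approximation, adequate at street scale
    lat = math.radians((a[0] + b[0]) / 2)
    dy = (b[0] - a[0]) * 111_320.0
    dx = (b[1] - a[1]) * 111_320.0 * math.cos(lat)
    return math.hypot(dx, dy)

def _bearing_deg(a, b):
    # bearing from a to b, measured clockwise from north
    lat = math.radians((a[0] + b[0]) / 2)
    dy = b[0] - a[0]
    dx = (b[1] - a[1]) * math.cos(lat)
    return math.degrees(math.atan2(dx, dy)) % 360.0

def _angle_diff(x, y):
    return abs((x - y + 180.0) % 360.0 - 180.0)
```

Each direction for which the result is true would then be shown as a "can proceed" indicator alongside the current image.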
- the photographing information may include the traveling direction of the vehicle at the time of photographing, and the display control means may display the next captured image with either that traveling direction or its reverse as the next forward direction, based on the relationship between the traveling direction at the time the next image was captured and the current forward direction.
- the photographing information may include the inclination of the vehicle at the time of photographing, and the display control means may correct the captured image based on the inclination information so as to reduce the apparent tilt of buildings and other objects in the image, and display the corrected image on the display unit. The display range of the corrected captured image may be narrowed so that regions lacking image information in the corrected image are not displayed.
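Narrowing the display range after tilt correction can be sketched as pure geometry: rotating the image to make buildings upright leaves empty corners, and the largest axis-aligned rectangle inscribed in the rotated frame gives a crop that hides them. This is a standard construction offered as an illustration, not the specific method of the text.

```python
import math

def tilt_corrected_view(width, height, tilt_deg):
    """After rotating a width x height image by -tilt_deg to correct the
    vehicle's tilt, return the (w, h) of the largest centred axis-aligned
    rectangle that contains no empty (missing-information) pixels."""
    a = abs(math.radians(tilt_deg)) % math.pi
    if a > math.pi / 2:
        a = math.pi - a
    if width <= 0 or height <= 0 or a == 0:
        return width, height
    long_side, short_side = max(width, height), min(width, height)
    sin_a, cos_a = math.sin(a), math.cos(a)
    if short_side <= 2.0 * sin_a * cos_a * long_side:
        # large rotation: the crop touches two opposite edges only
        x = 0.5 * short_side
        wr, hr = (x / sin_a, x / cos_a) if width >= height else (x / cos_a, x / sin_a)
    else:
        cos_2a = cos_a * cos_a - sin_a * sin_a
        wr = (width * cos_a - height * sin_a) / cos_2a
        hr = (height * cos_a - width * sin_a) / cos_2a
    return wr, hr
```

For small tilt angles the crop is only slightly smaller than the original frame, so the narrowed display range removes the blank corners at little cost in field of view.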
- FIG. 1 is a conceptual diagram schematically showing an image distribution system according to a first embodiment of the present invention.
- FIG. 2 is a block diagram of a configuration of a device mounted in the vehicle 1 for capturing an image and transmitting the image to an image server 5.
- FIG. 3 is a diagram of an example of photographing information of an image created by an information processing unit 13.
- FIG. 4 is a flowchart of a process in which an information processing unit 13 captures an image and transmits the image to an image server 5.
- FIG. 5 is a block diagram showing a configuration of an image distribution system including each device of the vehicle 1.
- FIG. 6 is a timing chart of communication between the image server 5 and the vehicle 1 and the fixed terminal 36 and the in-vehicle terminal 37.
- FIG. 7 is a conceptual diagram of a process performed by an image conversion program 58 of the image server 5.
- FIG. 8 is a conceptual diagram showing a method of selecting an image of an information processing program 54 by reading a database.
- FIG. 9 is a diagram showing a weather server 9 communicating with the image server 5.
- FIG. 10 is a flowchart of the corresponding processing part.
- FIG. 11 is a diagram showing an example of Web data from the image server 5 displayed using a Web browser on the user side.
- FIG. 12 is a diagram showing a configuration of a vehicle-mounted device of a fixed terminal 36 and an in-vehicle terminal 37 in the present embodiment.
- FIG. 13 is a diagram showing a display screen of information to be displayed on the user interface unit 77.
- FIG. 14 is a flowchart showing in more detail a portion related to a case where the up button 84 is pressed in the image display program.
- FIG. 15 is a conceptual diagram of a typical case where the upper button 84 is pressed.
- FIG. 16 is an explanatory diagram of a proximity point search algorithm.
- FIG. 17 is a view showing a stop mark 90 displayed in the image display section 81.
- FIG. 18 is a view showing a guidance arrow 91 displayed in the image display section 81.
- FIG. 19 shows a case where the effect of the third embodiment is characteristically exhibited.
- FIG. 20 is a flowchart of a process for setting the front direction of an image to be displayed next to the direction at the time of capturing the image or the opposite direction.
- FIG. 21 is a schematic diagram illustrating a characteristic portion of a fourth embodiment.
- FIG. 22 is a flowchart of a process of reading a database from the information processing program 54 using a filtering condition.
- FIG. 23 is an explanatory diagram of the operation of a database readout / information processing program 54 when distributing images stored in both directions in one area to a user.
- FIG. 24 is a flowchart of a process performed by the information processing program 54 for reading out a database for determining which direction of image information to return based on information on the front direction.
- FIG. 25 is a flowchart of a process of the information processing unit 13 for not photographing the point twice or more when it can be considered that the own vehicle stops.
- FIG. 26 is a reference diagram for explaining that the shooting interval is lengthened when the speed is low.
- FIG. 27 is a reference diagram for explaining that it is determined that the vehicle is at the intersection when the brake is depressed, the blinker blinks, and the accelerator is depressed in this order.
- FIG. 28 is a view showing a vehicle 1 traveling on an inclined road.
- FIG. 29 is a schematic diagram illustrating a correction process based on the inclination of the vehicle 1.
- FIG. 30 is an overall view of an example of a system for detailed photography at an important point.
- FIG. 31 is a reference diagram for explaining a case in which a determination as to whether or not approaching an important point is made by determining whether or not a point on a road closest to a designated important point is approached.
- FIG. 32 is a flowchart showing a process for map matching when GPS information cannot be obtained.
- FIG. 1 is a conceptual diagram schematically showing an image distribution system according to the first embodiment of the present invention.
- Vehicle 1 (probe car) traveling on road 2 has an omnidirectional imaging function that can capture 360 ° around the vehicle and a communication function that wirelessly communicates with base station 3.
- the vehicle 1 transmits the photographed image 6 taken by the omnidirectional photographing function, together with photographing information including the position of the vehicle when the image was taken, the photographing time, the traveling direction (azimuth) of the vehicle at the time of photographing, and the traveling speed of the vehicle 1 at the time of photographing, to the image server 5 via the base station 3 and the base station network 4 to which the base station 3 is connected.
- the shooting information of an image is information on a shooting situation when the image was shot.
- the photographed image 6 is an image in a form as if the surroundings in all directions were seen from the vehicle 1 through a fisheye lens. Because it is distorted relative to the actual scenery, this image is very difficult for humans to view.
- the shooting location is a concept including both the position of the vehicle 1 when the shot image 6 is shot and the position of the shooting target when the shot image 6 is shot.
- the latter position of the object to be photographed may be, for example, road 2 immediately before vehicle 1, or may be a distant landscape directly visible from road 2.
- the image server 5, having received the photographed image 6 and its photographing information, distributes a distribution image 7 and the imaging information of the distribution image 7 to a personal computer 61 and an in-vehicle device 62, which connect to the image server 5 via the wide area network 8 and request image distribution.
- the imaging information distributed by the image server 5 includes information on the position of the imaging target.
- the distribution image 7 is a developed image in which the distortion of the captured image 6 has been corrected so that it is easy for a human to see.
- the conversion of the captured image 6 into the distribution image 7 is referred to as development.
- the vehicle 1 receives the designation of the timing of performing the image capturing from the image server 5 by wireless communication, and performs the image capturing at the specified timing.
- here, the timing at which shooting is performed refers to the position of the vehicle 1 at the time of shooting: specifying that shooting occurs when vehicle 1 reaches a certain position is the same as specifying the time at which vehicle 1 reaches that position.
- FIG. 2 is a block diagram of a device mounted in the vehicle 1 for capturing an image and transmitting the image to the image server 5.
- the devices in the vehicle 1 include an image control computer 10, an omnidirectional camera 21, a GPS sensor 22, a direction sensor 23, a speed sensor 24, a wireless device 25, and an antenna 26.
- the omnidirectional camera 21 is a camera that captures the 360-degree omnidirectional scenery at once and continuously outputs the received omnidirectional video.
- the omnidirectional camera described in Japanese Patent No. 2939087 can be used.
- the GPS sensor 22 receives information from GPS satellites, calculates the current position, traveling speed, and traveling direction of the vehicle 1, and outputs this as current position information.
- the GPS sensor 22 also outputs information indicating whether or not the current position information can be output.
- the direction sensor 23 is a device that detects the current direction of the vehicle 1 such as north, south, east, west or the like using a gyroscope and outputs this as direction information.
- the speed sensor 24 detects the number of rotations of the tires of the vehicle 1, calculates the running speed of the vehicle 1 from the number of rotations and the elapsed time, and outputs this as speed information.
- the wireless device 25 converts communication data into a wireless signal, and transmits the wireless signal from the antenna 26 to a base station corresponding to the communication system of the wireless device 25.
- the communication system of the wireless device 25 includes, for example, a PDC, a wireless LAN, and the like.
- the image control computer 10 includes a sensor control unit 11, an image recording unit 12, an information processing unit 13, and a communication control unit 14.
- the sensor control unit 11 receives the current position information, direction information, and speed information of the vehicle 1 from the GPS sensor 22, the direction sensor 23, and the speed sensor 24, respectively, and outputs the received information to the information processing unit 13.
- the sensor control unit 11 receives and outputs this information periodically (about every second in the present embodiment), and also does so upon receiving a control signal requesting the information from the information processing unit 13.
- the image recording unit 12 receives the omnidirectional video from the omnidirectional camera 21 and, upon receiving a control signal for image recording from the information processing unit 13, records the received omnidirectional video as an image and outputs it to the information processing unit 13.
- the information processing unit 13 receives from the communication control unit 14 a designation of the position of the vehicle 1 at the time of shooting, and saves the designation information in a memory 15 such as a hard disk drive (HDD).
- in the memory 15, areas for an information management DB (database) 16, a schedule DB 17, and a map DB 18 are secured.
- the information management DB 16 is an area for storing images recorded in the image recording unit 12 by the information processing unit 13 and shooting information of the images, which will be described later.
- the schedule DB 17 is an area for storing information of a shooting schedule such as a designation of a position of a vehicle at the time of shooting, a shooting timing, a shooting position, and the like. Note that the photographing schedule may be determined in advance.
- the map DB 18 stores map data for later-described map matching and map display.
- the information processing unit 13 periodically receives the current position information from the GPS sensor 22 via the sensor control unit 11, reads the designation information stored in the schedule DB 17 from the memory 15, and compares the current position of the vehicle 1 with the designated position. When the vehicle 1 reaches a designated position, the information processing section 13 outputs a control signal for image recording to the image recording section 12 and receives from it an image captured by the omnidirectional camera 21.
- at the time of the above photographing, the information processing section 13 outputs a control signal requesting the current position information, direction information, and speed information to the sensor control section 11, and receives these from the sensor control section 11 in response. Based on the received information, it creates photographing information for the image, including the time of shooting, current position information, direction information, and speed information, and stores it in the information management DB 16.
- FIG. 3 shows an example of photographing information of an image created by the information processing section 13.
- this photographing information is described in XML format, and consists of tags (character strings in angle brackets indicating the attributes of data) together with the data and attributes qualified by those tags.
- the value of the name parameter "cap0001.jpg" in the <file> tag indicates the file name of the corresponding image.
- the current position information, consisting of the longitude stored between <longitude> and </longitude> and the latitude stored between <latitude> and </latitude>, is included. A map code can also be included.
- Figure 3 shows that the current location information when the image was taken is an east longitude of 137°…′41.981998″ and a north latitude of 35°…′13.692000″.
- speed information is shown between <speed> and </speed>.
- FIG. 3 shows that the speed of the vehicle 1 when photographed was 4.444800 km/h.
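A record like the one in Figure 3 could be parsed with a standard XML library. The tag names below (`<file>`, `<longitude>`, `<latitude>`, `<speed>`) follow the fragments quoted above, while the `<image>` wrapper element and the sample values are assumptions for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical record in the style of Figure 3; the wrapper element
# and numeric values are illustrative, not taken from the patent.
SAMPLE = """
<image>
  <file name="cap0001.jpg"/>
  <longitude>137.7</longitude>
  <latitude>35.1</latitude>
  <speed>4.444800</speed>
</image>
"""

def parse_shooting_info(xml_text):
    """Extract the file name, position, and speed from one record."""
    root = ET.fromstring(xml_text)
    return {
        "file": root.find("file").get("name"),
        "longitude": float(root.find("longitude").text),
        "latitude": float(root.find("latitude").text),
        "speed_kmh": float(root.find("speed").text),
    }
```

The server side could use such a parser to index each received archive by position and speed before applying the per-section selection conditions.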
- the information processing unit 13 transmits the captured image and its photographing information to the communication control unit 14 as communication data. Note that the information processing unit 13 also includes in the communication data information designating that the communication control unit 14 transmit it via the wireless device 25.
- the communication control unit 14 is a device that controls the radio device 25, connects to the base station and the base station network, and performs data communication with the image server 5.
- by this data communication, the communication control unit 14 transmits to the image server 5 the communication data received from the information processing unit 13 for wireless transmission, such as an image and its photographing information.
- the communication control unit 14 also receives, through the wireless device 25, communication data converted from the wireless signal that the image server 5 sends via the base station, and outputs it to the information processing unit 13.
- the wireless signal from the image server 5 is, for example, a wireless signal for specifying the timing at which the image control computer 10 performs shooting.
- FIG. 4 is a flowchart showing a process performed by the information processing unit 13 of the vehicle 1 configured as described above to capture an image with the omnidirectional camera 21 and transmit the captured image to the image server 5.
- prior to this processing, the information processing unit 13 receives from the communication control unit 14 the designation, sent from the image server 5, of the position of the vehicle 1 at which shooting is to occur, and stores it in the memory 15 (step 400).
- as a periodic process, the information processing unit 13 acquires the current position information from the GPS sensor 22 via the sensor control unit 11 (step 401).
- the information processing section 13 reads the specified information stored in the schedule DB 17 from the memory 15 (step 405).
- in step 410, the information processing section 13 compares the current position of the vehicle 1 acquired in steps 401 and 405 with the designated shooting position, and determines whether to start photographing with the omnidirectional camera 21 by checking whether the vehicle has come within a predetermined distance of the designated position. If a photographing time is also specified in the memory, it determines whether to start photographing by checking whether the current time has reached the specified time. This determination is repeated until it is determined that photographing should start.
- when it determines that photographing should start, the information processing unit 13 starts photographing with the omnidirectional camera 21, and at this time receives the current position information, direction information, speed information, and the like from the sensor control unit 11 (step 415).
- the information processing unit 13 creates photographing information for the image from the current position information, traveling direction information, speed information, and the like received from the sensor control unit 11 (step 425), compiles the captured image and its photographing information into an archive (step 430), and creates communication data so that the image and its photographing information are transmitted to the image server 5 via the wireless device 25, passing the data to the communication control unit 14 (step 435). The process then returns to step 401.
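The loop of steps 400 to 435 can be sketched as follows; `get_position`, `take_photo`, and `send` stand in for the sensor control unit, image recording unit, and communication control unit, and are hypothetical callables. Positions are treated as planar coordinates in metres for simplicity.

```python
import math
import time

def capture_loop(get_position, designated_points, take_photo, send,
                 trigger_dist_m=30.0, poll_s=1.0):
    """Sketch of steps 401-435: poll the current position, and when the
    vehicle comes within trigger_dist_m of a designated shooting point,
    photograph, attach shooting information, and transmit the archive."""
    while designated_points:
        pos = get_position()                          # step 401
        target = min(designated_points,
                     key=lambda p: math.dist(p, pos)) # step 405
        if math.dist(target, pos) <= trigger_dist_m:  # step 410
            image = take_photo()                      # step 415
            info = {"position": pos,
                    "time": time.time()}              # step 425
            send({"image": image, "info": info})      # steps 430-435
            designated_points.remove(target)
        else:
            time.sleep(poll_s)
```

Each designated point is shot once and removed, mirroring the flow in which the determination of step 410 repeats until photographing should start.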
- an image captured by the vehicle 1 is transmitted to the image server 5 based on the designation from the image server 5 or the setting at the time of installation.
- FIG. 5 shows the configuration of an image distribution system including the devices of the vehicle 1 described above.
- the image distribution system comprises a radio base station 31 that communicates with the antenna 26 of the vehicle 1, a base station network 33 (a wired network in which that base station participates) to which the image server 5 is connected, a fixed terminal 36 connected to the image server 5 via a base station network 34, a base station 35, and an in-vehicle terminal 37 that communicates wirelessly with the base station 35.
- the fixed terminal 36 is a terminal installed in an office or home, and this terminal can request and receive an image and image photographing information with the image server 5 via the base station network 34. It is like that.
- The in-vehicle terminal 37 is a terminal installed in a vehicle; this terminal can request and receive images and image photographing information from the image server 5 via the base station 35 and the base station network 34.
- The image server 5 is a device that receives and stores images and image photographing information from the vehicle 1 via the base station network 33 and the radio base station 31. Further, the image server 5 communicates with the fixed terminal 36 via the base station network 34 and with the in-vehicle terminal 37 via the base station network 34 and the base station 35; it receives from the fixed terminal 36 and the in-vehicle terminal 37 list inquiries about the stored images and their photographing information, and returns the list to them based on the inquiry. Further, the image server 5 receives from the fixed terminal 36 or the in-vehicle terminal 37 a signal requesting distribution of an image at a designated shooting location and of the photographing information of that image.
- The image server 5 distributes the image and its photographing information, including the shooting location information, in response to the request.
- The image server 5 has a processing device (not shown) for performing such processing, and the processing device executes programs having various functions. The programs are executed in parallel, and data is transferred between them.
- When a plurality of images are photographed within one area, the image server 5 may select only a representative one of them; only the selected image is stored and used for distribution.
- Here, an area is one of the regions into which the map is geographically divided.
- The image server 5 includes a large-capacity storage device (not shown) having databases in which information used for such processing is recorded. Specifically, it is provided with an HDD having a photographed image database 51, a map information database 52, and an operation route database 53.
- The photographed image database 51 records the images received from the vehicle 1 and their photographing information. These images and their photographing information are stored, organized by the time or place at which they were photographed. Note that the images recorded in the photographed image database 51 are developed from the captured image 6 into the distribution image 7 by the processing described later.
- The map information database 52 records map information including place names, road positions, lane shapes, and building information.
- Building information includes the name of a building, such as the name of a parking lot or store, the name of the building's owner, its telephone number, e-mail address, and home page address, the building's location code, and building-specific information such as advertisements and catch phrases.
- The operation route database 53 records, for vehicles that operate according to a predetermined route and a predetermined schedule, such as regular route buses or the regular delivery trucks of a transportation company, an operation schedule associating times with traveling positions. If such vehicles, whose operation schedules are fixed, are used for photographing, the schedule becomes a criterion for selecting a vehicle to photograph a requested image, so that image photographing and distribution can be performed efficiently.
- The above-mentioned programs use these databases.
- Specifically, the image server 5 has a database read-out/information processing program 54, a photographing instruction transmitting program 55, a photographing information processing program 56, a user data distribution program 57, an image conversion program 58, and a vehicle data reception program 59.
- FIG. 6 is a timing chart, based on the communication among the vehicle 1, the fixed terminal 36, and the in-vehicle terminal 37, of processing in which the image server 5 stores the image and image photographing information received from the vehicle 1 and distributes them to the fixed terminal 36 and the in-vehicle terminal 37 in response to their distribution requests.
- The operation of these programs will be described in the order of the processing by which the image server 5 receives a request for distribution of an image and its photographing information from the fixed terminal 36 or the in-vehicle terminal 37 and distributes the image and its photographing information in response; this will clarify the operation of each program and of the image server 5 as a whole.
- In the vehicle 1, settings for periodically photographing images and for transmitting the photographed images and their photographing information to the image server 5 are made by a system administrator or the like when the equipment is installed (step 505). This setting is performed by recording the specified locations and times in the memory of the vehicle 1. Then, the information processing section 13 periodically captures images according to this setting, and transmits the captured images and the photographing information created for them to the image server 5 via the radio base station 31 and the base station network 33 (step 510).
- The vehicle data reception program 59 of the image server 5 receives the transmitted image and its photographing information. Upon receiving them (step 515), the vehicle data reception program 59 passes the received image to the image conversion program 58 and passes the photographing information of the received image to the photographing information processing program 56.
- The image conversion program 58 is a program that receives an image from the vehicle data reception program 59, expands and corrects the image, and transmits the expanded image to the database read-out/information processing program 54.
- Fig. 7 shows a conceptual diagram of the processing performed by the image conversion program 58.
- The image captured by the information processing section 13 and received by the vehicle data reception program 59 is an omnidirectional image such as the image 63 in FIG. 7.
- Such an omnidirectional image distorts the actual scene, as described above, and is difficult for humans to view and handle. Therefore, in order to make it easier to handle visually, the image is converted into an image whose aspect ratio corresponds to the actual scenery, such as the image 64 in FIG. 7. This transformation of the image is called image expansion.
- This development is performed so that the portion of the received image corresponding to the front direction of the vehicle is centered. Consequently, the direction at the center of the developed image 65 differs depending on the orientation of the vehicle 1 when the image was captured. Therefore, the image conversion program 58 acquires the photographing information of the image from the photographed image database 51 or the photographing information processing program 56 and, based on the orientation of the vehicle 1 at the time of capture, applies a correction that slides the image arrangement so that the display direction of the developed image is north. That is, processing for aligning the display directions of the captured images is performed (corresponding to the conversion from image 65 to image 66).
- Here, the image display direction refers to the direction at the center of the image display.
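- The north-alignment correction above can be sketched as follows (a minimal sketch, assuming the developed panorama is represented as a list of pixel columns covering 360° with column 0 at the vehicle's front, and that the photographing information gives the vehicle's heading as a clockwise angle from north; the function name and data model are illustrative, not from the patent):

```python
def align_panorama_north(columns, heading_deg):
    """Slide a developed panorama so that column 0 corresponds to north.

    `columns` is a list of pixel columns spanning 360 degrees, column 0
    showing the vehicle's front; `heading_deg` is the vehicle's clockwise
    angle from north at the time of shooting (assumed model).
    """
    n = len(columns)
    deg_per_col = 360.0 / n
    # Column i shows direction (heading_deg + i * deg_per_col) mod 360,
    # so the column showing north sits at angular offset (360 - heading) mod 360.
    shift = round(((360.0 - heading_deg) % 360.0) / deg_per_col) % n
    return columns[shift:] + columns[:shift]
```

With a heading of 90° (east), the column a quarter-turn counterclockwise around the panorama becomes the new center reference, matching the image 65 to image 66 conversion described above.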
- The photographing information processing program 56 receives the photographing information of the image from the vehicle data reception program 59, performs predetermined processing on the photographing information, and transmits the processed result to the database read-out/information processing program 54.
- The predetermined processing means, for example, converting photographing information described in the XML format as shown in FIG. 3 into a format such as a data table.
- The database read-out/information processing program 54 receives the developed image from the image conversion program 58 and, at the same time, receives the photographing information of the image that has undergone the predetermined processing from the photographing information processing program 56. It then associates the received image with its photographing information, and stores the image and its photographing information in the photographed image database 51, organized by the time or place at which the image was photographed. The shooting time and shooting location of the image are determined by referring to the photographing information of the image.
- When the image data obtained from the vehicle 1 or the like includes a plurality of images taken within one predetermined area, the database read-out/information processing program 54 selects only one representative image among them and stores it in the photographed image database 51.
- Fig. 8 shows a conceptual diagram of this selection method.
- In FIG. 8, the area 65 is divided into the minimum areas of the map code (registered trademark) (areas AA, AB, BA, BB).
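- A hypothetical stand-in for bucketing shooting points into such minimum areas can be sketched as follows (the actual map code scheme is a proprietary encoding; this sketch simply assigns a grid-cell key from latitude and longitude, with an assumed cell size, so that images sharing a key belong to the same area):

```python
import math

def area_key(lat_deg, lon_deg, cell_deg=0.01):
    """Return a grid-cell key; points with the same key share an area.

    `cell_deg` is an assumed cell size, not a value from the patent.
    """
    return (math.floor(lat_deg / cell_deg), math.floor(lon_deg / cell_deg))
```

Two shooting points a few hundred meters apart then map to the same key and can be filtered against each other, while more distant points fall into different areas.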
- As the selection condition (hereinafter referred to as the filtering condition), for example, of image A and image B, the one with the newer shooting time is stored in the photographed image database 51, and the other is discarded.
- In this way, the filtering condition is used to store only one image per area.
- As a result, the shooting times of the images delivered from the image server 5 are close to each other, and the sense of discomfort during continuous display is reduced.
- As the filtering condition, a condition that preferentially stores an image photographed while the speed of the vehicle 1 was low, or a condition that preferentially stores an image photographed while the inclination of the vehicle 1 was small, may also be used.
- The determination of the filtering condition described above is performed using the photographing information of the acquired images.
- Alternatively, a condition that preferentially stores an image photographed in fine weather at the shooting location may be used.
- In this case, weather information at the time of shooting for the place where the image was photographed is acquired from the weather server 9, which is connected to the wide area network 8 and distributes weather information for each place, and this information is included in the photographing information and stored in the photographed image database 51.
- Alternatively, the information processing unit 13 of the vehicle 1 may detect the operation of the wiper (not shown): if the wiper is activated, it determines that it is raining, and otherwise that it is not raining, and transmits the determination result to the image server 5 together with the captured image as part of the photographing information.
- FIG. 10 is a flowchart showing the part of the processing performed by the database read-out/information processing program 54 that uses the above-described filtering conditions.
- In step 210, it is detected whether or not a photographed image has been input from the image conversion program 58.
- In step 220, if an input was detected in step 210, the process proceeds to step 230; otherwise, it returns to step 210.
- In step 230, based on the photographing information input from the photographing information processing program 56 for the input image, it is determined whether or not the area to which the shooting point of the image belongs is the same as an area containing the shooting point of image data already in the photographed image database 51. If it is the same, the process proceeds to step 250; otherwise, the process proceeds to step 240.
- In step 240, the image is stored in the photographed image database 51, and the process returns to step 210.
- In step 250, of the images whose shooting-position area is the same as that of the input image, one is selected using the above-described filtering conditions, and only the selected one is stored in the photographed image database 51. Then, the process returns to step 210. Through the above processing, selection based on the filtering condition is realized.
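- The flow of steps 210 through 250 can be sketched as follows (a minimal sketch, assuming each image is represented as a dict carrying its photographing information; the dict keys and the newest-shooting-time rule are illustrative choices among the filtering conditions named above):

```python
def store_with_filtering(database, image):
    """Keep at most one image per area, preferring the newer shooting time.

    `database` maps an area key to the single stored image for that area;
    `image` is a dict with (assumed) keys 'area' and 'time'.
    """
    existing = database.get(image['area'])
    if existing is None:
        # Step 240: first image for this area, simply store it.
        database[image['area']] = image
    elif image['time'] > existing['time']:
        # Step 250: same area already occupied; keep the newer image,
        # the other is discarded.
        database[image['area']] = image
```

A lowest-vehicle-speed or smallest-inclination preference would replace only the comparison in the `elif` branch.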
- Further, the database read-out/information processing program 54 reads the map information from the operation route database 53 and sends the read map information to the photographing information processing program 56.
- To view the images, the user accesses the image server 5 via a network such as the base station network 34.
- Upon access, the user data distribution program 57 of the image server 5 requests the database read-out/information processing program 54 to transmit a list of the images in the photographed image database 51 and the map information in the map information database 52.
- The database read-out/information processing program 54 searches the photographed image database 51 to create this list, reads the map information from the map information database 52, and transmits the created list and the read map information to the user data distribution program 57.
- The user data distribution program 57 combines the list and the map information received from the database read-out/information processing program 54, plots the photographed points on the map, adds the shooting times, and transmits the resulting diagram data to the user as Web data.
- The processing up to this point, performed after receiving access from the user, is the processing of the image server 5 in step 525.
- The user can browse this Web data using a Web browser (step 528).
- The user who has received this Web data specifies the shooting location of the image he or she wishes to view, and transmits a request for distribution of the image and its photographing information.
- The user data distribution program 57 receives this distribution request and requests the database read-out/information processing program 54 to read out the specified image and its photographing information from the database.
- The database read-out/information processing program 54 receives this request and searches the photographed image database 51 for the requested image and its photographing information. If they are found, the database read-out/information processing program 54 reads the image and its photographing information from the photographed image database 51, reads from the map information database 52 the map information around the position described in the photographing information of this image, and transmits the image, its photographing information, and the map information to the user data distribution program 57.
- When the user data distribution program 57 receives this transmission, it distributes the data to the requesting user as Web data.
- The processing performed so far by the user data distribution program 57 and the database read-out/information processing program 54, from the reception of the request for distribution of the image and its photographing information, corresponds to step 535.
- FIG. 11 shows an example of this Web data displayed using a Web browser.
- A split-screen display using the multi-frame function of HTML is shown.
- The upper-right split screen is a panoramic display of part of a certain direction, at a certain moment, of an omnidirectional image requested by the user.
- The characters XX and ⁇ are displayed in the upper-right corner along with the image; XX and ⁇ are place names.
- The user data distribution program 57 realizes this by, at the stage of creating the Web data, associating the direction information included in the photographing information with this image to determine the directions of the roads extending from the intersection on this screen, and also by linking the vehicle position in the photographing information with the roads in the map information to determine where each road leads, reflecting this in the user interface.
- The split screen on the left side is a map indicating the position of the displayed image and the direction in which the displayed image is viewed.
- In the map, roads are shown as strips, and vehicle positions are shown as circles.
- the hatched area in the map indicates the area where the photographing data exists, that is, the area where the image is photographed by the vehicle.
- The split screen at the lower center displays, as text, the current position information and the shooting time from the photographing information of the image. This can be displayed by linking the position of the road in the map information with the vehicle position and shooting time in the photographing information.
- The split screen at the lower right contains four buttons that the user can operate to move forward in the front direction of the displayed image (that is, in the direction of the center of the displayed image), move backward, or rotate the viewpoint to the left or right.
- These operations can be realized using, for example, Java (registered trademark) or CGI.
- In addition, the images distributed from the image server 5 are selected according to the same filtering condition, so that the sense of discomfort when they are displayed continuously is reduced.
- As described above, the image server 5 receives, via the network, a request for distribution of an image at a shooting location specified by the user, and distributes the image to the user. The image control computer 10, upon receiving from the image server 5 the designation of a position to be photographed with the omnidirectional camera 21 mounted on the vehicle, performs photographing at the designated timing and transmits the photographed image and its photographing information to the image server 5 via the wireless device 25.
- The image server 5 receives, by wireless communication from the image control computer 10, the image photographed by the vehicle and its photographing information, and distributes the image and its photographing information to the user. For this reason, it is possible to upload images of the vehicle's surroundings to the image server 5, which distributes them to the fixed terminal 36, the in-vehicle terminal 37, and the like according to requests from users outside the vehicle.
- In the embodiment described above, the database read-out/information processing program 54 performs the selection based on the above-described filtering condition when saving the images transmitted from the vehicle 1 in the photographed image database 51.
- However, the selection of images using the filtering condition may instead be performed when an image stored in the photographed image database 51 is read out and passed to the user data distribution program 57.
- That is, the database read-out/information processing program 54 stores in the photographed image database 51 all the images and photographing information passed from the image conversion program 58 and the photographing information processing program 56.
- Then, when the user data distribution program 57 requests an image of a certain area, one of the images of that area stored in the photographed image database 51 is selected using the above filtering conditions, and the selected one is output to the user data distribution program 57.
- In other words, the image finally delivered to the fixed terminal 36 or the in-vehicle terminal 37 only needs to have been selected according to the filtering conditions, and the actual selection may take place either before or after storage in the photographed image database 51.
- In the present embodiment, the configuration of the fixed terminal 36 and the in-vehicle terminal 37 is the main part that differs from the first embodiment.
- FIG. 12 shows the configuration of the communication device of the fixed terminal 36 and the in-vehicle terminal 37 in the present embodiment.
- This communication device has a wireless antenna 71, a wireless device 72, a GPS sensor 74, and a control computer 70.
- The wireless device 72 uses the wireless antenna 71 to perform wireless transmission and reception of communication data via a wireless base station connected to the wide area network 8.
- the GPS sensor 74 is equivalent to the GPS sensor 22 shown in FIG.
- the control computer 70 has a sensor control unit 76, a communication control unit 73, an information processing unit 75, and a user interface unit 77.
- the sensor control unit 76 acquires the position information of the own vehicle by controlling the GPS sensor 74 periodically (for example, once every second), and stores the acquired position information in a buffer memory (not shown). The contents of the buffer memory are updated each time the GPS sensor 74 receives new position information.
- the communication control unit 73 controls the wireless device 72 to acquire the data of the wireless signal received by the wireless device 72, and outputs the data to the information processing unit 75.
- the communication control unit 73 outputs the data to the wireless device 72 and controls the wireless device 72 to wirelessly transmit the output data.
- the user interface unit 77 integrally includes an image display device, an audio output device, and an input device.
- The user interface unit 77 displays information to the user under the control of the information processing unit 75 and, when an input from the user is received, outputs it to the information processing unit 75.
- the information processing section 75 has a CPU (not shown) that operates by reading a program from a ROM (not shown) in the control computer 70 and executing the program.
- the CPU exchanges control signals and data with the sensor control unit 76, the communication control unit 73, and the user interface unit 77 in accordance with the instructions of the program.
- the program for operating the CPU will be described later. In the following, the operation of the CPU will be described as the operation of the information processing unit 75 unless otherwise specified.
- The fixed terminal 36 and the in-vehicle terminal 37 of the present embodiment display information on the user interface unit 77 based on the image and the photographing information transmitted from the image server 5 as Web data. FIG. 13 shows the display screen of this information.
- The display screen includes an image display unit 81 that displays the captured image transmitted from the image server 5, a map display unit 82 that displays a map together with a mark of the position of the image currently displayed on the image display unit 81, and an operation unit 83 for the user to input commands by operations such as pressing buttons.
- The information processing section 75 requests the image server 5 for information on the locations where shooting was performed, receives a list of the shooting locations as a response to the request, and displays them on the map display section 82. At this time, the shooting locations are displayed in different colors depending on the shooting time. Note that map data is stored in equipment (not shown) of the fixed terminal 36 and the in-vehicle terminal 37, and the information processing unit 75 can use this map data for drawing the map.
- Further, an image at a position advanced in the direction opposite to the direction corresponding to the center of the image currently displayed on the image display unit 81 can be requested from the image server 5, received, and displayed.
- When the right button 86 or the left button 87 is pressed, the displayed image is shifted to the left or right, respectively, and the portion of the image that protrudes past the left or right edge is displayed at the opposite end of the image display unit 81. In this way, by pressing the right button 86 or the left button 87, any part of the 360° surroundings can be displayed as the front.
- FIG. 14 is a flowchart showing in more detail a portion related to a case where the up button 84 is pressed in a program for displaying the display screen (hereinafter, referred to as an image display program).
- In step 110, the process waits until the user presses one of the buttons 84 to 87 or performs a similar operation. If an operation by the user is detected, the process proceeds to step 120, where it is determined whether or not the command is one for proceeding in the forward direction, that is, whether or not the up button 84 was pressed. If not, the process returns to step 110; if the up button 84 was pressed, the process proceeds to step 130.
- In step 130, it is determined whether or not an image at a position corresponding to the traveling direction exists in the image server 5. This determination is performed using the following proximity point search algorithm.
- FIG. 15 shows a conceptual diagram of a typical case where the up button 84 is pressed.
- Assume that the image captured at the intersection 78 is currently displayed on the image display unit 81 with the direction indicated by the arrow 79 as the front, and that captured images at the nearby positions A, B, and C are stored in the photographed image database 51 of the image server 5.
- The proximity point search algorithm is a method for determining which of the positions A, B, and C should be displayed when the up button 84 is pressed.
- This proximity point search algorithm will be described with reference to the explanatory diagram of FIG. Letting θ be the clockwise angle, with respect to the north direction, of the direction corresponding to the front of the current image, a search area is defined as the fan-shaped area 89 spanning the angles θ−α to θ+α (α is, for example, 45°) with respect to the north direction and sandwiched between two arcs of radii d_max and d_min centered on the shooting point 88 of the image displayed on the current image display unit 81.
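- The membership test for this fan-shaped search area can be sketched as follows (a minimal sketch, assuming a local planar projection where coordinates are (east, north) offsets in arbitrary units; the default values of α, d_min, and d_max are illustrative, not from the patent):

```python
import math

def in_search_area(cur, cand, theta_deg, alpha_deg=45.0, d_min=5.0, d_max=50.0):
    """Return True if candidate point `cand` lies in the fan-shaped area.

    `cur` and `cand` are (east, north) coordinates; `theta_deg` is the
    clockwise angle from north of the current front direction.
    """
    dx, dy = cand[0] - cur[0], cand[1] - cur[1]       # east, north offsets
    dist = math.hypot(dx, dy)
    if not (d_min <= dist <= d_max):
        return False                                   # outside the two arcs
    bearing = math.degrees(math.atan2(dx, dy)) % 360   # clockwise from north
    diff = (bearing - theta_deg + 180) % 360 - 180     # signed angle difference
    return abs(diff) <= alpha_deg                      # within theta +/- alpha
```

Note that `atan2(dx, dy)` deliberately swaps the usual argument order so that 0° points north and angles grow clockwise, matching the patent's convention.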
- Next, the image server 5 is requested for a list of the images photographed within the search area, that is, a list of candidates for the next image. If the response from the image server 5 indicates that there is no next candidate, the process proceeds to step 140, where the stop mark 90 is displayed on the display screen as shown in FIG. 17, and then the process returns to step 110. By displaying a stop mark when there is no next candidate, the user can visually confirm that no next candidate exists.
- If there is one next candidate, steps 150 and 160 are executed in parallel with that candidate as the next display target. If there are multiple next candidates, that is, if there are multiple corresponding images in the search area described above, one of them is selected as the next display target according to a predetermined selection rule, and steps 150 and 160 are then executed in parallel.
- As the predetermined selection rule, there are, for example, a method of preferentially selecting the image with the latest shooting time, a method of preferentially selecting an image with a low vehicle speed at the time of shooting, and a method of preferentially selecting the image whose shooting time is closest to that of the currently displayed image.
- In step 150, the image server 5 is requested to distribute the one image selected in step 130, and the requested image and its photographing information are received from the image server 5.
- In step 160, candidates for the display target following the shooting location of the image selected in step 130 are searched for. Specifically, using the proximity point search algorithm described above in each of the eight directions of north, northeast, east, southeast, south, southwest, west, and northwest from the shooting point, the image server 5 is queried as to whether a candidate exists in each of these traveling directions, and if there are multiple candidates in a direction, one of them is selected as the next candidate.
- When steps 150 and 160 are completed, the process proceeds to step 170, where the image acquired in step 150 is displayed on the image display section 81 and a map centered on the shooting point of that image is displayed on the map display section 82. Further, as shown in FIG. 18, arrows 91 indicating the directions in which candidates were found in step 160 are displayed in the image display section 81. This allows the user to know in which directions he or she can proceed to view the next image. After step 170, the process returns to step 110.
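- The eight-direction probe of step 160 can be sketched as follows (here `query_server` is a stand-in for the inquiry to the image server 5, and the newest-shooting-time tiebreak is one of the selection rules mentioned above; all names are illustrative):

```python
# Eight compass directions as clockwise bearings from north (degrees).
DIRECTIONS = {'N': 0, 'NE': 45, 'E': 90, 'SE': 135,
              'S': 180, 'SW': 225, 'W': 270, 'NW': 315}

def probe_next_candidates(query_server, point):
    """Query each of the eight directions and keep one candidate per direction.

    `query_server(point, bearing_deg)` returns the list of candidate images
    (dicts with an assumed 'time' key) in that direction's search area.
    """
    result = {}
    for name, bearing in DIRECTIONS.items():
        candidates = query_server(point, bearing)
        if candidates:
            # Tiebreak among multiple candidates: newest shooting time.
            result[name] = max(candidates, key=lambda c: c['time'])
    return result
```

The keys of the returned dict correspond to the guidance arrows 91: one arrow per direction in which a candidate was found.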
- In this way, together with the image delivered from the image server 5, progress assistance such as the cannot-proceed indication shown in FIG. 17 or the traveling-direction guidance arrows shown in FIG. 18 can be displayed. Thereby, captured images can be displayed smoothly.
- This embodiment differs from the second embodiment in that, when the up button 84 is pressed in the image display program, the front direction of the next image to be displayed is controlled based on the vehicle direction at the time that image was shot.
- FIG. 19 shows a case where the effect of the present embodiment is characteristically exhibited.
- Suppose that the image display program is currently displaying an image taken on the curved road 41 at the position of the arrow, with the front direction of the display set to the arrow direction. When the up button 84 is then pressed, the image display program displays the photographed image of point A with the arrow direction as the front.
- In the present embodiment, when the up button 84 is pressed under a predetermined condition, the image display program sets the front direction of the image to be displayed next to the vehicle direction at the time that image was captured, or to the opposite direction.
- FIG. 20 shows, as a flowchart, a part of an image display program for realizing the above operation.
- the flowchart in this figure shows step 170 in FIG. 14 in detail.
- In step 410, it is determined whether or not a next candidate exists.
- Here, a next candidate is a point that can be advanced to by pressing the up button 84; that is, it is determined whether or not step 130 determined that a next candidate exists. Taking FIG. 19 as an example, this corresponds to determining whether or not there is a point, such as point A, to which the user can move from the current position with the forward button. If there is no next candidate, the process proceeds to step 450; if there is one, the process proceeds to step 420.
- In step 420, it is determined whether or not the absolute value of the angle difference between the next candidate direction and the current direction is less than 90°.
- the next candidate direction is the direction of the vehicle 1 when the next candidate image is captured.
- For example, the next candidate direction in FIG. 19 is the direction of the vehicle 1 that captured the image at point A.
- the current direction is a direction corresponding to the image portion currently displayed as the front by the image display program. For example, in FIG. 19, it is the direction of the arrow.
- If the absolute value is less than 90°, the process proceeds to step 430, where the next direction is set to the next candidate direction.
- Here, the next direction is the direction corresponding to the front of the image display when the image display program displays the next candidate image. In this way, by matching the front direction of the next screen to the vehicle direction at the time its image was captured, an increase in the angle between the front direction of the display screen and the direction toward the next candidate can be suppressed. Therefore, by repeatedly pressing the up button 84, images can be displayed along the road.
- If the absolute value is 90° or more, the process proceeds to step 440, where the next direction is set to the next candidate direction increased by 180°, that is, to the opposite direction.
- In step 450, which follows steps 430 and 440, the remaining processing of step 170 in FIG. 14 is performed, that is, display of the captured image with the direction specified in step 430 or 440 facing the front, display of the guidance arrow, and so on. The processing in FIG. 20 then ends.
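The direction selection of steps 420 to 440 can be sketched as follows (a minimal illustration; the function name and the degree-based heading representation are assumptions, not from the patent):

```python
def next_display_direction(candidate_heading_deg: float,
                           current_front_deg: float) -> float:
    """Choose the front direction for the next displayed image.

    Mirrors steps 420-440: if the angle between the next candidate's
    shooting direction and the current display front is less than 90
    degrees, keep the candidate's direction; otherwise flip it by 180
    degrees so the user keeps facing the way they are browsing.
    """
    # Signed angle difference normalized into (-180, 180]
    diff = (candidate_heading_deg - current_front_deg + 180.0) % 360.0 - 180.0
    if abs(diff) < 90.0:            # step 420 affirmative -> step 430
        return candidate_heading_deg % 360.0
    return (candidate_heading_deg + 180.0) % 360.0   # step 440: opposite direction
```

Pressing and holding the up button then repeatedly feeds each new candidate through this selection, which is what keeps the displayed front aligned with the direction of travel along the road.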
- FIG. 21 is a schematic diagram illustrating a characteristic portion of the present embodiment.
- Vehicles 68 and 69, acting as probe cars, are traveling in an area 66 that includes part of a road 67 with one lane in each direction. When these vehicles 68 and 69 transmit images captured in this area 66 to the image server 5, the two images are regarded as images in opposite directions, and images in opposite directions do not exclude each other under the filtering conditions shown in the first embodiment.
- The filtering conditions in the present embodiment are as follows.
- If the database reading / information processing program 54 determines that an image of an area received from the vehicle 1 via the image conversion program 58 belongs to the same area as an existing image and that the direction of the vehicle 1 at the time of photographing is the same, one of the two is selected according to the filtering conditions of the first embodiment, and only the selected one is stored in the captured image database 51. If, however, the existing image belongs to the same area but the direction of the vehicle 1 at the time of photographing is opposite, both of the two images are stored in the captured image database 51.
- Here, two images "having opposite directions" means that the two photographing directions form an angle equal to or greater than a certain threshold angle; otherwise, the two directions are regarded as the same. In this way, images in both directions on a road in a given area can be distributed, so that images can be distributed to the fixed terminal 36 and the in-vehicle terminal 37 without giving the user a sense of discomfort.
- FIG. 22 is a flowchart showing a part of the processing using the above-described filtering conditions, performed when the database reading / information processing program 54 saves the image received from the image conversion program 58 in the captured image database 51. Steps denoted by the same reference numerals in FIG. 10 and FIG. 22 perform the same processing, and a description thereof is omitted here.
- In step 245, it is determined whether the direction of the vehicle in the image in the captured image database 51 determined in step 230 to be an image of the same area, and the direction of the vehicle in the image input this time from the image conversion program 58, differ by 150° or more as the threshold angle. If the determination is affirmative, the process proceeds to step 240, where the input image is stored in the photographed image database 51; if negative, the process proceeds to step 250.
- In step 250, one of the two images is selected according to the filtering conditions. In this way, images in both directions on roads in the same area are stored.
- As a result, the captured image database 51 stores images in both directions for a single area.
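The opposite-direction test of step 245 and the resulting store/filter decision can be sketched as follows (the function names and the heading representation are assumptions; the 150° default reflects the threshold angle of this embodiment):

```python
def directions_opposite(heading_a_deg: float, heading_b_deg: float,
                        threshold_deg: float = 150.0) -> bool:
    """Step 245: two shooting directions are treated as 'opposite' when
    their angular difference is at least the threshold angle; otherwise
    the two images are treated as same-direction."""
    diff = abs(heading_a_deg - heading_b_deg) % 360.0
    if diff > 180.0:
        diff = 360.0 - diff          # shortest angular difference
    return diff >= threshold_deg

def store_decision(existing_heading: float, new_heading: float) -> str:
    # 'both' -> step 240 stores the input image alongside the existing one;
    # 'filter' -> step 250 keeps only one per the first embodiment's rules.
    return "both" if directions_opposite(existing_heading, new_heading) else "filter"
```

With this rule, the two probe-car images of FIG. 21 (roughly 180° apart) both survive, while two same-direction images of one area are still reduced to one.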
- Next, the operation of the database reading / information processing program 54 when distributing the stored images to the fixed terminal 36 and the in-vehicle terminal 37 will be described. A diagram for explaining this case is shown in FIG.
- Images 92, 93, and 94 are each captured at the points shown on road 96 in this figure, by vehicles facing in the directions of the respective arrows. Images 92 and 93 belong to one area 95, surrounded by a dotted line, and face in opposite directions.
- Suppose that the image 92 is currently displayed. When the user requests the next image, the database reading / information processing program 54 receives an inquiry signal from the user side and returns the photographing information of the image 94, as the next candidate, to the user side via the user data distribution program 57. The user then obtains the image 94 by the processing of step 150.
- Similarly, when the image 94 is currently displayed, the database reading / information processing program 54 receives the inquiry signal from the user side and returns the photographing information of the image 92, as the next candidate, to the user side via the user data distribution program 57. As a result, the user obtains the image 92 by the processing of step 150.
- When the user side transmits the above inquiry signal, information on the front direction of the currently displayed image is included in the signal, and the database reading / information processing program 54 returns the information of the image in whichever direction matches, based on this front-direction information.
- FIG. 24 shows a flowchart of the processing of the database reading / information processing program 54 for this purpose.
- In step 310, it is detected whether there is a request for an image of a specific area from the user data distribution program 57.
- In step 320, if a request was detected in step 310, the process proceeds to step 330; otherwise, the process returns to step 310.
- In step 330, the image of the requested area is searched for in the captured image database 51.
- In step 340, it is determined whether a corresponding image exists as a result of the search in step 330. If not, the process proceeds to step 350; otherwise, the process proceeds to step 360.
- In step 350, information indicating that there is no image of the area is output to the user data distribution program 57, and the process returns to step 310.
- In step 360, it is determined whether there is only one candidate image. If so, the process proceeds to step 370; if there is more than one, the process proceeds to step 380.
- In step 370, that one image and its photographing information are output to the user data distribution program 57, and the process returns to step 310.
- In step 380, the image in the direction closest to the direction at the time of the request is delivered. That is, of the candidate images, the one whose photographing direction is closest to the direction corresponding to the front portion of the image being displayed on the image display unit 81 by the image display program is distributed. The process then returns to step 310.
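Step 380's selection can be sketched as picking the candidate with the smallest angular distance to the reported front direction (the function names and the `(image_id, heading)` candidate representation are assumptions for illustration):

```python
def angular_distance(a_deg: float, b_deg: float) -> float:
    """Shortest angle between two headings, in [0, 180]."""
    d = abs(a_deg - b_deg) % 360.0
    return 360.0 - d if d > 180.0 else d

def pick_candidate(candidates, front_deg: float):
    """Step 380: among several candidate images of the requested area,
    deliver the one whose shooting direction is closest to the front
    direction reported in the user's inquiry signal.

    `candidates` is a list of (image_id, shooting_heading_deg) pairs,
    a hypothetical stand-in for the photographing information."""
    return min(candidates, key=lambda c: angular_distance(c[1], front_deg))
```

For the two opposite-direction images 92 and 93 of area 95, this is what makes the server return the one matching the user's current lane of travel.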
- On a road, the lane in which the vehicle 1 travels differs depending on its direction of movement. Accordingly, the user can browse images taken from the lane that matches the front direction of the display, that is, the direction of travel.
- In the present embodiment, the vehicle 1 changes the image capturing method according to the traveling speed. Further, the vehicle 1 transmits to the image server 5, as photographing information, information on the inclination at the time of shooting and information on whether or not the shooting location is within an intersection. Regarding the configurations and operations of the vehicle 1, the image server 5, the fixed terminal 36, and the in-vehicle terminal 37, the parts not described in the present embodiment are the same as in the second embodiment.
- FIG. 25 is a flowchart showing the processing in the information processing unit 13 of the image control computer 10 for this purpose.
- The processing shown in this figure may replace the processing of step 415 in FIG. 4, or may be performed periodically (for example, every 5 seconds) in place of the entire processing of FIG. 4.
- In the latter case, the processing of steps 425 to 435 is performed in parallel with this processing.
- In step 610, vehicle speed information is obtained from the speed sensor 24.
- If the speed sensor 24 cannot be used, information on the vehicle speed included in the current position information from the GPS sensor 22 may be used instead.
- In step 620, it is determined whether the acquired vehicle speed is 0, or small enough to be approximated as 0. If this determination is affirmative, the process proceeds to step 630; otherwise, the process proceeds to step 625. In step 625, shooting is performed using the omnidirectional camera 21, and the processing in FIG. 25 then ends.
- In step 630, it is determined whether the previous shooting position is the same as the current position, that is, whether an image of this place was photographed immediately before. Specifically, it is determined whether the shooting position included in the photographing information of the previous shot and the current position included in the current position information from the GPS sensor 22 can be regarded as the same, where being regarded as the same means being identical or approximately identical. If this determination is negative, shooting is performed in step 625; if affirmative, the processing in FIG. 25 ends without shooting.
- In this way, the photographing interval is shortened when the running speed of the vehicle is high, and lengthened when the speed is low (see FIG. 26). Specifically, the running speed of the vehicle and the time interval of photographing are kept in an inverse relationship. By doing so, shooting can be performed at fixed distance intervals.
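The inverse relationship between speed and shooting interval can be sketched as follows (the target spacing and clamping bounds are illustrative values, not taken from the patent):

```python
def shooting_interval_s(speed_m_per_s: float,
                        target_spacing_m: float = 10.0,
                        min_interval_s: float = 0.5,
                        max_interval_s: float = 10.0) -> float:
    """Keep speed and shooting interval in an inverse relationship so
    that images are captured at approximately fixed distance intervals:
    interval = spacing / speed, clamped to practical bounds."""
    if speed_m_per_s <= 0:
        return max_interval_s        # stationary: shoot at the slowest rate
    interval = target_spacing_m / speed_m_per_s
    return min(max(interval, min_interval_s), max_interval_s)
```

At 10 m/s the sketch yields one shot per second; doubling the speed halves the interval, so the spacing between shots on the road stays roughly constant.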
- In addition, the vehicle 1 includes, in the photographing information, information indicating whether or not the vehicle 1 was in an intersection at the time of shooting, that is, an intersection flag.
- As a method by which the information processing unit 13 of the image control computer 10 determines whether the vehicle 1 is within an intersection at the time of shooting, there is a method of making the determination based on the current position information acquired from the GPS sensor 22 at the time of shooting and the map information acquired from the map DB 18 in the memory 15.
- Alternatively, if the information processing unit 13 can detect the operation of the brake pedal, the turn signal, and the accelerator pedal of the vehicle 1, it can determine that the vehicle is at an intersection when the brake is depressed, the blinker blinks, and the accelerator is depressed, in this order (see FIG. 27).
- Alternatively, the driver may explicitly notify the information processing unit 13 that he or she is about to enter an intersection, using an input device not shown in the drawings.
- the intersection flag added to the photographing information in this manner may be used as a criterion of the filtering condition.
- For example, an image associated with photographing information whose intersection flag is on may be preferentially selected regardless of other criteria.
- Alternatively, as a selection rule used when two or more corresponding images exist in the search range, an image associated with photographing information having the intersection flag may be preferentially selected. By doing so, it becomes easy to follow a route near an intersection when browsing images, which improves operability.
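The pedal/blinker heuristic of FIG. 27 can be sketched as a simple ordered-event check (the event names are assumptions; the patent only specifies the order: brake, blinker, accelerator):

```python
def entered_intersection(events) -> bool:
    """Infer that the vehicle passed through an intersection when
    'brake', 'blinker', and 'accelerator' occur in that order in the
    event stream; other events in between are ignored."""
    wanted = iter(["brake", "blinker", "accelerator"])
    target = next(wanted)
    for ev in events:
        if ev == target:
            target = next(wanted, None)
            if target is None:       # all three matched in order
                return True
    return False
```

When this returns True for the events around a shot, the intersection flag would be set in that shot's photographing information.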
- Alternatively, the database reading / information processing program 54 of the image server 5 may determine whether the shooting position is at an intersection by using the photographing position of the input photographing information and the map information in the map information database 52, and the result of this determination may be used as a criterion for the filtering conditions.
- Further, the vehicle 1 includes, in the photographing information, information on the inclination angle of the vehicle 1 at the time of shooting.
- The information processing unit 13 obtains the inclination angle of the vehicle 1 from a 3D gyroscope (not shown in FIG. 2; corresponding to an inclination amount specifying unit) that detects the inclination angle of the vehicle 1.
- The tilt angle information is the angle θ formed by the traveling direction of the vehicle 1 (the direction of the arrow in the figure) with respect to the horizontal plane.
- FIG. 29 is a schematic diagram illustrating this correction processing. Even on a sloping road, the buildings 97 along the road are usually built upright with respect to the horizontal plane. Therefore, in the portion of an image taken by the vehicle 1 that shows the side of the road, the building 97 appears inclined by the angle θ, as shown in FIG. 29(a).
- Therefore, when such an image is displayed, a correction as shown in FIG. 29(b) is performed based on the tilt information included in the photographing information of the image. That is, when a portion showing the side of the road is displayed, if the inclination angle is θ, that portion is rotated by θ/2 in the direction opposite to the inclination. Since rotating the image in this way leaves a missing region 98, the display range is slightly narrowed so that no missing region appears; specifically, the boundary of the display range is changed from the rectangle 100 to the rectangle 99. By doing so, natural image browsing can be performed.
- The correction of the image based on the inclination information as shown in FIG. 29 may instead be performed by the database reading / information processing program 54 of the image server 5.
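The narrowing of the display range from rectangle 100 to rectangle 99 can be sketched as computing the largest axis-aligned rectangle inscribed in the rotated display rectangle (standard geometry offered as an illustration; the patent does not specify this particular formula):

```python
import math

def display_rect_after_tilt_correction(w: float, h: float, tilt_deg: float):
    """After rotating the side-view portion by the tilt-compensation
    angle, shrink the display range so that no blank corners (the
    missing region 98 of FIG. 29) appear: return the width and height
    of the largest axis-aligned rectangle that fits inside the rotated
    w x h display rectangle."""
    a = math.radians(abs(tilt_deg)) % math.pi
    if a > math.pi / 2:
        a = math.pi - a
    sin_a, cos_a = math.sin(a), math.cos(a)
    long_side, short_side = max(w, h), min(w, h)
    if short_side <= 2.0 * sin_a * cos_a * long_side or abs(sin_a - cos_a) < 1e-10:
        # Constrained case: two corners of the inscribed rectangle
        # touch the longer side of the rotated rectangle.
        x = 0.5 * short_side
        wr, hr = (x / sin_a, x / cos_a) if w >= h else (x / cos_a, x / sin_a)
    else:
        cos_2a = cos_a * cos_a - sin_a * sin_a
        wr = (w * cos_a - h * sin_a) / cos_2a
        hr = (h * cos_a - w * sin_a) / cos_2a
    return wr, hr
```

With a zero tilt the display range is unchanged; any nonzero tilt yields a slightly smaller rectangle, corresponding to the change from rectangle 100 to rectangle 99.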
- In the present embodiment, the vehicle 1 is configured to perform detailed photographing at predetermined important photographing points.
- FIG. 30 shows an overall view of an example of such a system for detailed imaging.
- The designation of the important point from the image server 5 may be performed based on request signals from the fixed terminal 36 and the in-vehicle terminal 37.
- An example of more detailed shooting than usual is to shorten the time interval of shooting; for example, what was normally shot at 5-second intervals is instead shot at 0.5-second intervals.
- Alternatively, the speed of the vehicle 1 may be automatically reduced while the time interval of photographing is kept constant. This also substantially shortens the distance interval of photographing.
- Alternatively, a signal may be output from the information processing unit 13 to a display device (a display, a speaker, or the like, not shown in FIG. 2) mounted on the vehicle 1 to prompt an occupant of the vehicle 1 to perform detailed photographing.
- The designated important point may be a point on a road or a point off the road, such as a store. If the important point is off the road, the determination as to whether the vehicle has approached it is made by determining whether the vehicle has approached the point on the road closest to the designated important point (see FIG. 31).
- In this way, the user can browse detailed images of important points.
- the configurations and operations of the vehicle 1, the image server 5, the fixed terminal 36, and the in-vehicle terminal 37 are the same as those of the second embodiment except for the portions described in the present embodiment.
- In the present embodiment, when GPS information from the GPS sensor 22 (including current position information, speed information, traveling direction, and so on) cannot be obtained because radio waves from the GPS satellites do not reach the vehicle 1 due to obstacles such as buildings or elevated walkways, the shooting position at the times when current position information could not be obtained is estimated using map information and the position information obtained at the times before and after the gap.
- For this purpose, the information processing unit 13 of the image control computer 10 performs the processing shown in FIG. 32 in step 425.
- The photographed images are then collectively transmitted to the communication control unit 14.
- In step 710, it is determined whether GPS information could be acquired from the GPS sensor 22 as the photographing information of the image photographed in step 415 in FIG. 4. Since the GPS sensor 22 also outputs information as to whether GPS information can be acquired, this determination is made using that output.
- Alternatively, the determination may be made by using the vehicle speed information output from the speed sensor 24, which indicates whether the vehicle is running.
- If GPS information cannot be obtained, the process proceeds to step 720, where, instead of GPS information, information indicating that measurement was not possible, that is, a measurement-impossible flag, is included in the photographing information.
- In step 760, the photographing information, including information such as the measurement-impossible flag, the file name of the photographed image, and the photographing time, is stored in the information management DB 16, and the processing in FIG. 32 then ends.
- If the current position information can be obtained in step 710, the process proceeds to step 730, where it is determined whether photographing information having the measurement-impossible flag exists in the table storing the photographing information in the information management DB 16. If it does, the process proceeds to step 740; otherwise, the process proceeds to step 760.
- In step 740, the GPS information is complemented using map matching. More specifically, based on the current position (position 1) and time (time 1) in the GPS information obtained immediately before in step 710, the position information (position 2) and time information (time 2) included in the latest GPS information recorded in the information management DB 16 before the photographing information having the measurement-impossible flag, the photographing time information (time 3) of that photographing information, and the road information acquired from the map DB 18, the photographing point and the traveling direction for the photographing information having the measurement-impossible flag are specified using map matching.
- In step 750, the photographing point and the traveling direction specified in step 740 are set as the position information and traveling direction information of the photographing information having the measurement-impossible flag, and the measurement-impossible flag is deleted. Thereafter, the process proceeds to step 760.
- In this way, GPS information is complemented by map matching on the vehicle side. Alternatively, the image server 5 may complement the GPS information using map matching. This is particularly effective when it is difficult for the vehicle to determine that GPS information could not be acquired, for example when the GPS sensor 22 does not output information as to whether GPS information could be acquired.
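The estimation of step 740 can be sketched, in simplified form, as a time-based interpolation between the fixes before and after the blackout (positions here are abstract (x, y) pairs; the patent's actual method additionally snaps the result onto the road shape from the map DB via map matching, which is omitted):

```python
def interpolate_shot_position(pos1, t1, pos2, t2, t3):
    """Estimate the shooting position at time t3, which falls inside the
    GPS blackout, from the last fix before the gap (pos2 at time t2) and
    the first fix after it (pos1 at time t1), by straight-line
    interpolation along the elapsed-time fraction."""
    if t1 == t2:
        return pos2
    f = (t3 - t2) / (t1 - t2)        # fraction of the gap elapsed at t3
    return (pos2[0] + f * (pos1[0] - pos2[0]),
            pos2[1] + f * (pos1[1] - pos2[1]))
```

A map-matching implementation would then project this estimate onto the nearest road segment and derive the traveling direction from the segment's orientation.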
- Next, complementation of GPS information by map matching performed by the image server 5 will be described.
- Some GPS sensors continuously output the last obtained GPS information when the radio waves are lost. With such a sensor, if the vehicle 1 stops for a certain time in front of a tunnel that GPS satellite radio waves do not reach and then immediately passes through the tunnel, and this GPS information is included in the photographing information as it is and transmitted to the image server 5, the stale position information from before the tunnel remains in the photographing information.
- In view of this, when the photographing information from the vehicle 1 indicates that the vehicle 1 stopped for a fixed time and then instantaneously moved a predetermined distance or more, the database reading / information processing program 54 of the image server 5 corrects the GPS information included in the photographing information from the apparent pause by using the above-described method of complementing GPS information by map matching.
- the configurations and operations of the vehicle 1, the image server 5, the fixed terminal 36, and the in-vehicle terminal 37 are the same as those in the second embodiment except for the portions described in the present embodiment.
- Note that the vehicle 1 does not necessarily have to transmit the captured images and the photographing information to the image server 5 using wireless communication.
- For example, the images and photographing information captured by the vehicle 1 may be stored in a removable HDD, and the HDD may later be connected to the image server 5, so that the captured images and photographing information are input to the image server 5.
- Further, each means realized as software in the above embodiments may instead be configured as dedicated hardware.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/294,882 US20060132602A1 (en) | 2003-06-12 | 2005-12-06 | Image server, image acquisition device, and image display terminal |
US13/370,430 US9369675B2 (en) | 2003-06-12 | 2012-02-10 | Image server, image deliver based on image information and condition, and image display terminal |
US13/423,514 US9369676B2 (en) | 2003-06-12 | 2012-03-19 | Image server, image deliver based on image information and condition, and image display terminal |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003167806A JP4321128B2 (ja) | 2003-06-12 | 2003-06-12 | 画像サーバ、画像収集装置、および画像表示端末 |
JP2003-167806 | 2003-06-12 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/294,882 Continuation US20060132602A1 (en) | 2003-06-12 | 2005-12-06 | Image server, image acquisition device, and image display terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004111973A1 true WO2004111973A1 (ja) | 2004-12-23 |
Family
ID=33549314
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/008112 WO2004111973A1 (ja) | 2003-06-12 | 2004-06-10 | 画像サーバ、画像収集装置、及び画像表示端末 |
Country Status (4)
Country | Link |
---|---|
US (3) | US20060132602A1 (ja) |
JP (1) | JP4321128B2 (ja) |
CN (1) | CN100530265C (ja) |
WO (1) | WO2004111973A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110181719A1 (en) * | 2005-05-11 | 2011-07-28 | Canon Kabushiki Kaisha | Network camera system and control method therefore |
WO2014150927A1 (en) * | 2013-03-15 | 2014-09-25 | Pictometry International Corp. | Virtual property reporting for automatic structure detection |
Families Citing this family (90)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11124207B2 (en) | 2014-03-18 | 2021-09-21 | Transportation Ip Holdings, Llc | Optical route examination system and method |
US20150235094A1 (en) | 2014-02-17 | 2015-08-20 | General Electric Company | Vehicle imaging system and method |
US10798282B2 (en) | 2002-06-04 | 2020-10-06 | Ge Global Sourcing Llc | Mining detection system and method |
US10110795B2 (en) | 2002-06-04 | 2018-10-23 | General Electric Company | Video system and method for data communication |
US11358615B2 (en) | 2002-06-04 | 2022-06-14 | Ge Global Sourcing Llc | System and method for determining vehicle orientation in a vehicle consist |
US11208129B2 (en) | 2002-06-04 | 2021-12-28 | Transportation Ip Holdings, Llc | Vehicle control system and method |
US9875414B2 (en) | 2014-04-15 | 2018-01-23 | General Electric Company | Route damage prediction system and method |
US7353034B2 (en) | 2005-04-04 | 2008-04-01 | X One, Inc. | Location sharing and tracking using mobile phones or other wireless devices |
JP4841864B2 (ja) * | 2005-05-27 | 2011-12-21 | 寺田軌道株式会社 | 作業計画立案システム、画像表示装置およびプログラム |
JP4770367B2 (ja) * | 2005-09-28 | 2011-09-14 | 日産自動車株式会社 | 車両位置検出システム |
JP4708203B2 (ja) * | 2006-02-08 | 2011-06-22 | パイオニア株式会社 | 地理情報表示装置及び地理情報表示プログラム |
JP2007306353A (ja) * | 2006-05-12 | 2007-11-22 | Opt Kk | 動画の表示方法、動画表示システムおよび広角動画撮像装置 |
JP5140889B2 (ja) * | 2006-08-10 | 2013-02-13 | サンリツオートメイション株式会社 | ゆらぎ補正による画像表示方法及びその方法を用いた動体リモートコントロールシステム |
ATE536297T1 (de) * | 2006-10-13 | 2011-12-15 | Continental Teves Ag & Co Ohg | Fahrzeug und verfahren zur bestimmung von in der fahrzeugumgebung befindlichen fahrzeugen |
WO2008072429A1 (ja) * | 2006-12-12 | 2008-06-19 | Locationview Co. | 地理情報関連付き画像データ表示システム |
CN103791914B (zh) * | 2007-03-23 | 2015-09-02 | 三菱电机株式会社 | 导航系统及车道信息显示方法 |
CN101743569A (zh) * | 2007-05-25 | 2010-06-16 | 谷歌公司 | 渲染、查看和注释全景图像及其应用 |
US7990394B2 (en) | 2007-05-25 | 2011-08-02 | Google Inc. | Viewing and navigating within panoramic images, and applications thereof |
KR101362764B1 (ko) * | 2007-07-02 | 2014-02-14 | 삼성전자주식회사 | 사진 파일 제공 장치 및 방법 |
KR101423928B1 (ko) | 2007-08-20 | 2014-07-28 | 삼성전자주식회사 | 전자지도에 포함된 이미지 파일을 이용한 이미지 재생장치, 이의 재생 방법 및 상기 방법을 실행하기 위한프로그램을 기록한 기록매체. |
EP2051222A1 (en) * | 2007-10-17 | 2009-04-22 | Harman/Becker Automotive Systems GmbH | Method and system for providing a visual information of a remote location to a user of a vehicle |
JP4470992B2 (ja) | 2007-12-05 | 2010-06-02 | セイコーエプソン株式会社 | 映像管理システム |
JP2009235696A (ja) * | 2008-03-26 | 2009-10-15 | Compensation Seminary Co Ltd | 建物の経年変化を調査するためのシステム、方法、及びコンピュータ・プログラム |
EP2146325B1 (de) * | 2008-07-16 | 2013-03-06 | SMR Patents S.à.r.l. | Aufzeichnungsgerät für die Aufnahme, Bearbeitung und Speicherung von Bilddaten in einem Fahrzeug sowie Verfahren |
KR101502013B1 (ko) * | 2008-12-24 | 2015-03-12 | 엘지전자 주식회사 | 이동 단말기 및 이동 단말기의 위치기반서비스 제공방법 |
US8554871B2 (en) * | 2009-01-30 | 2013-10-08 | Navteq B.V. | Method and system for exchanging location content data in different data formats |
US20110320650A1 (en) * | 2009-02-20 | 2011-12-29 | Nec Corporation | Analysis preprocessing system, analysis preprocessing method and analysis preprocessing program |
JPWO2010095459A1 (ja) * | 2009-02-20 | 2012-08-23 | 日本電気株式会社 | 解析前処理システム、解析前処理方法および解析前処理プログラム |
US8416300B2 (en) | 2009-05-20 | 2013-04-09 | International Business Machines Corporation | Traffic system for enhancing driver visibility |
US8818641B2 (en) | 2009-12-18 | 2014-08-26 | Honda Motor Co., Ltd. | Method of intersection estimation for a vehicle safety system |
US8823556B2 (en) | 2010-09-02 | 2014-09-02 | Honda Motor Co., Ltd. | Method of estimating intersection control |
US8618951B2 (en) | 2010-09-17 | 2013-12-31 | Honda Motor Co., Ltd. | Traffic control database and distribution system |
US8618952B2 (en) | 2011-01-21 | 2013-12-31 | Honda Motor Co., Ltd. | Method of intersection identification for collision warning system |
JP2012159967A (ja) * | 2011-01-31 | 2012-08-23 | Nec Corp | 通信装置、通信システムおよび通信方法 |
DE102011003553A1 (de) * | 2011-02-03 | 2012-08-09 | Robert Bosch Gmbh | Vorrichtung und Verfahren zur optischen Aufnahme des Unterbodens eines Fahrzeugs |
CA2772210C (en) * | 2011-03-24 | 2015-11-24 | Kabushiki Kaisha Topcon | Omnidirectional camera and lens hood |
JP5870618B2 (ja) * | 2011-10-21 | 2016-03-01 | 大日本印刷株式会社 | 自由視点映像表示装置 |
JP5853693B2 (ja) * | 2011-12-28 | 2016-02-09 | 株式会社Jvcケンウッド | 端末装置、表示制御プログラム、及び表示制御方法 |
JP6007495B2 (ja) * | 2011-12-28 | 2016-10-12 | 日本電気株式会社 | 画像データベースシステム |
GB201202344D0 (en) * | 2012-02-10 | 2012-03-28 | Isis Innovation | Method of locating a sensor and related apparatus |
EP2688060A4 (en) * | 2012-03-22 | 2015-08-05 | Sony Corp | DISPLAY DEVICE, IMAGE PROCESSING DEVICE AND IMAGE PROCESSING PROCESS AND COMPUTER PROGRAM THEREFOR |
JP5910241B2 (ja) * | 2012-03-29 | 2016-04-27 | 富士通株式会社 | 映像抽出装置、方法及びプログラム |
JP6058907B2 (ja) * | 2012-03-29 | 2017-01-11 | 矢崎エナジーシステム株式会社 | 車載記録装置 |
US10666860B2 (en) * | 2012-09-11 | 2020-05-26 | Ricoh Company, Ltd. | Image processor, image processing method and program, and imaging system |
EP2730890B1 (en) | 2012-11-07 | 2020-01-15 | Volvo Car Corporation | Vehicle image capture system |
JP5745497B2 (ja) | 2012-12-04 | 2015-07-08 | 任天堂株式会社 | 表示システム、表示制御装置、情報処理プログラム及び表示方法 |
US9930592B2 (en) * | 2013-02-19 | 2018-03-27 | Mimosa Networks, Inc. | Systems and methods for directing mobile device connectivity |
US9179336B2 (en) | 2013-02-19 | 2015-11-03 | Mimosa Networks, Inc. | WiFi management interface for microwave radio and reset to factory defaults |
WO2014132680A1 (ja) * | 2013-02-28 | 2014-09-04 | アイシン精機株式会社 | 車両の制御装置、及びプログラム |
US9130305B2 (en) | 2013-03-06 | 2015-09-08 | Mimosa Networks, Inc. | Waterproof apparatus for cables and cable interfaces |
US9362629B2 (en) | 2013-03-06 | 2016-06-07 | Mimosa Networks, Inc. | Enclosure for radio, parabolic dish antenna, and side lobe shields |
US10742275B2 (en) | 2013-03-07 | 2020-08-11 | Mimosa Networks, Inc. | Quad-sector antenna using circular polarization |
US9191081B2 (en) | 2013-03-08 | 2015-11-17 | Mimosa Networks, Inc. | System and method for dual-band backhaul radio |
WO2014170386A1 (de) * | 2013-04-16 | 2014-10-23 | Trajet Gmbh | Verfahren zur kombinierten bestimmung einer geschwindigkeit und einer bildaufnahme aus einem fahrzeug und dafür geeignete vorrichtung |
JP6208977B2 (ja) * | 2013-05-16 | 2017-10-04 | 株式会社Nttドコモ | 情報処理装置、通信端末およびデータ取得方法 |
US9295103B2 (en) | 2013-05-30 | 2016-03-22 | Mimosa Networks, Inc. | Wireless access points providing hybrid 802.11 and scheduled priority access communications |
US10938110B2 (en) | 2013-06-28 | 2021-03-02 | Mimosa Networks, Inc. | Ellipticity reduction in circularly polarized array antennas |
CN103685944A (zh) * | 2013-11-26 | 2014-03-26 | 移康智能科技(上海)有限公司 | 一种摄像设备的定位摄像方法 |
CN103795918A (zh) * | 2013-11-29 | 2014-05-14 | 深圳市中兴移动通信有限公司 | 一种拍摄方法和拍摄装置 |
US9001689B1 (en) | 2014-01-24 | 2015-04-07 | Mimosa Networks, Inc. | Channel optimization in half duplex communications systems |
US9780892B2 (en) | 2014-03-05 | 2017-10-03 | Mimosa Networks, Inc. | System and method for aligning a radio using an automated audio guide |
US9998246B2 (en) | 2014-03-13 | 2018-06-12 | Mimosa Networks, Inc. | Simultaneous transmission on shared channel |
US10958332B2 (en) | 2014-09-08 | 2021-03-23 | Mimosa Networks, Inc. | Wi-Fi hotspot repeater |
USD752566S1 (en) | 2014-09-12 | 2016-03-29 | Mimosa Networks, Inc. | Wireless repeater |
JP6047197B2 (ja) * | 2015-05-01 | 2016-12-21 | 任天堂株式会社 | 表示システム、表示制御装置、情報処理プログラム及び表示方法 |
JP6499514B2 (ja) * | 2015-05-29 | 2019-04-10 | 株式会社デンソーテン | ドライブレコーダ、データ記録システム、データ記録方法、及び、プログラム |
US10217283B2 (en) | 2015-12-17 | 2019-02-26 | Google Llc | Navigation through multidimensional images spaces |
WO2017109976A1 (ja) * | 2015-12-25 | 2017-06-29 | パイオニア株式会社 | 距離推定装置、距離推定方法及びプログラム |
WO2017109979A1 (ja) * | 2015-12-25 | 2017-06-29 | パイオニア株式会社 | 距離推定装置、距離推定方法及びプログラム |
US10749263B2 (en) | 2016-01-11 | 2020-08-18 | Mimosa Networks, Inc. | Printed circuit board mounted antenna and waveguide interface |
WO2017183292A1 (ja) * | 2016-04-20 | 2017-10-26 | 株式会社ソニー・インタラクティブエンタテインメント | 処理装置および画像決定方法 |
JP6580516B2 (ja) * | 2016-05-02 | 2019-09-25 | 株式会社ソニー・インタラクティブエンタテインメント | 処理装置および画像決定方法 |
JP6543823B2 (ja) * | 2016-05-13 | 2019-07-17 | 本田技研工業株式会社 | 鞍乗り型車両の光学センサ配置構造 |
CN107515006A (zh) * | 2016-06-15 | 2017-12-26 | 华为终端(东莞)有限公司 | 一种地图更新方法和车载终端 |
WO2018022526A1 (en) | 2016-07-29 | 2018-02-01 | Mimosa Networks, Inc. | Multi-band access point antenna array |
JP2018037886A (ja) * | 2016-08-31 | 2018-03-08 | 株式会社東芝 | 画像配信装置、画像配信システム、および画像配信方法 |
WO2018163750A1 (ja) * | 2017-03-06 | 2018-09-13 | パイオニア株式会社 | 距離推定装置、距離推定方法及びプログラム |
US10339392B2 (en) * | 2017-06-15 | 2019-07-02 | Blackberry Limited | Method and system for rear status detection |
US10511074B2 (en) | 2018-01-05 | 2019-12-17 | Mimosa Networks, Inc. | Higher signal isolation solutions for printed circuit board mounted antenna and waveguide interface |
US11069986B2 (en) | 2018-03-02 | 2021-07-20 | Airspan Ip Holdco Llc | Omni-directional orthogonally-polarized antenna system for MIMO applications |
JP7026003B2 (ja) * | 2018-06-11 | 2022-02-25 | 本田技研工業株式会社 | 表示制御装置及びプログラム |
WO2020019115A1 (zh) * | 2018-07-23 | 2020-01-30 | 深圳前海达闼云端智能科技有限公司 | 融合建图方法、相关装置及计算机可读存储介质 |
US11289821B2 (en) | 2018-09-11 | 2022-03-29 | Air Span Ip Holdco Llc | Sector antenna systems and methods for providing high gain and high side-lobe rejection |
KR102522923B1 (ko) * | 2018-12-24 | 2023-04-20 | 한국전자통신연구원 | 차량의 자기위치 추정 장치 및 그 방법 |
US20200232803A1 (en) | 2019-01-23 | 2020-07-23 | Uber Technologies, Inc. | Generating composite images for display on a mobile device based on ground truth image rendering |
JP2020166584A (ja) * | 2019-03-29 | 2020-10-08 | Toyota Motor Corporation | Image information collection system and vehicle |
GB201909556D0 (en) * | 2019-07-03 | 2019-08-14 | Tomtom Traffic Bv | Collecting user-contributed data relating to a navigable network |
JP6879351B2 (ja) * | 2019-10-31 | 2021-06-02 | Ricoh Company, Ltd. | Imaging device, display method, imaging method, and program |
JP7230783B2 (ja) * | 2019-11-15 | 2023-03-01 | Toyota Motor Corporation | Clothing information acquisition system and clothing information acquisition method |
JP7055853B2 (ja) * | 2020-10-28 | 2022-04-18 | Toshiba Corporation | Image distribution device, image distribution system, and image distribution method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0952555A (ja) * | 1995-08-11 | 1997-02-25 | Mitsubishi Electric Corp. | Surroundings monitoring device |
JP2001014332A (ja) * | 1999-06-30 | 2001-01-19 | Denso Corp. | Information service system |
JP2001184595A (ja) * | 1999-12-27 | 2001-07-06 | Matsushita Electric Industrial Co., Ltd. | Transmission system, transmitting device, receiving device, display device, and vehicle |
JP2001236509A (ja) * | 2000-02-24 | 2001-08-31 | Minolta Co., Ltd. | Device and method for detecting the tilt of a subject |
JP2002099537A (ja) * | 2000-09-21 | 2002-04-05 | NTT Docomo Inc. | Activity information publication system and activity information publication method |
JP2002318127A (ja) * | 2001-04-23 | 2002-10-31 | NTT Advanced Technology Corp. | Portable device position detection system and portable device position detection method |
JP2003030201A (ja) * | 2001-07-19 | 2003-01-31 | Matsushita Electric Industrial Co., Ltd. | Image management device and video distribution method |
JP2003085690A (ja) * | 2001-09-13 | 2003-03-20 | Alpine Electronics Inc. | Live view device and live view system |
JP2003123190A (ja) * | 2001-10-10 | 2003-04-25 | Sumitomo Electric Industries, Ltd. | Image transmission system, image transmission device, and image acquisition device |
JP2003162793A (ja) * | 2001-11-26 | 2003-06-06 | Minolta Co., Ltd. | Road image database system and in-vehicle device therefor |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5043816A (en) * | 1988-12-26 | 1991-08-27 | Casio Computer Co., Ltd. | Electronic still camera including photographing timing control |
JPH11144185A (ja) * | 1997-09-03 | 1999-05-28 | Honda Motor Co., Ltd. | Automatic driving control guidance system |
DE19746570A1 (de) * | 1997-10-22 | 1999-05-06 | Daimler Chrysler Ag | Method and device for wide-area traffic situation monitoring |
US7027087B2 (en) * | 1998-08-21 | 2006-04-11 | Nikon Corporation | Electronic camera |
JP3019299B1 (ja) | 1998-12-09 | 2000-03-13 | Rios Corporation | Recording medium storing photographed image data, and image reading device |
JP2000278672A (ja) * | 1999-01-21 | 2000-10-06 | Matsushita Electric Industrial Co., Ltd. | Network-type monitoring device |
US7313289B2 (en) * | 2000-08-30 | 2007-12-25 | Ricoh Company, Ltd. | Image processing method and apparatus and computer-readable storage medium using improved distortion correction |
US6351710B1 (en) * | 2000-09-28 | 2002-02-26 | Michael F. Mays | Method and system for visual addressing |
JP3809901B2 (ja) | 2000-12-06 | 2006-08-16 | Tsukuba Multimedia Co., Ltd. | Map-guided video system |
JP3804766B2 (ja) | 2001-03-15 | 2006-08-02 | Sharp Corporation | Image communication device and portable telephone |
US20030081934A1 (en) * | 2001-10-30 | 2003-05-01 | Kirmuss Charles Bruno | Mobile video recorder control and interface |
US20030212567A1 (en) * | 2002-05-07 | 2003-11-13 | Hitachi Ltd. | Witness information service with image capturing and sharing |
JP3744002B2 (ja) * | 2002-10-04 | 2006-02-08 | Sony Corporation | Display device, imaging device, and imaging/display system |
US20050137794A1 (en) * | 2003-12-18 | 2005-06-23 | Dehua Cui | Intersection route navigation system for a motor vehicle |
2003
- 2003-06-12 JP JP2003167806A patent/JP4321128B2/ja not_active Expired - Fee Related

2004
- 2004-06-10 CN CNB2004800160861A patent/CN100530265C/zh not_active Expired - Fee Related
- 2004-06-10 WO PCT/JP2004/008112 patent/WO2004111973A1/ja active Application Filing

2005
- 2005-12-06 US US11/294,882 patent/US20060132602A1/en not_active Abandoned

2012
- 2012-02-10 US US13/370,430 patent/US9369675B2/en not_active Expired - Fee Related
- 2012-03-19 US US13/423,514 patent/US9369676B2/en not_active Expired - Fee Related
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110181719A1 (en) * | 2005-05-11 | 2011-07-28 | Canon Kabushiki Kaisha | Network camera system and control method therefore |
US8908078B2 (en) * | 2005-05-11 | 2014-12-09 | Canon Kabushiki Kaisha | Network camera system and control method therefor in which, when a photo-taking condition changes, a user can readily recognize an area where the condition change is occurring |
WO2014150927A1 (en) * | 2013-03-15 | 2014-09-25 | Pictometry International Corp. | Virtual property reporting for automatic structure detection |
US9753950B2 (en) | 2013-03-15 | 2017-09-05 | Pictometry International Corp. | Virtual property reporting for automatic structure detection |
Also Published As
Publication number | Publication date |
---|---|
CN100530265C (zh) | 2009-08-19 |
US20120140077A1 (en) | 2012-06-07 |
US20060132602A1 (en) | 2006-06-22 |
JP2005006081A (ja) | 2005-01-06 |
JP4321128B2 (ja) | 2009-08-26 |
US9369675B2 (en) | 2016-06-14 |
CN1802678A (zh) | 2006-07-12 |
US20120176500A1 (en) | 2012-07-12 |
US9369676B2 (en) | 2016-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004111973A1 (ja) | Image server, image collection device, and image display terminal |
KR101893532B1 (ko) | Downloading requested vehicle-acquired images of traffic conditions |
JP3951786B2 (ja) | Image upload device |
JP4926400B2 (ja) | Mobile camera system |
JP4434219B2 (ja) | Image server |
US20030214582A1 (en) | Video delivery apparatus and video information delivery system | |
JPH1194571A (ja) | Recording/reproducing device, recording/reproducing method, and recording medium |
JP7275556B2 (ja) | Information processing system, program, and information processing method |
WO2015019917A1 (ja) | Method for searching local tourist information based on the position of a user |
CN112836079A (zh) | Image data distribution system and image data display terminal |
JP2005202397A (ja) | Terminal device |
JPH11160080A (ja) | Mobile object information system |
JP2002022463A (ja) | Image data collection device for navigation and image data collection system for navigation |
WO2010032282A1 (ja) | Server device, mobile terminal device, intersection guidance system, and intersection guidance method |
JPH11211499A (ja) | Information providing system |
JP2003037838A (ja) | Image distribution system |
JPH11144192A (ja) | Traffic information display device and image display device |
JP2006031583A (ja) | In-vehicle system and remote-site observation system |
JP2003536185A (ja) | Traffic information transmission method and traffic information system |
JP2001005994A (ja) | Image processing device and image processing method |
JP7348724B2 (ja) | In-vehicle device and display method |
JP7046555B2 (ja) | In-vehicle device, server, display method, and transmission method |
JP4730414B2 (ja) | Portable terminal device |
KR101397664B1 (ko) | System and method for providing vehicle driving state information |
JPH11313215A (ja) | Image data transmission device and image data distribution system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states | Kind code of ref document: A1; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
WWE | Wipo information: entry into national phase | Ref document number: 11294882; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 20048160861; Country of ref document: CN |
WWP | Wipo information: published in national office | Ref document number: 11294882; Country of ref document: US |
122 | Ep: pct application non-entry in european phase |